Introductory Cyber Security – A collection of Links to Help Get Started

Each of the links below contains some valuable information as it pertains to security professionals. Someday maybe I’ll organize them a bit better, but for now I just wanted to get them posted so they wouldn’t be lost.
Hack, Learn and Earn by joining the HackerOne Community – HackerOne

Hacker Blogs We Love Reading | HackerOne

Useful Online Resources for New Hackers | HackerOne

WPScan by the WPScan Team

GitHub – ethicalhack3r/DVWA: Damn Vulnerable Web Application (DVWA)

Kali Linux Tutorials – HackersOnlineClub

Cybrary – Online Cyber Security Training, Free, Forever

Bughunter University

What Great Hackers Have in Common | HackerOne

Shodan

37 Powerful Penetration Testing Tools For Every Penetration Tester — Software Testing Help

The Top 5 Pen Testing Tools You Will Ever Need

Plans and Pricing

Metasploit Unleashed – Free Online Ethical Hacking Course

Introduction – Metasploit Unleashed

Avoiding Wrong DNS Server With VPN – Choose Which Settings are Used

Over the last few days I've been dealing with some VPN / remote desktop issues that made me think remote desktop was not working with my VPN. It turned out instead to be DNS problems within the VPN, stemming from a configuration issue with the network adapters on my machine. I decided to write it up because it was a real pain to figure out. The good news is that it's an easy fix.

I was trying to do something very simple that I had done many times before: connect to a VPN and then remote desktop from there to a machine on the network providing the VPN. I was attempting to use the fully qualified hostname as the remote computer name, and for some reason it just wouldn't connect. Why it suddenly stopped working was really driving me bonkers.

Short version: it wasn't working because the DNS server being used wasn't the one that had been passed along by the VPN. Instead, my computer's normal DNS server was being used, even though the VPN software clearly showed it wanted me to be using its own.

A quick cmd->nslookup after I was connected to the VPN easily showed that I wasn’t using the DNS server the VPN had prescribed.

>nslookup
Default Server: google-dns-made-up-name-here.com
Address: 8.8.8.8

Had everything been correct, it would have looked more like this:

>nslookup
Default Server: dnsserver1.example.com
Address: 192.168.2.22

In addition, a ping of the fully qualified server name I was trying to reach got responses from the network's gateway IP – not the computer itself. See below:

>ping dev21.example.com

Pinging dev21.example.com [64.199.2.6] with 32 bytes of data:
Reply from 64.199.2.6: bytes=32 time=65ms TTL=53
Reply from 64.199.2.6: bytes=32 time=63ms TTL=53
Reply from 64.199.2.6: bytes=32 time=66ms TTL=53
Reply from 64.199.2.6: bytes=32 time=59ms TTL=53

That IP is the gateway to the network where the server is located, but it’s not the internal IP address of the server, dev21.example.com, so we’ve got work to do.

Why wasn't my computer using the appropriate DNS server per the VPN (192.168.2.22)? The answer is painfully simple, exceedingly annoying and, to be honest, still slightly unexplained. I know what the problem is and how to fix it, but I'm not quite sure how it came to be. I run more than one VPN tool, and I wonder whether a secondary VPN bumped the settings of a different interface during its install or something… I can really only hypothesize about the how, but I can fix it. So let's get to it.

The problem was that the priority of the network adapters (yes, there is such a thing) was set such that the original computer adapter was given priority and therefore data flowed through it when possible.

The fix? Super simple! Reprioritize the adapters so that the VPN adapter gets the highest priority and the other adapter some lesser priority. Configured this way, when you're connected to the VPN, traffic will flow through the VPN rather than through the normal routes.
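
Before changing anything, it's worth seeing how the adapters are currently prioritized. On Windows this can be checked from the command line; the commands below only display the current metrics and change nothing:

```shell
:: List IPv4 interfaces with their metrics (lower metric = higher priority)
netsh interface ipv4 show interfaces

:: PowerShell alternative, sorted so the highest-priority adapter is first:
:: Get-NetIPInterface -AddressFamily IPv4 | Sort-Object InterfaceMetric
```

The "Met" column in the netsh output is the metric you'll be adjusting below.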

After reprioritizing the adapters my nslookup looked like this instead:

>nslookup
Default Server: dnsserver1.example.com
Address: 192.168.2.22

That's more like it! Instead of the normal public DNS it is now using the VPN's DNS. And guess what? If I attempt to remote desktop, it works. If I ping the internal server I was trying to reach, the response shows that it found the server itself instead of just the gateway. And that's really all there is to it!

In order to change this priority in Windows 10 you have to update the metric for the adapters to get them in the right order. Doing so is simple. Just remember: a lower metric means a higher priority. Here's how to do it in Windows 10.

To configure the Automatic Metric feature:
1. In Control Panel, double-click Network Connections.
2. Right-click a network interface, and then click Properties.
3. Click Internet Protocol (TCP/IP), and then click Properties.
4. On the General tab, click Advanced.
5. To specify a metric, on the IP Settings tab, click to clear the Automatic metric check box, and then enter the metric that you want in the Interface Metric field.
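
If you prefer the command line over the GUI, the same metric change can be made from an elevated prompt. The adapter names below are just examples from my machine; substitute the names netsh reports for your own VPN and physical adapters:

```shell
:: Give the VPN adapter the lowest metric (highest priority)...
netsh interface ipv4 set interface interface="TAP-Windows Adapter V9" metric=1

:: ...and push the physical adapter below it
netsh interface ipv4 set interface interface="Ethernet" metric=25
```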

And that's it. Just bump the VPN adapter up to the higher priority and poof, the problem goes away. Here are netstat outputs from before and after. You can see that the order of the two highest-priority interfaces has switched.

BEFORE 
Interface List
13...b1 33 a4 6e c1 41 ......Intel(R) Ethernet Connection (2) I219-V
19...02 f2 1a 7d a7 06 ......TAP-Windows Adapter V9
 1...........................Software Loopback Interface 1
15...00 00 00 00 00 00 00 e0 Microsoft ISATAP Adapter #3
18...00 00 00 00 00 00 00 e0 Microsoft ISATAP Adapter #5
22...00 00 00 00 00 00 00 e0 Microsoft Teredo Tunneling Adapter

And another netstat with it fixed:

AFTER 
Interface List
19...02 f2 1a 7d a7 06 ......TAP-Windows Adapter V9
13...b1 33 a4 6e c1 41 ......Intel(R) Ethernet Connection (2) I219-V
 1...........................Software Loopback Interface 1
15...00 00 00 00 00 00 00 e0 Microsoft ISATAP Adapter #3
18...00 00 00 00 00 00 00 e0 Microsoft ISATAP Adapter #5
22...00 00 00 00 00 00 00 e0 Microsoft Teredo Tunneling Adapter

What about our ping test? Let’s try it.

>ping dev21.example.com

Pinging dev21.example.com [192.168.3.16] with 32 bytes of data:
Reply from 192.168.3.16: bytes=32 time=66ms TTL=127
Reply from 192.168.3.16: bytes=32 time=56ms TTL=127
Reply from 192.168.3.16: bytes=32 time=58ms TTL=127
Reply from 192.168.3.16: bytes=32 time=56ms TTL=127

DNS is now working and my ability to remote desktop via computer name while on VPN has been restored! I hope this helps someone!

NUnit or xUnit or MSTEST

I was recently asked to recommend a unit testing framework for a team of about ten developers who work in a .NET environment within Visual Studio. There were no limitations specifically called out and very few preconceptions that would make some decisions harder than others. Several of the devs had never used an automated unit testing framework before, and among those who had there was a fairly even mixture of xUnit and NUnit users. Interestingly, nobody had used MSTest, the product built into Visual Studio, and nobody seemed excited about trying it – this seemed to be based on vague recollections of other colleagues telling them it wasn't quite up to snuff. Nobody was unbendable, so it was really just a matter of finding something that would work for all of them.

I didn't want to simply say "I recommend X" because, as a general rule, I like to provide a bit of information about WHY I make a specific recommendation. So I decided to write up a little of the research I've done and share it with everyone. If you or someone you know is in the same decision-making mode, then you should consider reading this and the linked sources. It will provide some of the information you need to decide for yourself and to know which things might be worth looking into further. Here goes.

Source 0:
http://xunit.github.io/docs/why-did-we-build-xunit-1.0.html
This is worth reading to understand xUnit and its history. xUnit was created by the creators of NUnit due to some NUnit shortcomings and philosophical differences / learning from previous mistakes.

Source 1:
http://www.slant.co/topics/543/compare/~nunit_vs_xunit-net_vs_mstest
NUnit – the first writer seems to lean in this direction, mostly because of NUnit's documentation. However, he also admits that NUnit's update frequency may be lagging.

xUnit – also good reviews, comparable to NUnit. Lots of extensions (many shops wouldn't use most of them, or would create their own, but options are good). Many seem to think that xUnit is more user friendly.

MSTEST – good because it's integrated into VS, but that's about it. This is a recurring lone good point. It hasn't been updated in several years (important for later reviews) – a recurring complaint.

Source 2:
http://blog.ploeh.dk/2010/04/26/WhyImmigratingfromMSTesttoxUnit.net/

This is a post about why someone decided to leave MSTest for xUnit. The date (2010) should be noted, but supposedly (see source 1 above) MSTest hasn't really been updated since about that timeframe anyway.

The big ones:
They mention several reasons for migrating from MSTest to xUnit, but the ones that would matter most to me are listed below.
1. Visual Studio actually breaking tests, which causes false positives and wasted time.
2. The inability to run parameterized tests. I would want to write one test and feed it several groups of data, and there's no support for that in MSTest.

Example of how that works in xUnit (the test name and body here are just illustrative):

[Theory]
[InlineData(4, 5)]     // weight, zone
[InlineData(3.6, 2)]   // weight, zone
[InlineData(75, 27)]   // weight, invalid zone
public void ShippingCost_IsComputed(double weight, int zone) { /* runs once per InlineData row */ }

Now I can write one xUnit test which exercises all of the above "theories", and it can fail differently for each one.

In MSTest I would have to write a separate test for each case even though they’re testing the same thing (yes, this is probably more integration test than unit test, but it would be done with the same unit testing framework as all the other tests).

Even if it's a bad idea to create tests this way (testing multiple things in one test is borderline not "unit" testing), I'd still like the ability to do so in the right situation. As the developer, you should get to decide when the situation calls for stepping outside the "best practices" umbrella, and be wise about when you do.

Take away quote from this article as it relates to MSTest:
"It's not one big thing, it's just a lot of small, but very annoying things. After three iterations (VS2005, VS2008 and now VS2010) these issues have still to be addressed, and I got tired of waiting."

Source 3:
http://stackoverflow.com/questions/22650263/visual-studio-2013-mstest-vs-nunit

Also mentions that MSTest hasn't changed much since its original version.

NUnit is declared to be "better" than MSTest, but xUnit is recommended over NUnit.

Parameterized tests are mentioned here again. With MSTest you would have to use XML files, and things just get complicated. The other two handle it much more nicely.

Source 4:
https://www.reddit.com/r/csharp/comments/4198ei/should_i_use_nunit_or_xunit/?st=ipzluxp0&sh=9b0c568d

A discussion about NUnit vs xUnit. If the page were given 10 votes, I'd say 7 of the 10 were for xUnit. Pretty much everyone agreed that they are comparably good: just pick one and use it.

Source 5:
http://gavindraper.com/2015/05/19/asp-net-5-on-os-x-unit-testing/

Since Apple is a customer of the developer shop, it's always nice to be able to say that, in theory, what we do could be done on a Mac as well. Based on the link above, I get the feeling it would be fairly straightforward to use the exact same test source code on a Mac running Visual Studio as on a Windows machine running Visual Studio. Note: one of the developers on the team wondered whether xUnit would be a problem for developing on a Mac and whether tests would have to be rewritten, etc. The link cited above (source 5) seems to say otherwise and put the dev's mind at ease.

Source 6:
https://app.pluralsight.com/player?course=automated-testing-fraidy-cats&author=julie-lerman&name=automated-testing-fraidy-cats-m2&clip=6&mode=live
The number of possible assertion types in NUnit vs MSTest is heavily tilted in NUnit's favor: more than twice as many built in. This is important because it means your tests can be made slightly more specific with slightly less programming. Since writing tests often feels like "extra work" to those who are new at it, the best situation involves having as many easy ways to do it as possible.

Source 7:
https://en.wikipedia.org/wiki/List_of_unit_testing_frameworks#.NET_programming_languages
A list of lots of other options if it turns out you don’t like any of these 3 for some reason.

Howto set up:
xUnit – https://xunit.github.io/docs/getting-started-desktop.html (pretty easy looking: supposedly two NuGet packages are all you need)

nUnit – https://www.nuget.org/packages/NUnit/ (nuget will get you there)

MSTest – fully integrated, so there is nothing to do to be able to use it
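
For completeness, if your project uses the dotnet CLI rather than the NuGet UI, the installs for the two non-integrated options look roughly like this (the runner/adapter packages are what let Visual Studio discover the tests; package names are the standard ones on nuget.org):

```shell
# xUnit: the framework plus the Visual Studio test runner
dotnet add package xunit
dotnet add package xunit.runner.visualstudio

# NUnit: the framework plus the test adapter
dotnet add package NUnit
dotnet add package NUnit3TestAdapter
```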

My recommendation:

In the end I recommended xUnit. This is mostly based on a couple of things. xUnit ships updates far more often than NUnit, and that matters to companies whose development teams tend to keep their environments current. For example, when Visual Studio 2015 came out, NUnit suddenly quit working. This broke developers' machines and forced them to choose between not testing, rewriting their tests, or using an old version of NUnit to work around it. xUnit didn't have the same problems. To be clear, I'm not saying it was NUnit's fault; I'm just saying it happened, and it took several months for a new version with the fixes to be released. That's a long time to go without your tools working right when you're trying to get a piece of software ready to ship or released to a production environment.

All that said, even though xUnit is my recommendation, I don't fault anyone for using NUnit. I tend to agree with the sentiment in source 4: xUnit and NUnit are comparably good, and using either one will be of benefit. If you have your own thoughts, feel free to add them in the comments section below. Thanks for reading, and I hope you found the information useful.


Security in the Cloud

As long as cloud computing has been a buzzword in the field of technology, cloud security has stood as a major adoption inhibitor for many organizations. After years of worry, the Open Data Center Alliance (ODCA) is telling people that it is okay to embrace the cloud, according to CloudPro UK.

At an ODCA event, Marc Ramselaar stated that accidents attributed to the cloud often are not true cloud issues, and that in recent high-profile incidents the perception was not the reality. Ramselaar believes that what is actually a security issue with mobile phones (or whatever other device is accessing the data in the cloud) is being perceived as something enabled by the cloud. He argues that it's not the cloud that is insecure, but rather how people are using it.

In my opinion you can't really separate the two when you're talking about people using the tool. To argue that it's not the cloud but the device is akin to arguing that it's never a computer's fault for being easy to steal data from: it's always the user's fault for choosing easy-to-guess passwords. If cloud computing is to be truly secure, then it is important to have policies in place that enhance security, and that will mean not letting insecure devices have access to data that is meant to be secure.

There is a tendency to outsource cloud services (storage, computing, software), and that tendency may require a greater level of monitoring and control in order to maximize security.

There's also the concern of who is responsible for securing data. Much cloud technology is a service, but providers often do not want to put themselves on the line for your data. In my mind I'm okay with this to a degree. If the data is accessed via an internal breach, then the service provider should be held responsible. But if you write an iPhone app that accesses data in the cloud in an unsecured fashion, then you're the one responsible.

Scarily, at the time of this writing, 75 percent of providers (people who sell cloud-based services) believed that their services did not adequately protect their customers' data. In addition, 69 percent thought that securing that data was not their responsibility. This is scary because of the blanket statement being made: it sounds like these providers believe they are never responsible for the data, when they are in fact the last line of defense.

The fact is that good protection of data in a cloud environment requires effort from both parties, and even though it may not be clear who bears the responsibility, the cost of a security snafu is one that no one wants to bear. If the data is yours, then you need to protect it, and you need to know that the services you use in your own computing systems are designed and built with security in mind.

[Note: this post was transplanted from a previous project of mine at securityincloudcomputing dot com. I let that site go some time ago but felt the article was evergreen enough to move.]