[COURSE_PTHE] 13. Web Applications

1. Introduction: Web Applications

    This lesson looks at the differing aspects of web application exploitation, how to detect attacks, and how to assess web applications so you can defend against those attacks.

    Welcome to the Web Applications module of our Penetration Testing and Ethical Hacking course. The Web Applications module looks at the attack vectors behind web application hacks.

    The Web Applications module differs from the previous Hacking Web Servers module because it presents an opportunity to look at and discuss both old and new technology and web architecture – the factor that defines web hacking. If you know the architecture, you know how to approach and set up your hacking session when conducting your penetration test.

    The Web Applications module also focuses on the dynamics of today’s bundled technology and the coming together of web components such as search, mobile devices, the user experience, other web applications and social networking. You’ll learn the basics of how old and new web technologies come together and how they can be exploited, so you can identify penetration testing opportunities and parameters.

    The topics explored in the Web Applications module include:

  • Whiteboard, which shows the interrelationship of all the basic components utilized for this module
  • And the following simulation labs:
    • burpSuite Lab
    • HTTP Lab
    • IDServ Lab
    • nikTo Lab
    • virusTotal Lab
    • wGet Lab

    Leo Dregier here. Now, in the last module we talked about the web server components themselves, but now we get to talk about web applications, and this is literally a Pandora’s box. There is so much to talk about – this is a jam-packed module – because we get to tie everything from the old and the new together and then attack and exploit it all. So for the attack vectors, you have got to look at the basics here: what is the architecture and the way the web server is set up? If you know that, it tells you exactly how to approach the attacks. Now, your traditional threats – confidentiality, integrity and availability – still exist, but there are other principles involved that are very, very important as well, for example authentication, authorization and session handling. We also get to tie in a lot of the new technologies, because most of our mobile components – phones, iPhones, Android – connect to web-based applications themselves. So while it may seem as simple as HTML going back and forth between the client and the server, there is a lot more to it than that. In the Web 1.0 world it was very static, but now we are in the Web 2.0 world, where we have lots of dynamic content, and there is realistically so much to talk about: you have search capabilities, you have social networking, you have mobile applications, you even have registration components and things like Ajax, which really just enhance the user experience. So whether it is something like Wikipedia or email or instant messaging, all of this gets tied into the web application.

    Now, what works very, very well for us as penetration testers is that most software developers are not security engineers, and most security engineers are not software developers. Only in the last five or six years have those worlds actually been coming together, and they have been very resistant to doing it. So we can basically take apart the whole lifecycle in which applications get developed and exploit it. Also, the hacker has a huge head start here, because how long is it going to take for company X to find a vulnerability, write some code to fix it, take it through their patch management process, move it through the change control process and ultimately get to the point where they have fixed the problem? So between all of the technologies that we bundle together here and all of the lifecycle components, there is lots of room for penetration testing. Now, in this particular module there is realistically so much information that it could be its own five- to ten-day course. As a matter of fact, most of the time when I am teaching live, the most common critique is “man, I wish we had more time.” So we are going to move very, very fast through this module, but nonetheless you can take it apart one step at a time. So let us go ahead and get started.

 

2. Framework

    This whiteboard lecture video covers web applications in great detail. Web applications play an important role in every organization. Cyber defense requires a thorough understanding of web application security issues.

Welcome to Web Application Pen Testing. This whole module is huge – there is so much you could realistically do here. It could take months and months to build proficiency; this is its own course, its own subject matter, its own field of study – so we are going to highlight it as opposed to getting into all the nitty-gritty details. When it comes to web applications, there was a book written some time ago called The Web Application Hacker’s Handbook, and it will be the go-to book for years and years to come. This book has been light years ahead of just about anything else you will find on this subject matter. There are some other really good books out there – I don’t want to discount those – but this is basically the new foundation for how to approach this whole field. So the concept here is: besides the obvious web application, what are the components, and how do they go wrong? Let us start with some basic concepts. Web applications are where we get cross-site scripting, which exploits the relationship between the client and the server. Besides cross-site scripting you also have vast amounts of information leakage – some of it not harmful in itself, but it could result in a secondary attack which is harmful. Information leakage can be as simple as error messages: they don’t seem like a big deal, but if an error message gives you the insight necessary to exploit the service, that ultimately could lead to credit card numbers leaving your website. Content spoofing – websites hold content, legitimate content and bad content, and you can spoof that content. Weak authentication – everything about the authentication process; authentication in a nutshell is “I am a claimed identity – prove it.” We all connect to a web server, so let us exploit that relationship – cross-site request forgery; you kind of have to read this one backwards: it is a forged request that goes from one site to another. Brute forcing – “Mama, are we there yet? Mama, are we there yet? Mama, are we there yet?” – keep trying things over and over and over again; if someone is not monitoring the web application, you can just have at it. Predictable resources – for example, it is pretty well known that /administrator is normally the administrative login for a web portal; that stuff realistically should be changed. SQL injection – its own field of study in the world of databases. Most web resources are stored in a database, whether it be MySQL or MSSQL or Oracle, and SQL injection is its own field of study. Session fixation – this is where the attacker fixes a session identifier in advance and needs to get the victim to use it: “hey, please click on this,” because that will get him in; you have tricked the user into clicking on something, and that ultimately exploits the session. No session expiration, or indefinite session times – this is another problem with web servers. Then there are Web 1.0 concepts versus Web 2.0 concepts. Web 1.0 sites were more or less static – resumes, or all about the business at hand. Web 2.0 is less about the business and more about the end users using the website. A great example of Web 1.0 versus Web 2.0: a very static web page is Web 1.0; things like YouTube, or any customer-oriented website where it is all user-generated content or user-generated features – those are Web 2.0 concepts.
So we need to look at those just a little bit more. Things like blogs – you can have everybody and their uncle go to a website and post a blog; that is user-generated content, therefore Web 2.0. You have concepts like Ajax – Google uses this, YouTube uses this; this is when you start typing and it starts to predict and narrow down what you are actually searching for. Even Flash you could consider Web 2.0-oriented, or tools like jQuery, or cloud concepts in general – storing things out on the internet in some sort of public fashion; a great example of cloud storage would be Dropbox. Or Wikipedia, online dictionaries and things like that, or gaming sites, or your traditional RSS, or social networking in general. So we are very much Web 2.0 now: it is all about the end users and making the website valuable for the users. The site that you are on right now is Web 2.0-oriented – it is focusing on the end users. So when it comes to hacking web applications, there are all sorts of threats, all sorts of things that could go wrong. Things like cookie poisoning – web servers store little pieces of data on your client-side computer, and you can poison them. Directory traversal – navigating, predicting and enumerating what the directories look like; very easily you can figure out if it is Unix/Apache-oriented versus Windows-oriented. Unvalidated input – can you just supply anything to the web server? SQL injection – notice I have a note here: cheat sheets. I use cheat sheets whenever I possibly can. There are SQL injection cheat sheets and cross-site scripting cheat sheets – use them to your advantage instead of trying to memorize all of the possible combinations of SQL injection for MSSQL or MySQL or Oracle. Same thing with cross-site scripting – if you want to know what the payloads look like, there are plenty of cheat sheets out there; use them. It is a great way to keep a lot of information at the tips of your fingers. Cross-site request forgery – what we talked about. Form tampering. Insecure storage – how does the web server store the information that users are uploading? Are there picture directories, are there video directories, and what are the permissions on those directories? How are you handling errors – are you giving an error message to your end user, and does the end user even need to see that? Buffer overflows are their own field of study. Log tampering – clearing your logs, changing their integrity. All of the account management – remember, Web 2.0 is user-centric, so users are going to need accounts; how do you manage them, and how do you handle things like password-reset functions? How do you manage sessions – can you just go to the online site and add something to a checkout without creating an account? Ultimately you are storing things on a server. Platform-specific exploits, either within the applications themselves – be it Joomla, Drupal, WordPress – or maybe at the operating system level: is it Unix versus Windows, etc. Authentication hijacking – see the session hijacking module. Cookie snooping – just finding out what is in the cookies. Session fixation – tricking the user. Malicious code execution. Denial of service – that is its own field of study. No encryption – no SSL, no IPSec, no transport-level security. Even XML poisoning. These are all potential threats, all the things that can go wrong in the world of web application pen testing.
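To make the cheat-sheet idea concrete, here are two of the classic probe strings those sheets collect, wrapped in curl so they are runnable. These are hypothetical examples – target.example is a placeholder, and you should only send them at applications you are authorized to test:

    # Classic SQL injection probe: a stray quote plus an always-true
    # condition; a database error or a changed page suggests unvalidated input.
    curl "http://target.example/item.php?id=1'%20OR%20'1'='1"

    # Classic reflected XSS probe: if the app echoes the parameter back
    # unencoded, a browser would execute the script.
    curl "http://target.example/search?q=<script>alert(1)</script>"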
So let us look at the countermeasures: become an expert – it is really that simple. Normally I would list off the top ten or fifteen here; in this case there are just way too many – I would fill this whole board four or five times over. There is so much that goes on in this subject matter that you have to become an expert; this is not something you are going to learn overnight. Learning each one of these techniques – just learning cross-site scripting and form tampering – could take a lot, as could buffer overflows or SQL injection. For some of it you have to learn the whole field of databases before you can become good at SQL injection, and if you want to become a great web application pen tester, you have to understand all of the components of HTML. One of the best tools out there for learning this is a tool called Burp Suite; it is something you can use to dissect the web request/response process over and over and over again. So let us go ahead and look at some hands-on examples.
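Before moving into the labs, here is what checking the “predictable resources” idea from the whiteboard looks like from a shell. A minimal sketch – target.example is a placeholder, and you should only probe hosts you are authorized to test:

    # Check a few well-known "predictable resource" admin paths and
    # report the HTTP status code each one returns.
    for p in /administrator /admin /wp-admin /login; do
      code=$(curl -s -o /dev/null -w '%{http_code}' "http://target.example$p")
      echo "$p -> $code"
    done

A 200, or a redirect to a login page, is worth a closer look; a 404 usually means the path is not there.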

 

3. The burpSuite Tool

    The first simulation in the Web Applications module introduces burpSuite.

    This lab demonstrates how to use the burpSuite tool, an intercepting proxy within Kali Linux.  It’s a 2-part setup and configuration process – one part is set up in the web browser, while the other part is set up inside the Burp application itself.

    When conducting penetration testing, implementing the correct configuration is essential both to the performance of the test and to obtaining accurate results. This Web Applications lab highlights that point.

    Hi, Leo Dregier here. I just want to show you the basic setup of how to get Burp Suite up and running. Burp Suite is something that you use a lot in the web application pen testing world, and it can be a little tricky to set up if you are not used to it; however, you may also find it to be intuitive. So on the Kali Linux distro, click on Kali Linux, go down to Web Applications and then Web Application Proxies, and go ahead and select burpsuite. Or you can just type burpsuite at the bash prompt. So Burp Suite will come up, and this is basically a two-part setup: one inside the application, and two in your web browser. We have got a little alert here that there is a newer version available, and that is okay – we are not going to update it now; I just want to show the basic setup. So we are going to go into the Proxy options and basically look at how this is set up: listening on my 127.0.0.1 interface, port 8080 specifically. This is a very important setting under the proxy options, because you have to tell your web browser to point its traffic at yourself. As long as this setting matches the one in your web browser, everything will be fine – you can do this locally or remotely; here we are going to do it locally. Next I am going to open up a web browser – in this case we are going to use Iceweasel, and that will be more than fine. We are going to go to the preferences, then the Advanced tab, Network, and down to the proxy section. Under the network connection settings, click on Settings and change it from “use system proxy settings” or “auto-detect” to “manual.” We are going to set this to 127.0.0.1 and we are going to use port 8080 – those are the settings you would want to use. Keep in mind that I could use my IP address or the loopback address, and then I make sure that port 8080 matches. There are plenty of ways to trick an end user into setting up their proxy settings in this fashion: you could do it as an update, you could do it through a group policy – anything is fair game – but once you trick them into pointing to you as a proxy, you get to capture all of their traffic. So go ahead and hit enter and then go to some web page. I am going to go to www.leodregier.com, and you notice that it will kind of freeze and hang out here. This is good, because if it freezes you can go back over to Burp Suite and see the intercepted traffic – that means that my host, or Burp Suite, actually received the request and I can start evaluating that client traffic. So you can see a GET request, the host, the user agent – it is coming from a Linux web browser – it is accepting HTML, application or XML, the language is English, the encoding gzip or deflate, etc., etc. I am going to go ahead and forward that on to the next piece of traffic, and then you can evaluate each one of these, which is nice if you want to analyze slowly, point by point. However, you can also set it to just forward everything, and then the end user won’t experience the delay – because notice here what the end user would see: if we go back over here, the page eventually loaded.
But the end user may actually be you. For example, go to a large site like CNN – notice it freezes; I go here, forward, and CNN still hasn’t loaded yet; I forward the next one, and CNN probably still hasn’t loaded yet; then the next piece of traffic, and the next one, and the next one. So in that case the end user would definitely be impacted in their browsing experience if you don’t keep forwarding these pieces of traffic. Also keep an eye on the history – for example, these requests have flashed by since we started proxying, so you can see a summary here, or you can keep going back and forth through it. Alright, that is basically how you set it up. Then you can start going into the detailed analysis of what each request is doing. Just from the two sites that I went to: for leodregier.com it basically pulled the directory structure, it pulled my favicon, and you can see that it is a WordPress site because of wp-content. You can analyze all sorts of parameters and headers, and if you want to view this stuff in hexadecimal, you can certainly see the actual code there. Even things referenced on the site show up here: there was some Twitter account, Code Of Learning; you can go to the LinkedIn references and see if you can pull any of the LinkedIn names – it shows my first and last name, Leo Dregier – or YouTube, and on YouTube it is Code Of Learning again. So I am ultimately pulling usernames or IDs or naming conventions that are related to this website. It is a great way to socially engineer the different conventions or points of presence that a particular website is using. Then you can go into the other tools: spidering, using this as a scanner, setting it up as an intruder with different payloads; there is a repeater built into this, a sequencer, a comparer for different websites, and some basic options like HTTP or SSL. This is an extremely, extremely powerful tool. We will probably cover more of it in later videos, but I just wanted to show you the basics of setting up Burp Suite. You can use this for sniffing web applications; I would use this in web application pen testing, in session hijacking, in man-in-the-middle attacks, in web application vulnerability identification, maybe even in some exploits. So that is the overview of Burp Suite. Thanks for watching, and check out all of the videos in this series; my name is Leo Dregier, thanks for watching.
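If you prefer a quick command-line sanity check that the listener is wired up correctly, you can push a request through the proxy without touching the browser at all. A minimal sketch, assuming Burp is on its default 127.0.0.1:8080:

    # Send one request through the local Burp listener; it should show
    # up in Burp's Proxy > Intercept tab before being forwarded.
    curl -x http://127.0.0.1:8080 http://www.leodregier.com/

    # For an HTTPS target you would need to trust Burp's CA certificate,
    # or skip verification for a throwaway test:
    curl -x http://127.0.0.1:8080 -k https://www.leodregier.com/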

 

4. Using HTTPRecon

    This next lab in the Web Applications series examines the HTTP reconnaissance aspect of Penetration Testing and Ethical Hacking.

    The HTTPRecon lab demonstrates the HTTP/HTTPS URL traffic scanner. It is a very straightforward tool that gives you all the parameters and statistics for HTTP and HTTPS server traffic monitoring at the server-session level.

    Hi, Leo Dregier here. I want to talk to you about httprecon. It is a really easy and simple tool to use: just load it – in this case I have it in my toolkit – open it up, enter your target, http or https, cybrary.it, port 80, and then analyze the traffic. It basically runs through a script with all of these different parameters; the parameters are set in a configuration, which you can edit when the scan is not running. So just let the tool run and you will get a summary – in this case it is going to guess that it is an Apache server. It sent an HTTP/1.1 request, got a 200, and it guesses that the cybrary.it web server is Apache 2.2.2 using PHP 5.4; it has got a PHP session ID, so I can pull the session ID and try to make sense of that; it pulls some cache information; it found the XML-RPC link; it tries to find the encoding, and it is in the character set UTF-8. Now, specifically, what I like doing with this tool: one, you can open the website in the web browser, so you can go directly to it from here without typing – that is always nice. And in the reporting, to generate the report, go ahead and select all of this and output it as HTML. You could realistically use any of the other report formats, but watch what happens when you do HTML: we save this to the desktop, open it up, and this is perfect, perfect documentation to add to a web application pen testing report. It has got all of your screenshots and pictures embedded into it, so it is real nice and easy to see exactly what went back and forth to and from the server for each of the connection requests, and then you can analyze it. Just having the pictures here is absolutely priceless in terms of documentation. Other than that, it is a really simple tool – you can basically set it up to fingerprint: point it at your server, set it and forget it, boom, it goes. So, a really easy tool to use. Again, thanks for watching; my name is Leo Dregier, and I am sure you have checked this out by now on Facebook, LinkedIn, YouTube and Twitter.
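    httprecon itself is a GUI tool, but the core idea – fingerprinting a server from its response headers – is easy to approximate from any shell. A rough equivalent of the first thing the tool looks at, not a replacement for it:

        # Fetch only the response headers and pull out the fields a
        # fingerprinting tool keys on (server, platform, session cookies).
        curl -sI http://cybrary.it | grep -iE '^(server|x-powered-by|set-cookie):'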

 

5. Using IDServe

    As we continue with the Web Applications module, we now focus on server identification with our next penetration testing tool, IDServe.

    The IDServe lab demonstrates server identification at the web server level and what information you can capture when conducting this web application penetration test.

    IDServe provides all the server session info, including the IP address, the port, the specific server type and its name, as well as the “powered by” information.

    Leo Dregier here. I want to talk about a tool called ID Serve. You can get this at www.grc.com/id/idserve.htm. Once the website pops up, you basically scroll down to the download link. It will download in your web browser, and then you can open the tool – so we open the file, and it is basically right here. You can go to the web page directly from it, but we don’t need to do that; go over to the server query tab, type in www.leodregier.com – or whatever your target is – and query the server. In this case I am going to query my own personal website. Scroll all the way up and basically read this like a book. You can see the web address, the name; it resolves to this IP address; it is listening on port 80. It was able to connect, requested the server’s default page, and the server returned the following response headers: it was listening on HTTP/1.1, the date, the server is Apache, it is powered by PHP 5.3.26 at the moment, and there is an xmlrpc.php file on the web server. It closed the connection, completed the query, and it says the server was identified as Apache. Very, very easy tool to run. Let us run another query, cybrary.it, and you can see that it identifies it as an Apache server on Debian. Or google.com – query the server, and it basically sees that it is a Google web server, GWS; I only did that to show you the variety. Or cnn.com – query the server. This is active in the sense that I am reaching out and touching a target, but it is relatively non-intrusive; the websites are basically disclosing this information. So it gives me a quick sanity check for guessing the type of web server that is running – in this case CNN is using nginx, which is a somewhat popular server, along of course with IIS and Apache. Apache is going to be the most popular web server, but there are others: you saw GWS for Google, you saw nginx for CNN. So if you quickly need to identify the type of platform your destination is running, the ID Serve tool is a great way to get a quick web server footprint. That is it on ID Serve – a really easy tool to use. Hope you enjoy the videos, and I am sure you guys are connected now on Facebook, LinkedIn, YouTube and Twitter.
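    The banner grab ID Serve performs can be reproduced by hand with netcat, which makes it clear how little the query touches the target. A minimal sketch of the same kind of request (note that flag behavior varies slightly between netcat variants):

        # Issue a bare HEAD request and read the Server / X-Powered-By
        # headers straight off the socket.
        printf 'HEAD / HTTP/1.1\r\nHost: www.leodregier.com\r\nConnection: close\r\n\r\n' \
          | nc www.leodregier.com 80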

 

6. Using nikTo

    For this next Web Applications module lab, we utilize nikTo for our web application penetration test.

    This lab covers the basics of nikTo, a utility for scanning web servers at the host level.  With this tool you must have the current version, so you’ll also learn how to query the version and how to download and install the current release.

    NikTo pulls general information, but it also lists all the vulnerabilities it finds and the paths of those vulnerabilities within the host.

    NikTo is a small powerhouse of a utility that provides a wealth of information essential for Penetration Testing and Ethical Hacking.

    Hey, Leo Dregier here. I want to cover how to set up a basic Nikto scan. I am going to switch over to my virtual machine here. I am just going to do nikto -h, and that will bring up the help; Nikto is a relatively easy program to run – it is basically nikto, then the web address, then some additional configuration options. But specifically, what we are going to want to do first is update the tool. That is easy enough in itself: nikto -update – go ahead and let that run, and you will see “this version is outdated, please upgrade to 2.1.5 or better; use the GitHub version.” That is fine, because we can get an updated version of Nikto: apt-get install nikto. You can read this – it says the following packages will be upgraded; it needed 391 kB of archives; it then fetches the archives, reads the database, unpacks the replacement files and sets up the Nikto application. You can see the specific version here for Kali, so that is going to be pretty consistent; it sets it up, and now it is running. So, nikto -update – once you have Nikto updated, go ahead and select a target to scan. We are going to do that with nikto -h, for a host scan. I want to scan my own website, www.leodregier.com – just so you guys know, I do have a web application intrusion detection system, and if you decide to do the same thing, there is a very good chance it is going to notice you. So you might not want to do this to websites that you do not own or that are not yours. Go ahead and do nikto -h plus the target website and hit enter. You see the version, 2.1.6, your target IP, the port, the start time. It found that the server was Apache running PHP 5.3, it reports the anti-clickjacking X-Frame-Options header finding, and an uncommon “link” header was found. At this point you are just going to go get a cup of coffee and come back, and then you can read this, because it will take some time to run; each vulnerability it finds will trickle into this window. The only thing I would do differently in the real world is send the results to a file, and name that file with a date-time stamp and then the destination, .txt, or something like that. You can do that by appending the command I have already entered with a space, a greater-than sign, a space, and whatever file name you want. Hit enter, and all of these results will be sent to the text file. Now, if you get impatient and hit enter a bunch of times, it is not going to help; you basically just have to wait. So we are going to let the tool run, and once it finishes we will do a review. Also, if you want, you can sniff the traffic of the scan, just to see that it is running: I set up a basic Wireshark packet capture to show that the traffic is in fact going out – there are a variety of simultaneous TCP connections; you could also watch this in EtherApe as well. So I will just let it run, and you can watch for a few minutes and get an idea of what the tool is actually doing. Now that the scan is completed, let us make some sense of the scan results. We basically just did a regular Nikto scan against a target website – and again, you should only do this against websites that you have permission to scan.
The server is Apache, with the PHP version number; no CGI directories were found – you can always check that again with -C all to force a check of all possible CGI directories – and then again, there are a lot of positives in there. It checks robots.txt – that is public information anybody can check anyway. It found a control panel – a web server control panel, which is generally popular in the hosting world. You can see whether there is a webmail directory, and then it will start pulling Open Source Vulnerability Database (OSVDB) entries and the actual numbers associated with those; you can research those numbers and get background on each finding. My main complaint about the Nikto web scanner is that it does generate a lot of false positives – you actually have to verify these – but nonetheless, for an open vulnerability scanner it is not too bad. It generally tells you the obvious stuff, like that it is a WordPress site – you probably could have got that just from the directory structure itself. It found an admin login page section, which happens to be under the control panel, and you can see the wp-admin, or WordPress, login page; you might want to hide or protect those just to make the obvious go away. It made approximately 6,613 requests with zero errors.
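Pulling the walkthrough together, a typical run against a site you own might look like the sketch below; the hostname and file names are placeholders, and flag spellings can vary slightly between Nikto versions:

    # Refresh the signature database first.
    nikto -update

    # Scan the host, force the CGI-directory checks, and write the
    # findings to a date-stamped text file instead of the terminal.
    nikto -h http://www.example.com -C all -output nikto-$(date +%F).txt -Format txt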

 

7. Using VirusTotal

    This next lab in the Web Application series introduces you to VirusTotal.

    VirusTotal is a utility that confirms whether the target is or is not known malware.

    This lab delivers a thorough demonstration on the use of this file and URL scanning tool, and helps us understand its importance as a “go-to” resource for Penetration Testing.

    Hey, Leo Dregier here. I want to talk to you about VirusTotal. VirusTotal is just one of many websites that you can go to and submit any sort of file, executable or URL that you want analyzed, and it will basically determine whether it is a known virus or not. So we are going to test this out to see what it identifies. First, go ahead and choose a file – I am going to grab something out of the ethical hacking toolkit here: we go to viruses and worms, virus construction kits, and grab the JPS virus maker. Upload that and then scan it; you have to upload the file so it can compute the hash on it, which allows it to be easily compared to any other known file with the exact same integrity. So: file already analyzed – this file was last analyzed on 12/25, and it was first analyzed back in 2007, so it has been around that long. Detection ratio: 46 out of 52. You can take a look at the last analysis or do it again; we will just look at the last one for simplicity. You can see it computes a SHA-256 hash on it. It identifies it as a backdoor – these are actually all of the names the different vendors use to reference it, so it will come up as any one of these; basically you see that it is a backdoor or trojan. You can also see the different antivirus vendors on the left here – the programs versus the specific virus name each program reports – and then of course the signature updates. Then the details of the file: where it comes from, the file version – you can get pretty good specifics. The tool UPX shows up in the analysis as well; that is another command-line tool you could easily use here. The names, the virtual addresses and the hashes of the different files that are actually used, and what they are called; the number of resources by type; some of the metadata – so you get the idea: any relationship to other hashes, any additional information. Most of these are just hashes here – not really too much more you can learn; it is already obvious. That is relatively it. That is VirusTotal: it is a quick way to grab a file when you wonder whether it is malicious or has anything else around it. I would include that in my quick sanity check – hey, let us look at the obvious first. This is just an easy website that anybody can use – everybody from grandma and grandpa down to a seven-year-old, I am sure. Everybody should know that this site, and websites like it, exist, because when you are analyzing code, others have done the hard work, so tap into that. The real tricky part comes when a website like this does not turn up anything – it is not getting any results – and then you have to wonder: am I in uncharted territory now? My name is Leo Dregier; thank you for watching, and don’t forget to check us out on Facebook, LinkedIn, YouTube and Twitter.
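    Since the comparison is hash-based, you can compute the same fingerprint locally before (or instead of) uploading – useful when a sample is sensitive. A minimal sketch with a placeholder file name:

        # Compute the SHA-256 VirusTotal uses as the file's identity; you
        # can then search virustotal.com for the hash without uploading.
        sha256sum suspicious_sample.exe

        # MD5 and SHA-1 also appear on the analysis page.
        md5sum suspicious_sample.exe
        sha1sum suspicious_sample.exe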

 

8. Using WGet

    The final lesson in the Web Application series focuses on the actual web application/web server data.  WGet is the utility we use to accomplish web data analysis in penetration testing.

    The WGet lab demonstrates how to retrieve data on a web server that you specifically know is there. This is an extremely powerful scripting tool that offers a wide selection of switch options for total customization so you can scan for targeted information.

    You’ll also learn how to use the WGet tool more precisely, including FTP-specific parameters for conducting targeted FTP scans and web application vulnerability searches as part of a thorough penetration testing strategy.

    Leo Dregier here. I want to talk with you about a tool called web get, or wget. This is great for retrieving something from a web server when you know exactly what the target is and what you want to retrieve: it will go out to the internet, grab whatever you want from a particular server, and copy it into the directory you are working in. If I print my working directory, you can see where we are. So we do a wget, and it gives us the format of the command – it says it is missing the URL; we need wget, options, then the URL. If you want to see the detailed parameters, run wget --help, and that will give you the full capabilities of this tool. We are not going to cover all of this in this video, but I just want to illustrate that it is an extremely powerful utility, especially for scripting. You have got basically stuff like version information, whether you would like it to run in the background, the specific logging options, the number of download tries, where you want to put the output, and what you want to do with the server response. If you just want to see whether something is there and not actually download it, you can use this tool in spider mode. You can specify that you only go over IPv4 networks or only over IPv6 networks. There are specific usernames and passwords for FTP and HTTP purposes – so if you need something that requires a username and password, you are certainly going to add that in here for scripting purposes. You can choose whether to grab a directory structure or not. It has got a handful of HTTP options – again usernames and passwords, client settings, headers or cookies or POST strings – and you can add output to your logs. If you want to go over HTTPS, or transport-level security, there are options for which certificate files and versions to use and which private keys to use, and things like that. You have FTP-specific options, and I would use the FTP-specific username and password parameters instead of the generic user parameters from before – if you are talking to an FTP server, use the FTP specifics and it will work much better. And if you want to recurse over files and directories, you have some additional recursive options, like accept this or reject that. Okay, we are going to actually just run a simple version of this: wget http://www.facebook.com/robots.txt. Hit enter and it will go out to the internet to get robots.txt from Facebook. And there you go: resolving facebook.com, it went out connecting to Facebook, found it on this IP, port 80, and connected; it sent the HTTP request, awaited the response, got a 200 OK, and then you can see it basically copied the file and saved it. So if I do an ls I should be able to see that robots.txt is now on our hard drive. Now we can actually look at it – if I want to see the contents of this, you could do it dozens of different ways: you could do head, tail, less, more, whatever you prefer. I am just going to tail the file – tail robots.txt – and you can see the last ten lines (by default) of that file, just to prove there is something there. Or we can do something like less robots.txt and go line by line, and what is interesting here is that you can see all of the spidering that Facebook specifically does not allow.
I have not done anything unethical here – a robots.txt file is available on just about everybody’s site; it is extremely common in the web world, just about every website has one, and it always should be publicly accessible for read access. But I can learn some things from it: I can see there are PHP pages, so they are probably using an Apache server; I can see some of the directories, like hashtags and photos and checkpoint; and it is specific about which spiders it is telling not to index parts of the site. So here is Googlebot – no, don’t check these things; here is ia_archiver – don’t check these things; here is the msnbot – disallow these activities; here are the other robots, each with their own disallowed activities; and Slurp, which is Yahoo’s common indexing program, etc., etc. Nonetheless, you get the basic idea of how to go to a website, grab a file, copy it to the local host and then analyze that file. It is relatively simple, but for any sort of web application pen testing reconnaissance on websites, you definitely want to know how to use wget. So I hope you enjoyed the video, and I will see you in the next one.
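To tie the switch tour together, here are the options discussed above in runnable form; the FTP host and credentials are placeholders:

    # Spider mode: check that a file exists without downloading it.
    wget --spider http://www.facebook.com/robots.txt

    # Limit retries, choose the output file name, and log to a file.
    wget --tries=3 -O fb-robots.txt -o fetch.log http://www.facebook.com/robots.txt

    # FTP-specific credentials (preferred over the generic --user/
    # --password switches when the target really is an FTP server).
    wget --ftp-user=demo --ftp-password='secret' ftp://ftp.example.com/pub/file.txt

    # Recursive fetch restricted to one file type.
    wget -r -A '*.txt' --no-parent http://www.example.com/docs/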
