The Essential Burp Suite

OK, we have downloaded Burp Suite. Let's start the tool.

1. To make sure Burp Suite has enough memory to work with, it's prudent to specify how much system RAM is allocated to it right at launch. There is also a small caveat we should be aware of.

   like this:  java -jar -Xmx2048M /path/to/Burpsuite.jar

     or like this:  java -jar -Xmx2G /path/to/Burpsuite.jar

  Sometimes the browser shows a cryptic error like the following:

    Burp proxy error: Permission denied: connect

 Why does this matter? It happens because Java is trying to use the IPv6 interface, so all we need to do is tell Java to use the IPv4 stack by passing the following parameter to the runtime:

      java -Xmx2G -Djava.net.preferIPv4Stack=true -jar /path/to/Burpsuite.jar

Then we should configure the web browser to use Burp Suite as its proxy, so that we can intercept all the information we want to see passing between the client and the server. Besides that, we can also use invisible proxying to intercept traffic from proxy-unaware clients, such as mobile or thick-client applications.
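
As a rough illustration, here is a minimal Python sketch (using the third-party requests library) that sends a request through Burp's proxy listener on its default address of 127.0.0.1:8080. The target URL is just a placeholder, and verify=False is used because Burp re-signs HTTPS traffic with its own CA certificate.

    import requests

    # Assumes Burp's proxy listener is running on the default 127.0.0.1:8080.
    proxies = {
        "http": "http://127.0.0.1:8080",
        "https": "http://127.0.0.1:8080",
    }

    # Placeholder target URL; verify=False because Burp re-signs TLS traffic.
    resp = requests.get("https://example.com/", proxies=proxies, verify=False)
    print(resp.status_code, len(resp.text))

Any request sent this way shows up in the Proxy history just like ordinary browser traffic.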

2. Spending some quality time figuring out the scope, adding the required target URLs, and making sure our inclusion lists cover everything we need will save us a lot of time and effort while using the other tools of the suite; it might also be mandatory, depending on the testing activity we are planning to do.
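
To make the idea of an inclusion list concrete, here is a small Python sketch. This is not Burp's actual scope logic, just an illustration; the hosts and path prefixes below are hypothetical.

    from urllib.parse import urlparse

    # Hypothetical inclusion list: (host, path prefix) pairs that count as in scope.
    SCOPE = [
        ("www.example.com", "/app/"),
        ("api.example.com", "/"),
    ]

    def in_scope(url):
        """Return True if the URL matches one of the inclusion rules."""
        parsed = urlparse(url)
        return any(parsed.hostname == host and parsed.path.startswith(prefix)
                   for host, prefix in SCOPE)

    print(in_scope("https://www.example.com/app/login"))  # True
    print(in_scope("https://cdn.example.com/logo.png"))   # False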

3. Intruder

     The tool is incredibly flexible and infinitely customizable.

4. Scanner

   An active scan is a great idea when we have full control over what is being scanned. In active scan mode, Burp sends different kinds of requests to the application and, based on the responses, verifies whether a particular kind of vulnerability exists or not.

   The Scanner options can be customised to control what types of values are fuzzed as part of the active scanning mode.
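
This is not how Burp's Scanner works internally, but the following Python sketch shows the kind of request/response check an active scan relies on: inject a test value into a parameter and look for tell-tale strings in the response. The target URL, payloads, and error signatures below are purely illustrative.

    import requests

    # Purely illustrative target, payloads, and signatures.
    TARGET = "http://testsite.example/search"
    PAYLOADS = ["'", "\" OR \"1\"=\"1", "<script>alert(1)</script>"]
    ERROR_SIGNATURES = ["SQL syntax", "ODBC", "Unclosed quotation mark"]

    for payload in PAYLOADS:
        resp = requests.get(TARGET, params={"q": payload}, timeout=10)
        reflected = payload in resp.text                               # possible XSS indicator
        db_error = any(sig in resp.text for sig in ERROR_SIGNATURES)   # possible SQLi indicator
        print(f"payload={payload!r} reflected={reflected} db_error={db_error}")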

5. Spidering

    Spidering, or web crawling as it is better known, is the process of automatically following all the links on a web page to discover both the static and dynamic web resources of the web application. Burp uses the Spider tool to automate this mapping of the application.
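
Burp's Spider is far more capable than this, but as a bare-bones sketch of the crawling idea, the standard-library Python below follows every link it finds while staying on the starting host (the start URL is a placeholder).

    from html.parser import HTMLParser
    from urllib.parse import urljoin, urlparse
    from urllib.request import urlopen

    class LinkExtractor(HTMLParser):
        """Collect the href value of every <a> tag on a page."""
        def __init__(self):
            super().__init__()
            self.links = []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                for name, value in attrs:
                    if name == "href" and value:
                        self.links.append(value)

    def spider(start_url, max_pages=20):
        """Breadth-first crawl that stays on the starting host."""
        host = urlparse(start_url).hostname
        queue, seen = [start_url], set()
        while queue and len(seen) < max_pages:
            url = queue.pop(0)
            if url in seen or urlparse(url).hostname != host:
                continue
            seen.add(url)
            try:
                html = urlopen(url, timeout=10).read().decode("utf-8", "replace")
            except OSError:
                continue
            parser = LinkExtractor()
            parser.feed(html)
            queue.extend(urljoin(url, link) for link in parser.links)
        return seen

    print(spider("http://example.com/"))  # placeholder start URL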

 In most cases, if a website's owner doesn't want it crawled by Google or Baidu, they add the following to robots.txt:

    User-agent: *
    Disallow: /
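
A polite crawler consults robots.txt before fetching pages; Python's standard library can parse these rules directly (the URL below is a placeholder).

    from urllib import robotparser

    rp = robotparser.RobotFileParser()
    rp.set_url("http://example.com/robots.txt")
    rp.read()

    # Would print False if the site served the "User-agent: * / Disallow: /" rules shown above.
    print(rp.can_fetch("*", "http://example.com/admin/"))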

 

  

   

   

 
