Often overlooked recon step — analyze the sitemap.xml of target

Jul 18, 2023

Sitemap.xml is still used (and should be anyway, for SEO) alongside robots.txt as a guide so crawlers and bots know which pages to crawl and which to skip. Sitemap.xml takes it a step further:



[Image: DJ Substance displaying sitemap.xml on a random target]
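For reference, a minimal sitemap entry (per the sitemaps.org protocol) contains exactly the fields we will be pulling out; the URL here is a made-up placeholder:

```xml
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/some-page</loc>
    <lastmod>2023-07-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```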

To make this more readable, so you know your attack surface better, create the following script. But first, wget a sitemap.xml file:

wget https://jiffylube.com/sitemap.xml
# Verify the sitemap.xml is populated
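A quick way to verify it is populated (and to spot a sitemap index, which nests `<sitemap>` entries pointing at child sitemaps instead of `<url>` entries) is to count and dump the `<loc>` tags. The `sitemap.xml` filename here is just the file fetched above:

```shell
# Count <loc> entries; zero means the file is empty or structured differently
grep -c "<loc>" sitemap.xml

# Dump the raw URLs; if these all end in .xml, you likely downloaded a
# sitemap index and need to wget each child sitemap as well
grep -o "<loc>[^<]*</loc>" sitemap.xml | sed -e 's/<[^>]*>//g'
```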


#!/usr/bin/env bash
# Call this smscrape.sh ; chmod +x smscrape.sh
# If you have issues, run with bash -x (turn on bash debugging)
# post a comment if your sitemap isn't parsing and I'll help fix it

if [[ $# -eq 0 ]]; then
    echo "Please provide the sitemap.xml file as an argument."
    exit 1
fi

sitemap_file=$1

# Check if the provided file exists
if [[ ! -f $sitemap_file ]]; then
    echo "File $sitemap_file does not exist."
    exit 1
fi

# Define colors
cyan='\033[0;36m'
purple='\033[0;35m'
white='\033[1;37m'
gray='\033[0;37m'
pink='\033[0;95m'
hot_pink='\033[1;95m'
nc='\033[0m' # No color

# Use awk to extract each URL's fields, format output, and highlight priority
awk -F'[<>]' -v cyan="$cyan" -v purple="$purple" -v white="$white" \
    -v gray="$gray" -v pink="$pink" -v hot_pink="$hot_pink" -v nc="$nc" '
/<loc>/        {loc=$3; i++}
/<lastmod>/    {lastmod=$3}
/<changefreq>/ {changefreq=$3}
/<priority>/   {priority=$3; color = (priority > 0.56) ? hot_pink : pink}
/<\/url>/ {
    printf purple "[%d] " nc cyan "URL: " nc white "%s\n" \
           cyan "%s: " nc gray "%s\n" cyan "%s: " nc purple "%s\n" \
           cyan "%s: " color "%s" nc "\n\n",
           i, loc, "Last Modified", lastmod, "Change Frequency", changefreq, "Priority", priority
}' "$sitemap_file"

# eof

Save the file above as smscrape.sh and make it executable with chmod +x smscrape.sh.

Most people don't realize how much info the sitemap can provide. Let's just use this jiffylube.com example. To execute the script above:

bash$ ./smscrape.sh sitemap.xml        # only takes 1 arg

Let's take a look at the output. It should be colored in Bash; if not, try TERM="ansi", then type reset:


<snip> — take a look at the size of this site

The point of showing you this, and recommending you add this approach to your recon toolbox, is not just the URLs (you should be looking for dynamic links like &id=23&test=123, since you could run SQLMap against those). In this case I find the Last Modified, Change Frequency, and Priority just as interesting: information that won't be found elsewhere.
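To hunt those dynamic links specifically, here is a rough one-liner sketch (the sitemap.xml filename and dynamic_urls.txt output name are just examples) that keeps only URLs carrying a query string:

```shell
# Extract every <loc> URL, keep only those containing a ? (query parameters),
# and save them as SQLMap candidates
grep -o '<loc>[^<]*</loc>' sitemap.xml \
  | sed -e 's/<[^>]*>//g' \
  | grep -F '?' > dynamic_urls.txt

# Then test each candidate, e.g.:
# while read -r u; do sqlmap -u "$u" --batch; done < dynamic_urls.txt
```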


Modify the script to sort by Priority (as far as I know the range is 0–1, so 0.5 is valid, with 1 being the highest priority). Combined with the change frequency, this tells you which files to "keep an eye on". Once you find a potential attack target that is modified often, start making nightly copies of the file (use a proxy). Over a week, diff <file1> <file2> <file3>, etc.
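A minimal sketch of that priority sort (assuming the usual one-tag-per-line sitemap layout, where each <priority> follows its <loc>):

```shell
# Pair each URL with its priority, sort descending so the highest (1.0) is on top
awk -F'[<>]' '
/<loc>/      {loc=$3}
/<priority>/ {print $3, loc}
' sitemap.xml | sort -rn | head -20

# For the nightly-diff idea, snapshot a high-priority URL from cron
# (the target URL below is a placeholder) and diff successive copies:
# wget -q -O "copy_$(date +%F).html" "https://target.example/high-priority-page"
# diff copy_2023-07-17.html copy_2023-07-18.html
```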

If this doesn't jump out as a useful tool, then you should probably read up on Mozilla's HTML5 basics. Hopefully this helps y'all.

Any feedback is always appreciated.

Twenty years professionally as a Network Engineer; more recently I have focused mostly on red teaming, but I am always up for learning and exchanging info.