The introduction of the scan export/import feature opens up the ability to merge sites, at least through the Ruby gem.
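
If you just want to see the calls the gem exposes for the feature, a minimal sketch looks like this (the console address, credentials, scan ID, and destination site ID are all placeholders):

```ruby
require 'nexpose'
include Nexpose

nsc = Connection.new('your-console-address', 'nxadmin', 'openSesame')
nsc.login

# Export a finished scan to a zip file, then replay it into another site.
# The scan ID (123) and destination site ID (42) are placeholders.
nsc.export_scan(123, 'scan.zip')
nsc.import_scan(42, 'scan.zip')

nsc.logout
```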

Imagine a scenario where you had split up your assets into several sites, but now you realize it would be easier to manage them if you just merged them into one. Maybe you have duplicate assets across sites and that wasn't your intent. The script below allows you to merge multiple sites into one: it replays the scans from each site into the new one, in just a fraction of the time it originally took to run them.

I'll mostly let the comments in the script speak for themselves, but here's a quick run-through:

  1. Designate the sites to merge into a new site
  2. Copy the asset configuration from those sites into the new one
  3. Collect the scans to merge (the script only pulls in scans from the last 90 days)
  4. Import those scans into the new site in chronological order

```ruby
#!/usr/bin/env ruby
require 'nexpose'  
include Nexpose  
  
nsc = Connection.new('your-console-address', 'nxadmin', 'openSesame')  
nsc.login  
at_exit { nsc.logout }  
  
# You can merge any number of sites into one.  
to_merge = [38, 47]  
  
unified_site = Site.new('Unity!')  
  
# Grab assets from each merge site and add them.  
to_merge.each do |site_id|  
  # Merge based on configured assets.  
  site = Site.load(nsc, site_id)  
  unified_site.assets.concat(site.assets)  
  
  # To merge based on actually scanned assets:  
  # nsc.assets(site_id).each do |asset|  
  #   unified_site.add_asset(asset.address)  
  # end  
end  
  
# Will still need to configure credentials, schedules, etc.  
unified_site.save(nsc)  
  
# Collect the scan history from each site, limited to the last 90 days.  
since = (DateTime.now - 90).to_time  
scans = []  
to_merge.each do |site_id|  
  recent_scans = nsc.site_scan_history(site_id).select { |s| s.end_time > since }  
  scans.concat(recent_scans)  
end  
  
# Order them chronologically  
ordered = scans.sort_by { |s| s.end_time }.map { |s| s.scan_id }  
  
zip = 'scan.zip'  
ordered.each do |scan_id|  
  nsc.export_scan(scan_id, zip)  
  nsc.import_scan(unified_site.id, zip)  
  
  # Poll until scan is complete before attempting to import the next scan.  
  history = nil  
  loop do  
    sleep 15  
    history = nsc.site_scan_history(unified_site.id)  
    break unless history.nil? || history.empty?
  end  
  last_scan = history.max_by { |s| s.start_time }.scan_id  
  while nsc.scan_status(last_scan) == 'running'
    sleep 10  
  end  
  File.delete(zip)  
  puts "Done importing scan #{scan_id}."  
end
```
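
Once the loop finishes, a quick sanity check against the same site_scan_history call can confirm that every scan was replayed into the unified site. This assumes the nsc, unified_site, and ordered variables from the script above are still in scope:

```ruby
# Confirm the unified site now lists one history entry per imported scan.
imported = nsc.site_scan_history(unified_site.id)
puts "Imported #{imported.size} of #{ordered.size} scans into site #{unified_site.id}."
```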

Just a few caveats. With scan import, you pull in all the data from the scan as it originally ran; if you deleted assets, for example, they will return in the new site. The looping mechanism in the script is also a bit clunky. If you have a lot of data to import, you may want to export all the scans up front and change how you wait for scans to finish, to make the process more fault tolerant; a rough sketch of that approach follows. Finally, scan import is all or nothing: there is no way to split up the assets that were in a scan.
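
Here is a rough sketch of that more fault-tolerant variant. It reuses the nsc, unified_site, and ordered variables from the script above; only the export_scan, import_scan, site_scan_history, and scan_status calls come from the gem, while the file naming and the wait_for_import helper are just illustrative:

```ruby
# Export everything up front so a failed import doesn't cost you the exports.
exports = ordered.map do |scan_id|
  file = "scan-#{scan_id}.zip"
  nsc.export_scan(scan_id, file)
  [scan_id, file]
end

# Illustrative helper: wait until the site shows one more scan than before
# and that newest scan is no longer running.
def wait_for_import(nsc, site_id, previous_count)
  loop do
    sleep 15
    history = nsc.site_scan_history(site_id) || []
    next if history.size <= previous_count
    newest = history.max_by { |s| s.start_time }
    return if nsc.scan_status(newest.scan_id) != 'running'
  end
end

exports.each do |scan_id, file|
  before = (nsc.site_scan_history(unified_site.id) || []).size
  nsc.import_scan(unified_site.id, file)
  wait_for_import(nsc, unified_site.id, before)
  File.delete(file)
  puts "Done importing scan #{scan_id}."
end
```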