cyberax

Website screenshots

5 posts in this topic

Hi all!

Does anyone know of a way to take snapshots of a webpage and save them to an image without using an X server? I have done a lot of looking around and have found nothing. Ideally it would be something you could run from the command line. I'm an admin for my computer society's webservers and we need to take hourly snapshots of our webpages. The server is FreeBSD based. Our current method is kinda messy: it involves starting X every hour and killing it horribly in a script. A cleaner method would be great.

Any suggestions?
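
One cleaner option would be Xvfb, the virtual framebuffer X server. It is still technically an X server, but it needs no display hardware and can be started and stopped cleanly from a cron script instead of killing a real X session. A rough sketch, assuming Xvfb, Firefox and ImageMagick's import are installed from ports; the URL and output path are placeholders:

#!/bin/sh
# snapshot.sh - rough sketch: render a page on a virtual framebuffer and save a PNG
URL="http://www.example.com/"            # page to capture (placeholder)
OUT="/var/snapshots/page-$(date +%Y%m%d%H%M).png"

Xvfb :99 -screen 0 1024x768x24 &         # headless X server, no display hardware needed
XVFB_PID=$!
DISPLAY=:99 firefox "$URL" &             # render the page on the virtual display
BROWSER_PID=$!
sleep 30                                 # crude wait for the page to finish loading

DISPLAY=:99 import -window root "$OUT"   # grab the whole virtual screen as an image

kill $BROWSER_PID $XVFB_PID              # shut down cleanly

Something like that could then run hourly from cron, e.g. 0 * * * * /usr/local/bin/snapshot.sh.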


I was thinking that there might be a way to convert the webpage to a PDF, then to an image.

I've been searching Google, but it's hard to find these tools (without paying for them :-) )
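
For the PDF route, one free chain would be wkhtmltopdf followed by ImageMagick's convert. A rough sketch, assuming both are installed (convert needs Ghostscript as its PDF delegate); the URL and filenames are placeholders:

# render the page to PDF, then rasterize the PDF to PNG
wkhtmltopdf "http://www.example.com/" /tmp/page.pdf
convert -density 96 /tmp/page.pdf /var/snapshots/page.png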


How about having some script that gets the file from the web, including all of the image locations, or the images themselves? (Perhaps a special program could read the URIs in the HTML file and fetch all the referenced images and CSS docs.)

I don't know BSD or the difficulties particular to this situation, though I imagine something like the following.

Say you are validating the page by taking an image of it manually, to make sure it hasn't been hacked.

Assuming the only way to hack the site would be by changing the HTML/CSS/images, there could be a program, maybe a browser extension: you go to the site, run the extension, and it checks the site's HTML against a locally stored version, and even against several previous versions if you prefer. It could download the current version and keep a series of thumbnails of the images stored locally, while grabbing the images from the site, giving a side-by-side comparison.

Once all that is done, it seems it could be a standalone tool that does this automatically, comparing the raw data in each fetched file to a local copy (a sketch of that idea follows below).

I'm sure there is already something like this out there; I'd want to figure out who'd use such a tool, like uber-professional web developers, or security people. For instance, what does the government do to monitor its whitehouse.gov site for unauthorized changes? Is some guy sitting there staring at the site all day, or is there some website security suite sold to big companies with in-house web design (not likely that even exists),

or to big web design firms, who are responsible for the security of their customers' sites, with varying levels of security?

Just an idea anyway.
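
Something along those lines can be scripted on the server itself without rendering anything. A minimal sketch in plain sh, assuming FreeBSD's fetch(1) and sha256(1) are available and that a known-good copy of the page has been saved as a baseline; the URL, paths and mail address are all placeholders:

#!/bin/sh
# compare the live page against a known-good baseline and complain if it changed
URL="http://www.example.com/"            # placeholder
LIVE="/tmp/index.live.html"
BASELINE="/var/baseline/index.html"      # saved known-good copy (placeholder)

fetch -q -o "$LIVE" "$URL" || exit 1     # grab the current page

# compare checksums instead of eyeballing the rendered page
if [ "$(sha256 -q "$LIVE")" != "$(sha256 -q "$BASELINE")" ]; then
    echo "$URL differs from the stored baseline" \
        | mail -s "possible defacement" webmaster@example.com
fi

The same check could be looped over the images and CSS files the page references, and the whole thing dropped into cron next to the screenshot job.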


Would HTTrack be of any help?

Here's some more info:

http://en.wikipedia.org/wiki/Httrack
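
HTTrack runs from the command line, so it could mirror the HTML, images and CSS into a local directory for later comparison. A rough sketch of the basic invocation, with the URL and output directory as placeholders:

# mirror the page plus its images and CSS into a local directory
httrack "http://www.example.com/" -O /var/mirror/example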

