I happened to read an article on how the earliest images of the Moon were better than people realised. That made me curious about the images, and I found that they are available on the NASA PDS Imaging Node. The images are in a raw 16-bit integer format and need to be converted into something a modern computer can display, like .png. I’m certain those images are already out there in a viewable format, but I saw it as an opportunity to practice. (Also, if you move a few directories up to the ‘EXTRAS’ folder, you’ll find reduced-resolution .png images there.)
For the practice exercise I decided that I wanted them in full quality on my homepage. A quick search led me to the perfect tools for the job: IIPImage server and IIPMooViewer.
First I needed to get the IIPImage server running. Luckily there is already a package for Debian-based systems.
sudo apt-get install iipimage-server
# This goes to /etc/nginx/sites-available/default (or wherever your nginx config file is)
location /fcgi-bin/iipsrv.fcgi {
    fastcgi_pass localhost:9000;
    fastcgi_param PATH_INFO $fastcgi_script_name;
    fastcgi_param REQUEST_METHOD $request_method;
    fastcgi_param QUERY_STRING $query_string;
    fastcgi_param CONTENT_TYPE $content_type;
    fastcgi_param CONTENT_LENGTH $content_length;
    fastcgi_param SERVER_PROTOCOL $server_protocol;
    fastcgi_param REQUEST_URI $request_uri;
    fastcgi_param HTTPS $https if_not_empty;
}
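After saving the config, reload nginx so the new location block takes effect:
sudo systemctl reload nginx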
Next, start the server itself. man iipsrv documents the available options; FILESYSTEM_PREFIX is a prefix that gets prepended to every image path the server is asked to open.
cd /usr/lib/iipimage-server
# replace with your own path
env FILESYSTEM_PREFIX=/path/to/images ./iipsrv.fcgi --bind localhost:9000
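To sanity-check the setup you can ask the server for a resized JPEG through nginx. FIF, WID and CVT are standard iipsrv request parameters; image.tif is just a placeholder for whatever test image you put under FILESYSTEM_PREFIX:
curl -o test.jpg "http://localhost/fcgi-bin/iipsrv.fcgi?FIF=/image.tif&WID=500&CVT=jpeg"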
Now that the server is running, we’re gonna need a client to view the images. For that I’m gonna use IIPMooViewer.
For a quick and dirty solution we can clone it to wherever our nginx webpages are stored, /var/www/html by default.
cd /var/www/html
git clone https://github.com/ruven/iipmooviewer.git iip
Inside the clone, index.html defines which image to load. Since iipsrv resolves the path on the server side (with FILESYSTEM_PREFIX prepended), it doesn’t need to be under the web root:
// The *full* image path on the server. This path does *not* need to be in the web
// server root directory. On Windows, use Unix style forward slash paths without
// the "c:" prefix
var image = '/image.tif';
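// Further down, index.html creates the viewer itself. The constructor takes the
// id of a target div and an options object; point 'server' at the fastcgi path
// configured in nginx earlier. (Shape as in the IIPMooViewer examples; exact
// option names may differ between versions.)
var server = '/fcgi-bin/iipsrv.fcgi';
iipmooviewer = new IIPMooViewer( "viewer", { image: image, server: server } );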
Finally I needed to get and convert all the images from the page I mentioned at the beginning. There is no “download all” button on the page, so we have to use something like wget.
This first step is probably unnecessary, but I figured it would reduce the number of pages wget has to crawl through. So I got a list of all the directories I wanted to download .IMG files from.
# download file -> grep FRAME_XXXX -> remove duplicates -> prefix with rest of URL and write to directories.txt
wget -q -O- https://pds-imaging.jpl.nasa.gov/data/lo/LO_1001/DATA/LO1/ | egrep -o "FRAME_[0-9]{4}" | sort -u | sed -r -e "s/(.*)/https:\/\/pds-imaging\.jpl\.nasa\.gov\/data\/lo\/LO_1001\/DATA\/LO1\/\1/g" > directories.txt
# feed each directory to wget: recursive, don't ascend to the parent, one level deep, .IMG files only
cat directories.txt | xargs -L1 wget -r -np -l 1 -A IMG
# convert.c can't be executed directly; compile it to convert.out,
# which is the name the generated script below expects
gcc -O2 convert.c -o convert.out
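I won’t paste convert.c in full here, but a minimal sketch of a converter with the same command line could look like the code below. It assumes the .IMG file is nothing but raw big-endian 16-bit grayscale samples at a fixed row width (the first argument, 16500 below) and writes a 24-bit grayscale BMP next to it. The real PDS files also carry label data, so take this as an illustration of the idea rather than the exact converter:

/* convert.c -- minimal sketch, not the original program.
 * Assumes argv[2] holds raw big-endian 16-bit grayscale samples with a
 * row width of argv[1] pixels; writes FRAME.IMG -> FRAME.bmp. */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <stdint.h>

static void put16(FILE *f, uint16_t v) { fputc(v & 0xff, f); fputc(v >> 8, f); }
static void put32(FILE *f, uint32_t v) { put16(f, v & 0xffff); put16(f, v >> 16); }

int main(int argc, char **argv) {
    if (argc != 3) { fprintf(stderr, "usage: %s width file.IMG\n", argv[0]); return 1; }
    long width = atol(argv[1]);
    if (width <= 0) { fprintf(stderr, "bad width\n"); return 1; }
    FILE *in = fopen(argv[2], "rb");
    if (!in) { perror(argv[2]); return 1; }

    fseek(in, 0, SEEK_END);                 /* number of complete rows in the file */
    long height = ftell(in) / (width * 2);
    fseek(in, 0, SEEK_SET);

    char out_name[4096];                    /* FRAME.IMG -> FRAME.bmp */
    snprintf(out_name, sizeof out_name, "%s", argv[2]);
    char *dot = strrchr(out_name, '.');
    if (dot) strcpy(dot, ".bmp");
    FILE *out = fopen(out_name, "wb");
    if (!out) { perror(out_name); return 1; }

    long row_bytes = width * 3, pad = (4 - row_bytes % 4) % 4;
    uint32_t image_size = (uint32_t)((row_bytes + pad) * height);

    /* 14-byte BMP file header + 40-byte BITMAPINFOHEADER */
    fputc('B', out); fputc('M', out);
    put32(out, 54 + image_size); put32(out, 0); put32(out, 54);
    put32(out, 40); put32(out, (uint32_t)width);
    put32(out, (uint32_t)-height);          /* negative height = top-down rows */
    put16(out, 1); put16(out, 24);          /* one plane, 24 bits per pixel */
    put32(out, 0); put32(out, image_size);
    put32(out, 2835); put32(out, 2835); put32(out, 0); put32(out, 0);

    unsigned char *row = malloc(width * 2);
    if (!row) { fclose(in); fclose(out); return 1; }
    for (long y = 0; y < height; y++) {
        if (fread(row, 2, width, in) != (size_t)width) break;
        for (long x = 0; x < width; x++) {
            unsigned char g = row[2 * x];   /* big-endian: high byte ~ 8-bit value */
            fputc(g, out); fputc(g, out); fputc(g, out);
        }
        for (long p = 0; p < pad; p++) fputc(0, out);
    }
    free(row); fclose(in); fclose(out);
    return 0;
}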
# for every .IMG: convert to .bmp -> vips into a tiled pyramid .tif -> delete the .bmp
find . -name '*.IMG' | sed -r -e "s/(.*)\.IMG/\.\/convert.out 16500 \1\.IMG\necho Creating \1\.tif\nvips im_vips2tiff \1\.bmp \1\.tif:jpeg:50,tile:256x256,pyramid\necho Deleting intermediate file \1\.bmp\nrm \1\.bmp/g" > commands.sh
chmod +x commands.sh
./commands.sh
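For each frame, commands.sh ends up containing a block like this (FRAME_1234 stands in for the real path):
./convert.out 16500 ./FRAME_1234/FRAME_1234.IMG
echo Creating ./FRAME_1234/FRAME_1234.tif
vips im_vips2tiff ./FRAME_1234/FRAME_1234.bmp ./FRAME_1234/FRAME_1234.tif:jpeg:50,tile:256x256,pyramid
echo Deleting intermediate file ./FRAME_1234/FRAME_1234.bmp
rm ./FRAME_1234/FRAME_1234.bmp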
That should convert all the .IMG files to .tif; then you need to move them into the directory where we had image.tif earlier. The .tif files should be about 20x smaller than the raw .IMG files. IIPMooViewer also ships an example gallery.html, which is what I ended up using. I also had to add “server”: “/fcgi-bin/iipsrv.fcgi” to the image properties (for every image) in gallery.html.
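Each entry in gallery.html’s image list then looks roughly like this (the exact structure depends on the example shipped with IIPMooViewer, and the file name is made up):
{ "image": "/FRAME_1234.tif", "server": "/fcgi-bin/iipsrv.fcgi" }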
Here it is: (broken because my VPS provider just can’t git their stuff together and I can’t be bothered to deal with it again)
Something I want to do in the future is to have a 3D view and map the images onto a sphere. All the images also have labels recording where the shots were taken. This could potentially be enough information to project the images onto a sphere. That would be much harder than what I’ve done so far, but it should be possible. I would definitely have to write my own IIPImage client, and I’m not even sure that would be enough.