Attilla Danko's
Boring Home Page

Just the Links

TOC:
Clear Sky Charts
Seeing Observations Database
How to buy a telescope
The Four Observers
Observing-Expletive Scale
(Not for minors)

Off Site:
BigDob
Ottawa Astronomy Friends

Clear Dark Sky / Clear Sky Charts Update

As some of you may know, Attilla Danko died on 2024-11-28.

Many of you are wondering what will happen to the Clear Sky Chart Forecasts, so I thought I'd share what I know.

Attilla wrote the code to do forecasts for himself, but then decided to share similar forecasts with anyone who wanted them. He made a hobby out of looking for whatever cloud forecasts he could find and trying to compare their accuracy. He'd find other useful links and add them to the forecasts - though sometimes only to the sponsored ones.

He organized a letter-writing campaign that helped Environment Canada decide to make the cloud forecasts a formal product. He could never have made these charts without Allan Rahill's invaluable contribution, for which we will forever be grateful.

He fully expected that within a few years of him starting his forecasts, someone else would do a better job and take over forecasting.

And yet... people kept wanting his forecasts.

But because he always expected someone else to put him out of business, he never productized his code.

When he first became ill, I asked him about the succession plan for cleardarksky and he told me it should die with him. I didn't agree. I asked him to consider letting someone else take over, but he wasn't really trying hard to make that happen.

It was only while he was in hospital in early October 2024 that he finally agreed to show me how to do the most basic things, like add a sponsorship or a new chart. Since then, I've been doing that maintenance. (I'm the "colleagues" he refers to below - we never did get anyone else on board.) We did agree that to reduce the workload, the site should be demonetized, which happened late November.

However, he was too ill to give me a full guided tour of the code. And looking at the folders, there are still personal financial files and emails intertwingled, so I can't just zip it and offer the code to other people. Because of this, please don't ask me how you can help: I haven't been able to think of anything appropriate that other people can do at this time, and it sucks to have to tell people this one by one.

I can - and will - continue to create new charts (but maybe not as fast as Attilla did). The generation code is automated, so as long as nothing goes wrong (or changes), forecasts will continue to be created. But I have a complex estate to settle now, and it will be quite some time before I have the time (and energy) to sort through 20+ years of unproductized code development (including stale files and other surprises) and learn enough Python to figure out how to package the code for handoff. Hopefully I'll be able to get to this before something breaks, but no guarantees.


Clear Dark Sky forecast design summary

Several people have come forward to express interest in helping take over Clear Sky Chart code long term. Some think that it's a simple matter of tapping into Environment Canada's data, so I thought I'd try to capture some of the design considerations in the current implementation of cleardarksky forecasts.
When Attilla started the forecasts, only images were available, so that's what his code uses. Since then, Environment Canada has implemented a datamart. Despite now being retired, Allan Rahill has kindly supplied the following information:

All images are available on this Environment Canada website: https://weather.gc.ca/astro/index_e.html

For the sustainability of astronomical weather forecasts, before [Allan's] retirement, 
all datasets were converted to grib data and are available in the datamart. 
Everyone is therefore free to produce images provided that the source (ECCC) is mentioned. 
Using GRIB data has advantages over images because they are available sooner after the model runs 
(4 times per day) and there are also online interpolation tools available. 
Here are the websites to extract the data:

https://dd.alpha.meteo.gc.ca/model_gem_regional/astronomy/grib2/
https://dd.weather.gc.ca/model_gem_regional/10km/grib2/

TCDC= Total cloud cover
TMP = temperature (2 meter)
SEEI = seeing
TRSP = transparency
WSPD = total wind speed (10 meter)
WDIR = Wind direction
RH = Relative Humidity (2 meter)

You will find information on all variables at these sites:
https://eccc-msc.github.io/open-data/msc-data/nwp_rdps/readme_astro-rdps-datamart-alpha_en/#liste-des-variables
https://eccc-msc.github.io/open-data/msc-data/nwp_rdps/readme_rdps-datamart_en/

You will find information on grib2 format, decoding, processing, interpolation and visualization on this site:

https://eccc-msc.github.io/open-data/msc-data/readme_grib_en/
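To make the datamart option concrete, here is a minimal sketch of building a download URL for one RDPS GRIB2 file. The base URL is from Allan's notes above, but the directory layout and filename pattern here are assumptions modelled on the documented RDPS naming convention - verify them against the datamart readmes before relying on them.

```python
from datetime import datetime

# Base URL from Allan's notes above; everything after it is an assumption.
BASE = "https://dd.weather.gc.ca/model_gem_regional/10km/grib2"

def grib_url(run: datetime, forecast_hour: int, variable: str = "TCDC") -> str:
    """Return a candidate URL for one GRIB2 file from the 10 km RDPS.

    `run` is the model run time (00, 06, 12 or 18 UTC, four runs per day).
    The filename pattern is illustrative only -- check it against the readme.
    """
    run_hour = f"{run:%H}"
    stamp = f"{run:%Y%m%d}{run_hour}"
    fname = f"CMC_reg_{variable}_SFC_0_ps10km_{stamp}_P{forecast_hour:03d}.grib2"
    return f"{BASE}/{run_hour}/{forecast_hour:03d}/{fname}"
```

Once downloaded, a file like this can be decoded with the tools described in the grib2 readme linked above.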

It seems very likely that clear sky charts should convert to the datamart, but this wasn't something Attilla had done.

Instead, his code monitored the Environment Canada website for when the new images would be available.
When they were, he'd download them all (just once per cycle, but hundreds of images) and generate the 6000-7000 forecasts from that data. He didn't want to overload the Environment Canada servers by requesting all the images for every chart, which is one of the reasons he never implemented a dynamic chart location feature. When only an incomplete set of data was available, the generating code would fall back to older forecasts. Some of the forecasts would get released in tranches. There was code to handle that, and to regenerate if a subsequent tranche showed up before the next planned generation cycle.

He'd also grab forecasts from the European model (ECMWF), which he would get from the Norwegian site, met.no. (Fun fact: the Norwegian word for "cloudy" is "skyet". Get Google to pronounce it for you. NSFW.) This is done on a chart-by-chart basis, so to keep the server load low, it was originally only done for sponsored charts.

He also obtained copies of the latest light pollution surveys.

Anything image based would, of course, need a transform to identify which pixel in the map image corresponded to the longitude and latitude of any given chart. Any time the map image format changed, he'd reverse engineer the new transform. And when maps had political boundaries drawn right on top of the best pixel to use (obscuring the forecast data for that site), he'd calculate the second-best pixel and use that instead.
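The shape of the problem looks roughly like this. The transform coefficients below are made up for illustration - the real ones had to be reverse engineered per map format, and the real maps use a polar-stereographic projection rather than this simple linear one:

```python
def lonlat_to_pixel(lon: float, lat: float,
                    x0: float = -140.0, y0: float = 60.0,
                    px_per_deg_x: float = 10.0, px_per_deg_y: float = 12.0):
    """Map longitude/latitude to (col, row) on a hypothetical forecast image.

    A real transform would match the map's actual projection; this linear
    version just shows where the reverse-engineered coefficients plug in.
    """
    col = round((lon - x0) * px_per_deg_x)
    row = round((y0 - lat) * px_per_deg_y)   # image rows grow downward
    return col, row

def best_usable_pixel(col: int, row: int, obscured: set):
    """If the ideal pixel is covered by a drawn boundary, try a neighbour."""
    if (col, row) not in obscured:
        return col, row
    for dc, dr in ((1, 0), (-1, 0), (0, 1), (0, -1)):   # second-best pixels
        if (col + dc, row + dr) not in obscured:
            return col + dc, row + dr
    return col, row   # no clean neighbour; use the original anyway
```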

His code would generate large format charts, and smaller versions suitable for embedding. With the right incantations to avoid caching, many clubs embed the forecast for their favourite observing location on their website. This is a popular feature that would be good to maintain in any future iteration.
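The "incantation" for embedding is the usual cache-busting trick: append a throwaway query parameter so browsers refetch the image instead of serving a stale cached copy. A sketch, with a placeholder URL rather than a real cleardarksky path:

```python
from time import time

def embed_url(chart_url: str) -> str:
    """Append a timestamp query parameter to defeat browser/proxy caching.

    The chart URL here is hypothetical; the point is only the ?t=... suffix.
    """
    return f"{chart_url}?t={int(time())}"
```

A club page would then use the returned URL in its img tag (regenerating it on each page load) so visitors always see the current forecast.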

Now it starts to get tricky: MUCH of the code complexity was there to create reliability. Any time the forecasts went down (which they did, like clockwork, every single time we took a trip, until he refused to travel anymore), he'd get lots of email from unhappy users.

So he implemented code (version controlled with SVN) to support multiple computers generating the charts in loadsharing and failover mode. And there are three internet service providers with three different last mile techs (fibre, copper phone line, and cable) to two physical locations for redundancy. A UPS in the house doesn't help enough when the power outage takes down your ISP too.

And there are currently two (previously up to three) webhosters, with a DNS server to do loadsharing or to manually disable a broken hoster. The contents of the sites are refreshed every time the forecasts are updated, with complex code to account for the fact that different hosters require different directory structures. And code to tar/extract files, because hosters have limits on how many files you can upload. And code - lots of it - to support sponsorships.

For the sponsored charts, he'd archive forecasts to allow a history.

He also calculated chart usage stats, which, along with sponsorship information, would determine the order in which he generated the charts. And... he used them to delete charts that weren't used enough.
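The scheduling idea can be sketched as follows. This is assumed logic, not the real code: sponsored charts first, then by usage, with rarely-used unsponsored charts dropped entirely:

```python
def schedule(charts: list, min_hits: int = 10) -> list:
    """Order charts for generation; prune unpopular unsponsored ones.

    `charts` is a list of (name, sponsored: bool, hits: int) tuples.
    The min_hits threshold is an invented placeholder value.
    """
    keep = [c for c in charts if c[1] or c[2] >= min_hits]
    # Sponsored first (not sponsored sorts after), then most-used first.
    return sorted(keep, key=lambda c: (not c[1], -c[2]))
```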

He also linked Near-Realtime Satellite Imagery, Sun & Moon Data, a Road Map, a Topo Map, Civil Weather, Satellite prediction, and a light pollution map. Each of those would change every few years, which meant that every year, something would need updating to accommodate whatever changed.

To help him assess the quality of forecasts, he also had code (not visible publicly) to grab copies of cloud covers reported in metars (oktas). He'd compare those to cloud model forecasts for the airport locations, to assess whether including the forecast was worth it. He used that to decide to include the European model, but not the NOAA cloud model.
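The verification idea reduces to something like this sketch (mine, not his): METARs report cloud amount in oktas (eighths of the sky covered), so convert those observations to fractions and compare each model's forecast cloud fraction against them, keeping the model with the lower error:

```python
def mean_abs_error(oktas: list, forecast_fractions: list) -> float:
    """Mean absolute error between observed and forecast cloud cover.

    `oktas`: observed METAR cloud amounts, 0-8 (eighths of sky covered).
    `forecast_fractions`: model cloud-cover values, 0.0-1.0, for the same
    airport locations and valid times.
    """
    pairs = list(zip(oktas, forecast_fractions))
    return sum(abs(o / 8.0 - f) for o, f in pairs) / len(pairs)
```

Comparing this score across models at airport sites is how one could justify including one cloud model and excluding another.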

I'm sure I've forgotten a number of the design details, but the above list should give you a good idea of what's involved.


-Ingrid
PS. Yes, I'm still adding new charts as requests come in.


I've left Attilla's original description about the charts here below.


Colleagues

Because of declining health, I recently asked some colleagues to help with the great many emails I get from chart users and sponsors. So someone other than Attilla Danko may reply. Also, although in the past I've been able to service chart and sponsor requests within a day (sometimes within an hour), that might be considerably extended. Your patience is appreciated.

Me

I'm a retired software weenie, that's the technical term, and an amateur astronomer.

I never had a use for a personal website until I heard about the computer language Python. I figured any language named after Monty Python's Flying Circus had to be cool. But to learn Python, I needed a problem to write code for. I found it tedious to add 5 in my head to convert UTC to EST when using the astronomy forecast maps at CMC, so I started writing code.

Next thing I know, I'm writing optical character recognition code, reverse-engineering map transforms, writing JavaScript and web databases, writing failover and load-sharing code for Windows, and generating Clear Sky Charts. Because of the very cool numerical model Allan Rahill (of CMC) wrote, the Clear Sky Charts turn out to be just about the most accurate forecasting device for astronomers. Then word got around.

I'm generating clear sky charts for more than 3000 observatories and observing sites in North America and having an absolute blast. I wish I knew how to turn clear sky charts into a livelihood so I could do it full time.

There are a few other things on this website that largely came about from the charts:

  • I wrote the Ottawa Astronomy Weather page to explain to my fellow observers in Ottawa why the local forecasts were so bad and where to get real astronomer's forecasts.

  • I wrote the Seeing Observations Database so Allan Rahill could get real data on astronomical seeing in order to tune his numerical seeing model.

  • I answer a lot of email. In a pitiful attempt to stem the flood, here are some of the questions I've answered:

And then there is just plain silliness:

  • Many fine nights observing with buddies, who were also Monty Python fans, caused me to write (in a moment of weakness) a translation of the classic Monty Python sketch, The Four Observers, into the language of amateur astronomers.

  • Many fine nights observing with buddies who like to express themselves led me to realize that one could rate astronomical views by listening to people swear at the eyepiece. In another moment of weakness (is there a pattern here?), I wrote the Expletive Scale (Not suitable for kids) of astronomical observations.

I've come to appreciate the social aspects of the astronomy hobby. So I deliberately created a few places where astronomers could yak: