History and Technical Information

WXSIM is so unique that a first-time user will likely not realize just what it is supposed to, and can, do. Perhaps the best way to convey what the program is about and put it into proper context is to give a (fairly!) brief history of how and why I've developed it. In addition, some users will want to know some of the technical details, so that the program won't seem quite so much of a 'black box'.


First of all, I've been a 'weather nut' since the age of twelve, along with a strong interest in astronomy that started much earlier. Since I like to apply numbers to anything I'm interested in, I quickly became fascinated with temperature forecasting. During a typical day at home, I would interrupt other activities every half hour or so just to check the temperature, and at times I even mounted thermometers at school so I wouldn't miss anything. From all this I developed a good bit of skill at forecasting temperatures under a wide variety of conditions.


By the early-mid 1980's, I had earned B.S. and M.S. degrees in physics and had the mathematical tools to approach the problem more objectively. My very first effort was to model a diurnal temperature curve assuming incoming solar (mainly visible) energy proportional to the sine of the sun's angle above the horizon and outgoing terrestrial (infrared) radiation proportional to the fourth power of the surface temperature (Stefan-Boltzmann law). The model was iterative, meaning that the state of the model was updated frequently based on the most recent state, to produce a type of integration with respect to time. A programmable calculator was used at first, with the location assumed to be the equator at an equinox, with the sun up 12 hours and passing straight overhead.


Two constants, or 'tuneable parameters', had to be introduced here, in order to tell the temperature how much of a 'kick' to experience from a certain duration (i.e. an hour) of incoming or outgoing radiation. These constants (call the one for incoming radiation 'A' and the outgoing one 'B') depend on a number of factors, such as the intensity (i.e. W/m^2) of the radiation, the specific heat capacity of the surface, and others. Making the ratio A/B large increased equilibrium temperatures. Making the product AxB large increased the diurnal range.
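The basic loop can be sketched in a few lines of code. This is only a minimal illustration of the kind of model described above, not WXSIM's actual code, and the constants A and B (along with everything else here) are invented for demonstration:

```python
import math

def simulate_day(A=3e-3, B=1.4e-13, T0=290.0, dt=60.0):
    """Iterate surface temperature (K) through one day at the equator at an
    equinox: solar input proportional to the sine of the sun's elevation,
    infrared loss proportional to T^4 (Stefan-Boltzmann). All constants are
    illustrative guesses, not WXSIM's tuned values."""
    T = T0
    temps = []
    for step in range(int(24 * 3600 / dt)):
        hours = step * dt / 3600.0                 # time since midnight
        hour_angle = math.radians((hours - 12.0) * 15.0)
        # at the equator at an equinox, sin(elevation) = cos(hour angle)
        solar = max(0.0, math.cos(hour_angle))     # zero at night
        T += (A * solar - B * T ** 4) * dt         # the 'kick' from A and B
        temps.append(T)
    return temps
```

Running this with the default constants produces a plausible clear-sky diurnal curve whose maximum falls well after noon.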


By adjusting A and B, I quickly got realistic temperatures and diurnal ranges (assuming clear skies for now), but the high temperatures were occurring too late in the afternoon. To get a realistic (i.e. 3 PM) time for the high, I had to 'crank up' the product AxB, which made the diurnal range too large.


I knew that other factors were at work, such as heat flow from beneath the surface. Since the rate of heat flow is proportional to the temperature difference between two bodies, I used a simple 'restoring force', pulling the temperature toward some pre-set value much like a spring. By adjusting the 'spring constant' (tightness) of this 'spring', I quickly got much more realistic curves.
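In code, the restoring force amounts to just one more term in the iteration - something like this hypothetical step function (the 'spring constant' k and the other numbers are again invented for illustration):

```python
def step_temperature(T, solar, dt, A=3e-3, B=1.4e-13, k=5e-5, T_normal=288.0):
    """One iteration of the surface temperature (K): radiative gain and loss
    as before, plus a linear 'spring' term pulling T toward a pre-set normal
    value, standing in for heat flow from beneath the surface. Constants are
    illustrative, not WXSIM's."""
    dT_dt = A * solar - B * T ** 4 - k * (T - T_normal)
    return T + dT_dt * dt
```

Since conductive heat flow is proportional to the temperature difference, the -k * (T - T_normal) term damps departures from normal; tightening k flattens the diurnal curve and pulls its peak earlier, which is the behavior described above.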


But what pre-set value was appropriate? At first I was modeling only the equator, so the daily mean, seasonal mean, and annual mean temperatures were all the same and quite appropriate. By this time, though, I'd generalized the code (now on an Apple II, I think it was) to calculate sun angles at any time of day, any day of the year, anywhere in the world. To include seasonal effects and model something corresponding roughly to sub-surface temperature at various depths (or perhaps upper air temperatures, as yet not explicitly included), I used four 'normal' temperatures: an annual normal, a seasonal normal, a time-lagged seasonal normal that I found necessary to keep seasonal effects in phase with reality, and one I called the 'air mass temperature' (which I later found corresponds quite closely to the average boundary layer temperature).


Next I wanted to include clouds, which radiate quite a bit of infrared and can block a lot of sunlight. I also needed a way to specify their temperatures. Besides, even clear air radiates quite a bit of IR, so I knew I'd need to include the atmosphere in general. At first I chose four layers, roughly equal by mass, but with the nearer-surface layers weighted a bit disproportionately as they have a more direct effect on the surface. I studied graphs of transmission of various wavelengths through the various gases in the atmosphere and came up with algorithms for computing the albedo and emissivity of atmospheres with various amounts of cloud cover and humidity.
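To give a flavor of what such algorithms can look like, here is a generic, Brunt-style estimate of effective sky emissivity - a textbook-type formula offered purely as an illustration, not the one actually used in WXSIM:

```python
def sky_emissivity(cloud_frac, vapor_pressure_hpa):
    """Effective sky emissivity: a clear-sky term that grows with humidity
    (a Brunt-type formula, vapor pressure in hPa) plus a cloud contribution
    that pushes the value toward 1 under overcast skies. Coefficients are
    generic textbook values, not WXSIM's tuned ones."""
    eps_clear = min(1.0, 0.52 + 0.065 * vapor_pressure_hpa ** 0.5)
    return eps_clear + cloud_frac * (1.0 - eps_clear)
```

More humidity and more cloud both mean more infrared shining back down on the surface, which is exactly why overcast nights cool so slowly.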


At this point I knew I could include radiative transfer among layers to iterate their temperatures, but since this would require much more computing time on a program that was beginning to get rather slow, I decided to leave these fixed at some seasonal normal values, determined from the various 'normals' mentioned earlier and a standard lapse rate.


Now the program was producing realistic temperature curves for seasonable weather under various sky conditions. I developed algorithms for computing the various normals so as to avoid having to enter them every time. I did realize, though, that the air mass temperature needed to vary with the air mass! The problem of how to determine an appropriate value was the next step.


I decided to let the program do a 'calibration run', starting from a normal temperature for the season at one of the two times of day when the temperature crosses the daily mean. I developed algorithms for these, and in my research found that they vary with wind speed and cloud cover (among other things), but typically occur around 10-10:30 AM and 8-9 PM. By letting the program run under the normal conditions until the intended start time, and then comparing the calibration run's temperature with the input initial conditions, a departure from normal was established, which in turn helped determine the air mass temperature. Another approach I used as a backup was an input for the high and low temperatures in the last 24 hours, assuming some degree of persistence in the air mass temperature.
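The essence of the idea reduces to a couple of lines. In this hypothetical sketch, the blend weight is invented for illustration only; WXSIM's actual handling is more involved:

```python
def estimate_airmass_temp(observed_T, calibration_T, normal_airmass_T, weight=0.7):
    """Shift the seasonal-normal air mass temperature by a weighted fraction
    of the departure between the observed start-time temperature and the value
    the 'calibration run' reached under normal conditions. The weight is a
    made-up stand-in for persistence, not a tuned WXSIM constant."""
    departure = observed_T - calibration_T
    return normal_airmass_T + weight * departure
```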


All the constants and algorithms discussed above have been part of the program since about 1983, and have been tuned and re-tuned countless times as I collected large amounts of data on temperature curves, climate and altitude effects, and much more. Meanwhile, I added more and more variables to the brew. A few of the more important examples will have to suffice here:


Wind speed:
Stronger winds mix the air better, causing both a sluggishness to respond to the radiative driving forces and a pull towards air mass temperature. In fact, I found I had to produce a careful balance of both effects to fit my collected data. In addition, temperature inversions can form at night with light winds. A rather involved routine is used to 'mix down' warm air from above at an appropriate time in the morning under such circumstances.


Dew point:
Formation of dew, fog, or frost releases latent heat that slows the cooling of the surface. The evaporation of this water in the morning temporarily slows the daily warm up. I've studied and tried quite hard, with rather good success, to model these effects.

Advection (of both temperature and dew point):
This is CRUCIAL for realistic temperature forecasting. I've developed a rather complex routine for this, using wind speed and upwind temperature and dew point profiles (derived by any of three different methods). The most advanced method involves the user inputting (or importing from a data file such as can be downloaded from the internet) actual data for upwind sites. Wind speed, sky condition, and even longitude (for time zone differences) are taken into account. Generally, straight-line air trajectories are assumed, which is usually reasonable for the short term forecasts that are WXSIM's specialty.
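A much-simplified version of the straight-line trajectory idea might look like this; the site format and the 30-degree sector are my own invented simplifications, not WXSIM's routine:

```python
def upwind_temperature(sites, wind_speed_kmh, wind_dir_deg, hours_ahead):
    """Pick the upwind observation the air should arrive from, assuming a
    straight-line trajectory. 'sites' is a list of (distance_km, bearing_deg,
    temp_c) tuples for surrounding stations; only those within 30 degrees of
    the upwind bearing are considered. Returns None if no site qualifies."""
    travel_km = wind_speed_kmh * hours_ahead
    candidates = [s for s in sites
                  if abs((s[1] - wind_dir_deg + 180.0) % 360.0 - 180.0) <= 30.0]
    if not candidates:
        return None
    # site whose distance best matches how far the air will travel
    return min(candidates, key=lambda s: abs(s[0] - travel_km))[2]
```

With a 25 km/h west wind and a 6-hour lead time, for instance, the air arriving then should be roughly what a station 150 km to the west is reporting now.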


Diurnal variations in temperature-related variables:
These include wind and low-level cloud cover. Wind speeds are usually stronger in the daytime, as mixing lets the surface 'feel' more of the usually-stronger winds higher up. At night, under temperature inversions, the wind may go calm even while significant winds blow aloft. WXSIM models these effects empirically, and even has an option to model sea breezes for coastal locations. Cumulus clouds may develop with daytime heating as air parcels rise. Low stratus can form in some conditions overnight, and then 'burn off' the next morning. Options to model diurnal variations in these types of clouds are available as well.
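As a toy example of such an empirical relationship (with all fractions invented for illustration, not taken from WXSIM):

```python
import math

def surface_wind(wind_aloft, solar_elev_deg, night_floor=0.15, day_frac=0.6):
    """Toy diurnal mixing factor: by day, a larger fraction of the wind aloft
    is 'felt' at the surface as mixing deepens with the sun's elevation; under
    a nocturnal inversion only a small floor remains. All fractions are
    invented for illustration."""
    if solar_elev_deg <= 0.0:
        return night_floor * wind_aloft
    mix = night_floor + (day_frac - night_floor) * math.sin(math.radians(solar_elev_deg))
    return mix * wind_aloft
```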


There are more such variables, but now back to the main story line!


Most of the initial work on everything discussed above was done on a Commodore 64, bless its slow soul! I finally got access to faster PC's in the early 90's and converted the code over to QBASIC. This afforded an opportunity for general improvements.


One such area (actually already well under way in the C64 version) was the inclusion of 'interrupt codes' - ways of injecting new information into the program as it runs. This feature is extremely important to operational use of the program, but may be overlooked by first-time users because it is so unusual. To put this feature in context, it helps to understand more about the operational nature of WXSIM:


WXSIM is mainly a local model and is therefore generally ignorant of large-scale weather patterns (the advection routine being a partial exception). For this reason it can't forecast 'weather' in the broader sense of the word. While this may be a disappointment to someone unrealistically expecting to generate complete weather forecasts based on local information only, the interrupt codes make it possible to take full advantage of WXSIM's specialties (temperature, dew point, and various related phenomena) anyway.


Consider the following example. It's a clear evening with light winds, but the official forecast, the big computer models, and satellite pictures all suggest increasing clouds after midnight and then rain by morning. What you can do here is start WXSIM with the actual initial conditions and SET IT TO RUN SLOWLY (unless you have great reaction time!). When the model time reaches the point when you think (see, YOU get to be part of the forecast) the clouds should increase, a couple of keystrokes or mouse clicks will increase the clouds. The previously falling temperature 'inside' the program will perhaps level off or even rise as infrared radiation shines down from the clouds. When you think it's time, another keystroke or click starts precip of the desired intensity. Quickly WXSIM responds with falling temperature and rising dew point until, if you make it rain enough, the air becomes saturated and the temperature and dew point largely stabilize. WXSIM also makes the decision as to whether any precipitation you inject will be rain, sleet, snow, etc.


There are over 50 such interrupts; you can even model the effects of solar eclipses! Understanding this feature is the key to making WXSIM the 'interactive' operational model that it is. It is, in a sense, a 'partner' in your forecast.


By the time I had all of the above in the program, the importance of modeling upper air temperatures was becoming more obvious. I changed to a five layer model, making these new levels correspond rather closely to 5 of the 'mandatory' levels used in RAOB soundings and upper air plots. I analyzed hundreds of soundings (mainly for my area but many others around the world as well) to improve my algorithms for relating upper air temperatures (including their somewhat small diurnal variation) to surface temperature. In the process, I developed code for calculating heights of various pressure levels, thicknesses, and related parameters.
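The heights-and-thicknesses code mentioned here rests on the standard hypsometric equation, which in Python form looks like this (a textbook relation, given for background, not WXSIM's actual listing):

```python
import math

RD = 287.05    # specific gas constant for dry air, J/(kg K)
G0 = 9.80665   # standard gravity, m/s^2

def thickness_m(p_bottom_hpa, p_top_hpa, mean_virtual_temp_k):
    """Geopotential thickness (m) between two pressure levels, given the
    layer-mean virtual temperature (K): dz = (Rd * Tv / g) * ln(p1 / p2)."""
    return RD * mean_virtual_temp_k / G0 * math.log(p_bottom_hpa / p_top_hpa)
```

For example, a 1000-500 hPa layer with a mean virtual temperature of 273.15 K works out to a thickness of roughly 5540 m; thickness is in effect a thermometer for the whole layer.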


About the same time, I got Internet access and FINALLY had regular access to real-time upper air data. I made these temperatures and dew points part of the data input, though keeping the option to use the algorithms I'd spent so much time developing. In fact, you can input these initial conditions and then the older algorithms work with them to produce forecast values of upper air temperatures, which are in turn used in calculating new surface conditions, which then effect small changes in the upper air conditions, etc., in a finely tuned balance.


I actually 'marketed' this DOS based version on a small scale (several hundred downloads of the shareware version and about 15 full registrations). While I got less feedback than I would have liked, what I did get was quite positive. A few of these users have been very helpful in providing feedback that has helped the program grow. I was especially flattered that a couple of them were professional forecasters who found WXSIM to be very useful, especially for short-term temperature forecasting.


The DOS version was rapidly becoming 'archaic' as more and more users expected the convenience of Windows-based software. I decided in mid-1996 to make the change over to the Windows environment, via Visual Basic. This change afforded yet another opportunity for growth, reflected in the software you can download from this web page. Among MANY other things, you can now import the initialization data directly (at least if your site is 'on file' and is an official reporting station). You can also import RAOB data to aid in initializing upper air data and can even import and use NGM and Eta model data (in the form of FOUS block data) to not only aid in initializing the program but also, optionally, to influence certain aspects of it during the run. You can also import a variety of model data from NOAA's READY site, for use in the program, via the 'Interrupt Planner', a graphical interface that can display imported model data, allow manual modification of this data, or let you enter your own program interrupts in advance.

In 2006, I finally produced what many users had long been requesting - a hands-off, automatic way to run the program. This has proven very useful to many people, especially owners of home weather stations who upload data and forecasts to their web pages. I use this mode myself for most forecasts now, but still believe manual interaction with the program produces the best results. I think there will always be a role for each of these approaches.


The above is only a brief overview of what the program is and how it has come to be. I've collected and analyzed a great deal of data over the years and have run the program thousands of times, developing scores or hundreds of routines for dealing with new problems as they came up. The many constants in the program have been adjusted and re-adjusted frequently in light of new information. I'd be glad to answer questions if anyone would like to know more.