Tuesday, November 23, 2021

Update on Technical Details

Three months ago, during an extended break in the monsoon, I wrote a post here, Technical Details As Acknowledgements. Much of that post was about the process of installing free software, which I've been using for a variety of things, including producing custom weather maps on my MacBook. One motivation for providing information about the process was that lessons learned might be helpful for anyone interested in doing something similar with various Python packages. Another goal was to acknowledge the effort that people put into making these software packages work and remain freely available.

Most of the details from three months ago remain relevant. A few things have changed, and I'll incorporate those changes into a description of the process starting from scratch on a new MacBook. I had been limping along on a MacBook Air that I bought almost eight years ago, cheaply at the time as a refurbished unit. But recently I succumbed to temptation and bought a new one, which is equipped with a processor chip in the Apple M1 family, known as Apple Silicon, but also as ARM to distinguish from Intel.

I had been reading online discussions about M1, and based on advice in those discussions I expected, for now, to be able to get only so far with installing things natively on the new MacBook; for some things I would need to keep using my old MacBook. Some online posts advised running Intel versions of software through Apple's Rosetta 2 translation layer on the M1 machines. Others said the only hope for running things natively was to use one of the Conda packages. Those recommendations may have been true a few months ago, but they no longer apply. Everything that I had been using is now running natively on my new M1 MacBook, with no need for a middle-man software manager like miniconda. (Homebrew is of course a middle-man, but it is much closer to do-it-yourself.) It's time to shut down the eight-year-old MacBook for good.

I have my Apple ID registered as a developer, which doesn't cost anything by itself. That allows me to download the latest versions of Xcode and the Command Line Tools from http://developer.apple.com/downloads/more. Xcode downloads as an .xip file, and the CLT as a .dmg file. To start the install process for either one, double-click. It's my understanding that only the CLT are needed by Homebrew. But I always download the latest version of Xcode as well, because I need it for a few small stand-alone programs. I always do a separate install of the CLT after installing Xcode, and that seems to ensure that the Homebrew formulas find the CLT in the expected place.

Once again here is the Homebrew installation instruction web page. After completing those instructions on the M1 Macbook, I did brew install gcc and then brew install python@3.9. If either of those failed, there would be no point in trying to go farther. But everything installed natively and automatically with no problems, including several dependencies for each. I then did brew install geos and brew install proj. Those two libraries are needed by the cartopy python package. Three months ago cartopy required an older version of proj, but now cartopy uses the current version. There are a couple of dependencies automatically installed with proj. Everything continued to install natively and smoothly.
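For anyone following along, the Homebrew sequence described above can be collected into a few copy-pasteable commands (run them in this order, since the later libraries are wanted before installing cartopy):

```shell
# Compilers and Python first; if either of these fails there is
# no point in going farther.
brew install gcc
brew install python@3.9

# Libraries needed by the cartopy python package.
brew install geos
brew install proj
```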

A few python packages can be installed either with a Homebrew formula or with the ordinary Python package manager. That was the case several years ago for numpy and scipy, and for awhile I kept them updated with their Homebrew formulas. Then it seemed that these formulas were unsupported, and the recommendation was to install/update numpy and scipy as ordinary Python packages. So I had been doing that recently. But in the online discussions about M1 there were reports of problems with installing these packages, and the recommendation was to use the Homebrew formulas, which have been updated recently. So brew install numpy and then brew install scipy. Again there are several dependencies, and again everything installed natively and smoothly.

Just a few more manual brew installs: hdf5, netcdf, eccodes, and pkg-config. Then it's on to the python package manager, already installed as pip3 by the Homebrew python@3.9 formula. A deprecation warning is printed with each package installed; it appears that Homebrew will have to change something in the future, but the warning can be ignored for now. I started with pip3 install matplotlib, which automatically installs a number of required packages. Most if not all of these appear to install as native, pre-compiled binary wheels. Then on to
pip3 install shapely --no-binary shapely
pip3 install pyshp
pip3 install pyproj

These three are from the cartopy installation page. Then pip3 install cartopy.

One of the python packages that can be installed with a Homebrew formula is ipython. But the latest version of python, 3.10, was released just last month. The Homebrew formula for ipython is already set to require that recently released version of python as a dependency, while many other packages still depend on python 3.9. So it's easier to just use the alternative, pip3 install ipython. That also installs a number of dependencies, and they all go into Homebrew's 3.9 site packages folder.

The pandas package installed with no problems with pip3, though it took a long time to compile. Then pip3 install cfgrib, pip3 install xarray and pip3 install MetPy, and back in business plotting grib files on my new Macbook.

Saturday, September 4, 2021

A Tale of Two Noras

During the late afternoon hours of this past Tuesday, the precipitable water values around Tucson, and in all directions away from Tucson, were about as high as they ever get. For good reason the National Weather Service had a flash flood watch in effect for Tucson. Below is what "about as high as precipitable water values ever get around Tucson" looks like.

I use a color range maxed out at 1.7 inches because at Tucson's elevation it doesn't get much higher, and attention is usually on the transition from slightly below to slightly above one inch. Obviously on Tuesday in order to get to the one inch neighborhood you had to go a long distance, either to the highest elevations in the sectors north through southeast of Tucson, or to the moderately high elevations of the Baja peninsula.

By Tuesday Nora's circulation had completely dissipated at all levels, leaving a broad and deep southerly flow at low to mid levels that was continuing to transport moisture northward. The mid level southerly flow veered to a 70+ knot southwesterly upper level jet extending across northern Baja just south of the California border. I picked up about half an inch of rain during the evening hours of Tuesday tapering off into the first few hours after midnight on Wednesday. Most of the eastern part of Tucson, east of about Swan Road, picked up about an inch. By sunrise on Wednesday the threat was past Tucson. The sky was clearing from the southwest as precipitable water levels were already dropping. It was obvious from the large scale radar composite that the upward motion associated with the upper level jet had shifted north and east of Tucson. The radio station that I listen to in the morning continued to report the forecast of a 40-50% chance of rain, but the on-air personality simply ignored the fact that officially the flash flood watch was still in effect for Tucson. The local paper took the opposite tack. The following morning, 24 hours later, the Thursday online edition still featured a story, last updated 21 hours earlier, detailing the flash flood preparations, without noting that the watch had finally been cancelled 18 hours earlier.

One week ago, when the final details of Nora itself were still uncertain but the threat of flooding for the Southwest was already being publicized, the local paper recalled its snarky coverage of the 1997 edition of Nora, which produced only a few drops of rain in Tucson instead of the 6, 4, 2 inches that had been forecast. The 2021 paper claimed, "It's possible the same thing could happen with Nora 2.0."

Here is what the precipitable water looked like on the afternoon of September 25, 1997, the day before the snarky 1997 newspaper article.

By that afternoon the low level circulation of 1997's Nora had had a complicated interaction with the Baja peninsula as it moved rapidly north toward the north end of the Gulf of California. Thick high level outflow debris clouds had overspread Tucson, darkening the sky dramatically compared to the morning sunshine. The wind had become a bit gusty from the south-southeast. I recall walking across the U of A campus with a colleague, a hydrologist, and he mentioned the anticipated (by anyone following the official NWS forecast) heavy rain. I said, "I wouldn't be surprised if we don't get anything." My colleague looked around at the overcast sky and at the effects of the wind gusts and sniffed, "Well, obviously the hurricane is coming." The problem is that a hurricane and its environment can't be conceived of as a system analogous to a baseball traveling through the air. If the hurricane and its far-flung moisture field were like a solid body, then the precipitable water would have increased dramatically at Tucson as the center of the hurricane raced north. In fact precipitable water did increase during that day over the western half of Pima County, but it remained almost constant around Tucson from morning to evening, consistent with what the operational numerical models of the day had been predicting. Even scientists, who should know better, will stick with a conceptual model that is wrong long past the point when reality has given them an answer they didn't want to hear.

Access to the reanalysis dataset through NCAR is gratefully acknowledged:
National Centers for Environmental Prediction/National Weather Service/NOAA/U.S. Department of Commerce. 2005, updated monthly. NCEP North American Regional Reanalysis (NARR). Research Data Archive at the National Center for Atmospheric Research, Computational and Information Systems Laboratory. https://rda.ucar.edu/datasets/ds608.0/. Accessed 28 Aug 2021.

September 17: Corrected paragraph before second figure, September vs. November.

Friday, August 27, 2021

Technical Details As Acknowledgements

The gauge on my patio has collected over twelve inches of rain since the last week of June. So this week's extended break from the monsoon was welcome, and it provided an opportunity to document some topics neglected in the previous two posts. Also, suggestions and lessons learned might be helpful for anyone wanting to create their own custom weather maps. The first lesson learned is that the link I provided in the previous post quickly turned bad. Here are replacement links to brief summary descriptions about the GFS, which includes the GDAS. Those summaries are on the NOMADS page, which provides public access to gridded data from a variety of National Weather Service models, as well as links to data from other modeling centers. Separately the National Weather Service provides a Model Analysis and Guidance page, where the gridded data have already been processed into standard weather map images of many varieties. If those standard maps are adequate, it might not be necessary for a user to deal with the gridded data.

My custom journeys through the model data are spread out across the screen of my MacBook in the form of maps generated through the python packages matplotlib and Cartopy. Other python packages are involved, which I'll discuss later. I like to keep track of what I'm installing and why. So my starting point is the Homebrew package manager. The starting point for Homebrew is its installation instruction web page. After Homebrew is installed, installation of a particular piece of software under Homebrew is handled by a Formula. Among the Homebrew formulae I've installed and kept updated for a long time are python itself (the formula includes installation of pip) and ipython. Some python packages require access to a library, which can be installed with Homebrew. A tricky detail is that Cartopy requires an older version of the proj library, which happily Homebrew provides as formula proj@7. A recent discovery for me is the formula eccodes, which installs the ECMWF package for decoding GRIB files. This library is needed for a python package to be discussed later. Independent of its companion python package, eccodes installs a set of command line tools. One of these tools is grib_ls, which comes in handy for figuring out what is actually in the file that was downloaded from the NOMADS site.

I keep the standard python packages (cartopy, pandas, numpy, matplotlib, scipy, etc.) updated manually with pip. There is one more tricky thing with Cartopy. A required package, Shapely, needs to be installed using the pip option --no-binary. For easy reading of GRIB files, the python packages cfgrib (interfaces with the eccodes library) and xarray are needed. I came to cfgrib through xarray, and to xarray through the python package MetPy. While intending to eventually use all of MetPy's computational features, for now I'm especially taking advantage of its addition to the map features available in Cartopy (i.e., from metpy.plots import USCOUNTIES).

Here's the easy reading part, illustrated by a line from my python script:
ds = xr.open_dataset(myFilename, engine="cfgrib", filter_by_keys={'typeOfLevel': levelType[levelTypeIndex], 'level':level})
where xarray was imported as xr, and myFilename was set to one of the files I downloaded from NOMADS. The optional argument filter_by_keys gets passed to cfgrib. This filtering is generally necessary, and will be discussed later.

Jumping to the end of the process for now, assume that a downloaded GRIB file, or at least a subset of it, has been successfully ingested by xarray. Then it's only a matter of adapting from the many MetPy examples available. My personal preference for just viewing the fields is to rely on two matplotlib functions: pcolormesh for the scalars and streamplot for the wind components. For some reason pcolormesh understands the lon/lat/variable inputs when they are supplied as xarray DataArray structures, but streamplot requires extracting the .values property in order to supply ordinary numpy arrays.
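Here is a minimal sketch of that pcolormesh/streamplot pattern. A small made-up grid stands in for a dataset read from a real GRIB file, so the names t, u and v (and their values) are illustrative, not from any particular download:

```python
# Sketch: view a scalar field with pcolormesh and winds with streamplot.
# The DataArrays below stand in for fields read from a GRIB file.
import matplotlib
matplotlib.use("Agg")  # render off-screen
import matplotlib.pyplot as plt
import numpy as np
import xarray as xr

lat = np.linspace(30.0, 36.0, 25)
lon = np.linspace(-114.0, -108.0, 25)
coords = {"latitude": lat, "longitude": lon}
dims = ("latitude", "longitude")
t = xr.DataArray(280.0 + np.random.rand(25, 25), coords=coords, dims=dims)
u = xr.full_like(t, 5.0)   # made-up westerly wind component
v = xr.full_like(t, 2.0)   # made-up southerly wind component

fig, ax = plt.subplots()
# pcolormesh understands the DataArray inputs directly...
mesh = ax.pcolormesh(t.longitude, t.latitude, t, shading="auto")
# ...but streamplot wants ordinary numpy arrays, hence .values
ax.streamplot(u.longitude.values, u.latitude.values, u.values, v.values)
```

With a real file the DataArray construction above would be replaced by pulling variables out of the dataset returned by xr.open_dataset.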

So let's back up in the process to discuss downloading. As mentioned near the beginning of this post, the NOMADS page provides access to a wide variety of model gridded fields. I've been routinely looking at the GDAS analysis and the GFS forecast fields on their 0.25 degree grid. Those are just 2 of the nearly 100 datasets listed on the NOMADS page. Most of the datasets are equipped with the grib filter option. Clicking on the link in the previous sentence takes you to the NOMADS page of general instructions. As the instruction page explains, grib filter is best used for creating regional subsets. Following the instructions and finally reaching the "Extract" page, I initially skip past the Extract Levels and Variables section and, so that I don't forget about it, scroll down to the Extract Subregion section. It's not enough to just fill in the top/bottom/left/right lat/lon; also be sure to check the box after the words "make subregion." Returning to the variables section, if I want just the variables corresponding to standard radiosonde observations I'll select the abbreviated names HGT, TMP, RH, UGRD and VGRD. Depending on the dataset you choose, there might be many more variables. You may need to view Parameter Table Version 2 to figure out the abbreviations. For the levels desired it might be safest to just select "all". Instead I select each level that I want. Some of the level options are actually levelTypes, as can be verified later by running grib_ls on the downloaded file. The variable PWAT needs the level (levelType) "entire atmosphere (considered as a single layer)," while REFC needs "entire atmosphere."

Here is the promised later discussion about the filtering that is generally necessary in connection with xarray's opening of a grib file to create a dataset. For an explanation of why filtering is needed, see the section Devil In The Details in this early documentation for cfgrib. Basically, each message (i.e., each 2-D grid) in a file downloaded from one of the NOMADS "Extract" pages will be a unique combination of the three keys typeOfLevel/level/parameter. But xarray tries to merge all the levels and all the parameters in the file into a level/parameter array. In order to keep xarray happy, it's often enough to restrict xarray's attention to one typeOfLevel. But it may be necessary to also restrict attention to one level, as in my line of code above. An example is when TMP and RH are at level 2 m above ground but UGRD and VGRD are at level 10 m. Xarray, with cfgrib's help, tries to create a table with 2 levels and 4 parameters, but is disappointed to find that half of the table's entries would be empty. Working out what is in the GRIB file, and thus what filtering is needed, is where the command line tool grib_ls comes in handy.
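An invocation of the kind that reveals those key combinations might look like the following; the -p option selects which keys to print, and the filename here is hypothetical:

```shell
# Print the typeOfLevel, level, and parameter short name of every
# message in a downloaded file (the filename is a stand-in).
grib_ls -p typeOfLevel,level,shortName mySubset.grib2
```

Scanning that listing for combinations that would collide in a single level/parameter table tells you what to put in filter_by_keys.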

Once into a routine of what I want to download, I follow the suggestion at the bottom of the grib filter instruction page, in the section "Scripting file retrievals." However, instead of a script I just enter the commands interactively in the terminal window. At first I was always copying from my text editor (BBEdit) and pasting into the terminal window, with the first paste being the multiple lines where I change to my local directory and set the date and time parameters, and the second paste being the curl line. But then I learned to just use the terminal window: ctrl-r to search the history file for text in the last use of the multi-line command, move with the arrow keys and make minor edits, ctrl-o to execute that multi-line command and bring up the next history line, which is the curl line ready to substitute the new ${hr}, ${fhr} and ${date}. Hit return. It takes a little over 5 seconds to do the typing, and another 5 seconds for my 12 by 12 degree lat/lon regional subset, a 630 Kbyte file of 182 grib messages, to download.
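As a rough sketch of that two-paste workflow: the first block sets the parameters, the second is the curl line. The directory and parameter values are illustrative, and FILTER_URL is a placeholder; the actual URL, with all its variable/level/subregion query parameters, is whatever the NOMADS grib filter page produces for your selections.

```shell
# First paste: local directory and date/time parameters (values illustrative).
cd ~/gribdata
date=20211123   # YYYYMMDD
hr=12           # model cycle hour
fhr=006         # forecast hour

# Second paste: the retrieval itself. FILTER_URL stands for the address
# the grib filter page generated, which embeds ${date}, ${hr} and ${fhr}.
curl -o gdas.${date}.t${hr}z.f${fhr}.grib2 "FILTER_URL"
```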

Monday, August 9, 2021

Mesoscale Analysis

Forty-some years ago analyzing an upper-air feature like this would have been more about art than data. Fortunately the experienced NCO forecasters who I worked with back then were incredibly perceptive artists. The main source of good data at that time was the FPS-77 radar at Davis-Monthan. The scant information on the upper air charts had to be finely analyzed to make sense of the situation. Now we have the Global Data Assimilation System, output available at a horizontal resolution of 0.25 degree (pixels roughly 25 km on a side), vertical resolution 50 mb or better, every six hours. Tucson is roughly near the center of the images below. The five 500-mbar images span the 24-hour period between midday yesterday and midday today. I haven't yet figured out how to make a loop. But anyway I think the evolution of this wave is best enjoyed one frame at a time.

A wind speed max at midday yesterday over extreme western Chihuahua was by early this morning anchoring the eastern side of a distinct trough, which was centered on an axis extending southwest-northeast through Arizona's Santa Cruz and Cochise Counties. Moderate precipitation was roughly aligned with the wind speed maxes, wrapped around the trough. Probably the evolution of the precipitation, and maybe of the trough itself, was tied to the availability of moisture. The images below show the precipitable water for midday yesterday and midday today. What was striking already yesterday was how the highest values of precipitable water, pegged out on the color bar at >1.7 inches, had pressed into the foothills of the mountains in east-central Sonora. Of significance for the coming days is that southwest New Mexico and northern Chihuahua have moistened since yesterday, even as central Chihuahua has dried.

Friday, June 11, 2021

Interesting Time

While attention in Tucson is on record high temperatures over the next several days, and an eventual increase in moisture, the screaming message from analyses and model forecasts is that there will be a weak or nonexistent mid-level capping inversion by the time the moisture arrives. The 500 mbar temperature this afternoon was -5 deg C. It will probably creep up to -3 C or so at times over the next two or three days. But by Tuesday afternoon the GFS run from 18Z has 500 mbar temperatures around -8 C in a pocket around Tucson. There is also at the same time a bit of an easterly wave racing along the southern extremity of the 500-mbar high. I like to loop the model soundings for Tucson at the NCEP models site. The GFS would have a SW-NE oriented line of storms moving through the Tucson region Tuesday mid afternoon, with measurable rain to the southwest, and significant cooling in Tucson by 5 pm. Things may not work out that way, but it is one of many interesting possibilities.

Sunday, March 21, 2021

Winter 2020-2021 Tucson Rain

A few days ago in an Associated Press story about the Climate Prediction Center's Drought Outlook for the Spring season, the story's lede was that the official forecast offers little hope for relief in the West. The story goes on to explain that the drought in the Southwest has developed from a combination of La Niña dry weather [this winter] coming after, in the words of the source NOAA press release, the failed 2020 summer monsoon.

As far as Tucson is concerned, April through June never offers hope for drought relief. These upcoming months are normally the driest three-month period of the year in Tucson, averaging less than three-quarters of an inch of rain total for the three months combined. The official seasonal outlook puts Tucson in equal chances for below normal, near normal or above normal precipitation for this April through June.

There are still a couple of slight chances for measurable precipitation over the remaining ten days of this month. But they won't make much difference for this November through March period, where the Tucson airport has had 1.42 inches, about 2 inches below average. It could have been worse, considering that it was a moderate La Niña (ONI -1.1 for January 2021). This was the worst winter for precipitation in Tucson since 2010-2011, which was also a moderate La Niña.

At my place I've done a bit better than the airport, 2.59 inches for November through March, but still a big drop-off from the previous two winters. The grades of the summers following these winters may be just coincidences, but if last summer's monsoon deserved an F, the previous summer (2019, at least until after Labor Day) was a D. There's no reason to think that this coming summer won't be at least near normal, at least a B or a C, which offers a little hope, though still three months away--this summer's monsoon can't be any worse than last summer's.