Welcome to GOES-R 101. This presentation is designed to give a brief overview of what is coming for GOES-R. It is being offered by the GOES-R Program in cooperation with the Satellite Hydrology and Meteorology Forecasters Course. When we talk about GOES-R we are talking about the series. At this point the series consists of GOES R and S. GOES T and U are possible. This presentation does not cover all the aspects of GOES-R, but we hope to catch your interest, give you an idea of what to expect, and show you where to look for more information. Many people have asked "Why are we starting to get ready for GOES-R now when the satellite will not be launched for a number of years?" For a number of reasons - the first one being experience, particularly with the GOES 8 and later satellites. We don't want to start training after the satellite has been launched, and we want our display systems capable of handling the new imagery and products. The second reason is that we can increase our awareness of GOES-R capabilities now by looking at other imagery and products. So if you are not aware of these products, we can point you in the right direction.
Center image: Lightning from a storm system extending from Argentina to southern Brazil on the evening of April 23, 2003. Photograph from the International Space Station, NASA Image Exchange, image number ISS006-E-48196.
During this presentation we'll address "Why?" and "When?" We'll look at what sensors and communication capabilities are planned, what major changes are expected, and how we can evaluate them now. We'll present many examples and finally give information links. A greater breakdown of sensors and communication capabilities will be shown shortly.
Basically, to maintain and improve our geostationary imaging capabilities. This starts with replacing the current GOES series. NOAA is already well into the next planning stage to maintain continuity of the GOES mission. Over the last 10-15 years we've seen a dramatic increase in many aspects of technology. It should not be that much of a surprise, then, that we will see a significant increase in the spatial, spectral, and temporal resolution of products.
This slide has recently been updated (December 1, 2009) and was even signed by Abby Harper, the Systems Deputy Assistant Administrator of NESDIS. The earliest launch of GOES-R will be in 2015 - late 2015. To put this in perspective with the rest of the chart, dark blue lines indicate when the satellite is operational or expected to be operational. Before launch, the satellite is named with a letter. After launch, when the satellite successfully reaches orbit, it is given a number. Within 6 months of launch and before the satellite goes into storage, there is a checkout period (this is not indicated on this graph). GOES-10, which was imaging over South America, was decommissioned on December 1 of this year, 2009. It essentially ran out of fuel. GOES-11 and GOES-12 are the current west and east satellites. GOES-13 is an on-orbit spare. GOES-O was launched on June 27, 2009, and when it reached its final orbit on July 7, it was renamed GOES-14. It is in post-launch checkout, and the GOES-14 Science Test is occurring right now during the first two weeks of December 2009. Notice the * next to GOES-12: backup and South American coverage beginning 2010. Around the April 2010 timeframe, GOES-13 will replace GOES-12, and GOES-12 will be moved to 60 W longitude to provide South American coverage and backup. (GOES-O/14 information: http://rammb.cira.colostate.edu/projects/goes-o/ )
How will we get ready? Through informational presentations like this one, through COMET modules, and through direct dialogue via the Proving Ground concept. The Proving Ground encourages two-way dialogue between research and operations - research entities such as those at CIRA here, at CIMSS, and at SPoRT interacting with those at NWS offices and National Centers. The Proving Ground concept is not new. It will help to infuse GOES-R-like products into NWS operations via AWIPS. The idea is that the user will give feedback on what they like, what they don't like, and what type of improvements are desirable. You can follow the activities of the Proving Ground by checking the link listed here.
Another component related to the proving ground is the development of Proxies. Proxies can be created in several ways: From existing satellite or surface based assets or they can be modeled. Note that this list is not all-inclusive.
If we have high resolution data such as on MODIS, or these lightning sensors, we can degrade the data to get a feel for what we will see on GOES-R and develop procedures to interpret and utilize the resulting data and imagery.
Simulated and Synthetic imagery - where are they coming from? - turn to next slide
Frame 1. The AWG was established to oversee the cal/val activities for GOES-R proxy data and the product algorithms. Some of the proxy data sets mentioned on the previous slide are coming out of the Algorithm Working Groups.
Frame 2. On this next slide you will see a list of the various expertise groups that are doing algorithm development. I do not have a web page to point you for information on the activities of the AWGs, but if you check out the proving ground web page and activities, you will find AWG products.
Frame 1. Now we'll move into what you really want to hear about - which sensors and what capabilities will be on GOES-R. There are four areas: communication capabilities, Solar and Space Environment, the Geostationary Lightning Mapper, and the Advanced Baseline Imager. One item that is missing from this list is the Hyperspectral Environmental Suite, or HES. This is actually a hyperspectral sounder. Due to budget issues, the HES was taken off the GOES-R platform in 2006. The current plans exclude the HES from GOES R and S. I've heard that it has not been completely excluded from consideration for inclusion on the later satellite(s) in the series. You can obtain more information on the benefits of HES in the COMET module listed under the links at the end.
Frame 2. First we'll focus on Communication capabilities. GOES provides more services than just those associated with the imager,
sounder and solar data - which on GOES-R will be imager, lightning and solar data. The direct readout services will include combined
datastream of HRIT and EMWIN, which are the High Rate Information Transmission and the Emergency Managers Weather Information Network.
The HRIT/EMWIN information stream will consist of selected imagery, weather charts, and environmental data products as well as text
messages of NWS watches and warnings.
DCS stands for Data Collection System. Many of you benefit from weather data collected on a remote platform whose signal is
transferred through the GOES satellite. I actually accessed remote weather station data from the DCS platform back in the early 80s
before I viewed my first satellite image.
GRB - stands for GOES Rebroadcast. This is the GOES-R version of today's GVAR or GOES Variable format which (today) has calibrated
radiances and navigation information. Many universities and research organizations as well as the backyard enthusiast have direct
readout capabilities that will utilize the GRB services.
If you live along the coasts, you may be more aware of search and rescue satellite aided tracking or SARSAT capabilities. All GOES
satellites support this service. The satellite relays distress signals emitted from Emergency Position Indicating Radio Beacons
(EPIRBs) and other transmitting devices.
Frame 1. Second on the list and the first sensor suite we will talk about is the Solar and Space Environment Sensors
Frame 2. Why are we interested in Solar and Space Weather? The main reason being that the sun drives our climate. It can and does
affect many day to day satellite operations as well as other aspects of technology that we rely on. I won't go through all the details
of the new sensors and improved capabilities on GOES. Steven Hill, who is a contributor to this GOES-R module, gave a 15-minute presentation
on Solar and Space Environment sensors at the 2008 AMS meeting in New Orleans. At the end of this module there is a link to that
presentation if you want to hear a little more from the expert.
To put into perspective why these new capabilities will be important, on the next slides, we'll touch on how solar and space weather
affects imagery we view, navigation capabilities, and aviation operations which include health and communication safety.
Frame 3. This is a list of the Solar and Space Environment Sensors - the Solar Ultraviolet Imager, the Extreme Ultraviolet /X-ray Irradiance Sensor, the Space Environmental In Situ Suite, and the Magnetometer. In the future and before GOES-R is launched, we hope that both shorter and longer modules will be available to fill you in on advances in solar sensors and give you enough information to appreciate what they bring. When the systems are functioning properly, and significant events are detected, we hope that plans will be in place to mitigate harmful and disruptive impacts so that we won’t really hear about them.
This is a shortened example of the Martin Luther King Storm from the Solar X-Ray Imager that took place on January 20th of 2005. This example can be found on the Solar X-Ray Imager greatest hits web page located on the National Geophysical Data Center Web Site. "The movie perfectly illustrates the cause-and-effect relationship between intense solar activity and satellite disruptions. The flares on January 17th and 20th are closely followed by noise in the SXI telescope resulting from energetic ions penetrating SXI. Ions with sufficient energy and atomic number can penetrate satellite components and deposit charge along their path. Sufficient charge deposition can introduce erroneous information into solid state memory devices. The common term for this is Single Event Upset (SEU)"
I have seen ‘Single Event Upset’ mentioned in reports out of NESDIS SSD and wondered what it meant and what impact it had on the images. Now I know what it is referring to.
GPS is used for navigation and some of you may also realize that GPS can be used to get total column moisture. As these headlines show, GPS can be significantly impacted by solar flares.
12. Space Environment In Situ Suite (SEISS)
A quick overview of the Space Environment In Situ Suite and what it detects:
The elevated radiation detected by this sensor benefits many user communities - the airline industry, the satellite industry and manned space flight operations.
Airlines have become interested in polar flight routes; this interest picked up in the late 1990s. I listened to a talk by Mike Stills of United Airlines at the AMS conference in January 2008. One of their challenges during winter in the northern Pacific is strong westerly winds that delay or cancel flights. The solution that came up was routing flights over the poles. It cut down flight time, in some cases by a couple of hours. This is not the chart he used, but you can see from this chart that the number of flights over the poles has increased since 2000. One of the FAA guidelines is that the flight has to maintain communications with ground points along its path. Inside the 82-degree circle, the flights rely on HF communications. When there is high geomagnetic activity, it adversely affects HF communication. When there are heavy-ion radiation events, there is a health concern for pilots and passengers. The new and improved solar and space weather sensors on GOES-R will help detect events that will adversely affect communications or pose health risks and allow the airlines to plan alternatives.
14. Sensor: Geo Lightning Mapper
Frame 1. Next on our agenda is a brief overview of the Geostationary Lightning Mapper, or GLM for short. This is an exciting new sensor, in particular because this is the first time we will have a lightning sensor in geostationary orbit. Steve Goodman, who is now the GOES-R program senior scientist, lobbied for many years to get a lightning sensor on GOES. We have Steve and colleagues to thank for this accomplishment.
Frame 2. This graphic depicts an estimate of the annual climatological lightning density in the GLM viewing areas. Values have been estimated from the Lightning Imaging Sensor (LIS) on TRMM and the Optical Transient Detector (OTD); both are on polar-orbiting satellites. The information is impressive. It will be interesting to compare this with diurnal lightning when it becomes available on GOES-R.
Frame 3. GLM detects total lightning, which includes in-cloud, cloud-to-cloud, and cloud-to-ground strikes. It complements today's land-based and research-based systems. It will provide increased coverage over oceans and over dead zones on land. Steve has mentioned that one of the primary justifications for the GLM was that it would be beneficial for aviation convective weather hazards - in particular, for identifying oceanic regions of convective turbulence.
Frame 4. As we saw from the graphic, it will have full disk coverage. GLM sensing will extend vertically from the surface to cloud top. At nadir, the spatial resolution will be 8 km, with approximately 12 km at the edge. It is specified to have 5 km mapping accuracy with high real-time accuracy and less than 1 minute of data latency.
15. Mapping storm initiation/growth/decay
Lightning data is beneficial in mapping storm initiation, growth, and decay. The loop on this slide depicts the integration of observations from different platforms. (Slow down loop to point out items) TRMM has provided the opportunity to pair lightning data with radar reflectivity, passive microwave, and infrared imagery and compare with 1-minute total lightning activity. This example is from the early May 1999 tornado outbreak in central Oklahoma. The big red data blob in the center represents the area in which an F3 tornado occurred. The GLM sensor on GOES-R will have a 2 ms frame sampling capacity, which equates to 500 frames per second. This can be a lot of information to digest. One of the approaches to using the data is to grid it up and cluster it as shown here. When I loop this again, you'll get an idea of the type of individual bits of information being sampled. Here I'm showing a small subset of the 1 minute of total lightning activity. Research has shown that trending of changes helps identify severe storms - for instance, (point to) total lightning increases as the storm intensifies. This information can be used to increase the lead time on warnings for severe and tornadic storms and ...
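For those who like to see the mechanics, the grid-and-trend approach mentioned above can be sketched in a few lines of Python. Everything here is illustrative: the event list, the grid spacing, and the lightning-jump threshold are made up and are not the actual GLM ground-system algorithm.

```python
import numpy as np

# Hypothetical GLM-style optical events: (time_s, lon_deg, lat_deg).
events = np.array([
    [0.002, -97.51, 35.22],
    [0.004, -97.49, 35.23],
    [0.004, -97.48, 35.21],
    [0.006, -97.50, 35.22],
])

# Grid the events onto an ~8 km (~0.072 degree) lat/lon grid, mimicking
# the GLM nadir footprint -- a crude flat-earth approximation.
cell = 0.072
lon0, lat0 = -98.0, 35.0
cols = ((events[:, 1] - lon0) / cell).astype(int)
rows = ((events[:, 2] - lat0) / cell).astype(int)

density = np.zeros((8, 16))
np.add.at(density, (rows, cols), 1)   # accumulate event counts per cell

# Trending: a rapid increase in gridded counts from one minute to the
# next is the kind of "lightning jump" tied to storm intensification.
prev_minute, this_minute = 120, 310
lightning_jump = this_minute > 2 * prev_minute
print(density.max(), lightning_jump)
```

The gridding collapses the flood of individual optical events into something a forecaster can trend over time.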
16. Total Lightning Impacts Decision Making
.. impacts decision making. Steve Goodman and colleagues have been involved in proving ground-like activities for a number of years that have involved the University of Alabama in Huntsville, National Weather Service offices, and the Short Term Prediction Research and Transition Center, or SPoRT for short. In offices where they have been able to receive total lightning and evaluate its impact, they have found that it has helped in issuing correct severe warning decisions as well as the opposite - holding off on issuing severe warning decisions.
Notice that lightning data has even been included in a WES case for office training.
A lot of training has come out of SPoRT, so check out their web page for more information.
Besides severe storm research, another active area of lightning research is directed towards tropical storms and hurricanes. How does lightning activity vary as the storms develop and decay? This example is from Hurricane Katrina in August 2005. The image on the left is from the Los Alamos Sferics Array, which sees cloud-to-ground lightning in the rainbands and in the eyewall. On the right, we see LIS data from 4 different days showing lightning principally in the rainbands and occasionally in the eyewall. Later in this module, we'll see another tropical application of MSG imagery combined with global lightning network data.
Lightning on a geostationary platform IS NEW and it will provide total lightning day and night with greater spatial coverage. Look for training materials coming out of SPoRT to learn more about what GLM will offer.
19. Sensor: Advanced Baseline Imager (ABI)
Frame 1. Let's reorient in the presentation: we're on the last sensor, the Advanced Baseline Imager, or ABI for short.
Frame 2. The ABI will result in increased resolution in many areas, including temporal, spatial, spectral, and radiometric. We'll only address the first 3 here. The ABI will also provide better navigation.
Frame 3. Let’s look at what we can expect with increased temporal resolution.
Frame 4. This is a good visual depiction of imaging capability. In the future, the ABI will be capable of imaging the full disk in 5 minutes. It takes about 25 minutes (26:06) for the current GOES to do this. The ABI will be 5x faster. There will likely be various scanning strategies as there are now.
Frame 5. Two such proposed schedules are: continuous full disk scanning every 5 minutes, or a 15-minute cycle that will collect 1 full disk, 3 CONUS images, and a mesoscale sector every 30 seconds. We'll demonstrate this 15-minute cycle over the next couple of slides.
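To get a feel for what that 15-minute cycle delivers, we can count frames from the stated cadences. This is just arithmetic on the numbers above, sketched in Python:

```python
# Proposed flex mode: within each 15-minute cycle, the ABI collects
# 1 full disk, 3 CONUS scans, and a mesoscale frame every 30 seconds.
cycle_s = 15 * 60

frames_per_cycle = {
    "full_disk": 1,
    "conus": 3,
    "mesoscale": cycle_s // 30,   # one mesoscale frame each 30 s
}

# Full-disk refresh compared with the current GOES imager (~25 minutes):
speedup = (25 * 60) / (5 * 60)    # roughly the "5x faster" quoted earlier
print(frames_per_cycle, speedup)
```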
Data and imagery that you will see on the next slide were provided by the CIMSS AWG proxy team. They simulated imagery for ABI for the 7.3 um water vapor channel and a 15 minute cycle is shown.
21. Loop:Concept of flex mode scanning
This loop illustrates the concept of flex mode scanning and is for a 15 minute period. In it you will see 1 full disk, 3 CONUS sectors, and 15 mesoscale sectors. Again this is simulated imagery for the 7.3 um water vapor channel that will be on the ABI.
22. Reduced data outage during eclipse
I threw this slide in to remind you that starting with GOES-13 and continuing on future satellites, which will include GOES-R, we will see more imagery through eclipse periods. Recall that from the satellite perspective, the earth eclipses the sun, which results in no solar panel power. Larger batteries will allow the sensors to scan through this period. On either side of the eclipse periods, there will be stray solar light coming into the instrument. The NESDIS Office of Satellite Operations has determined that this stray light will not affect the health and safety of the instruments; however, there is a negative effect on the data collected. They have developed a method to take partial imager frames to scan away from the sun intrusion to retain as much good data as possible. During the GOES-14 checkout, they ran tests on the procedure, and we await the results.
The ABI navigation will be better than current GOES east and west and we also see this benefit starting on GOES-13 and 14. Hopefully GOES-13 will replace the GOES 12 satellite in the Spring of 2010.
24. Ex.: ABI Improved Navigation
This animation shows the same scene as viewed from GOES-12 and GOES-13, which has improved image navigation and registration (INR). You see less land movement and this will directly benefit for instance fire location. This example was featured on Scott Bachmeier’s GOES blog: http://cimss.ssec.wisc.edu/goes/blog/2007/08/03/wildfire-in-the-upper-peninsula-of-michigan/ He has other examples on his blog associated with the GOES-14 science test.
25. ABI: Increased spatial and spectral resolution
Frame 1. Now we'll focus on the increased spatial and spectral resolution that the ABI will offer. Frame 2. Major changes are on the horizon. Spectrally, there will be 11 more bands, or channels, than are on the current GOES. We'll be seeing an increase in spatial resolution by a factor of 4. I'll address these changes in two groups, the first group being in the visible and near-IR, and the second group covering the shortwave and longwave infrared and water vapor imagery.
ABI Visible and Near-infrared IR bands: Information for this table was gleaned from Schmit, T. J., M. M. Gunshor, W. P. Menzel, J. J. Gurka, J. Li, and A. S. Bachmeier, 2005: Introducing the next-generation Advanced Baseline Imager on GOES-R. Bull. Amer. Meteor. Soc., 86, 1079-1096.
ABI will have 16 channels. This table shows the first 6. From our current geostationary perspective, 5 of these channels will be new. As we get more channels, it becomes a little more difficult to summarize in a table such as this the sample or primary use, because there are many uses depending on your perspective. We can say that all these channels will view clouds, the exception being that channel 4 at 1.3 µm will not generally see lower cloud or surface features because of water vapor absorption in the atmosphere in this region. The 2nd channel at 0.64 micrometers is very close to what is currently on GOES and is a heritage channel. It is closely associated with the red portion of the visible region. (I say this because the visible spectrum is continuous and therefore there are no clear boundaries between one color and the next. The ranges for various colors are an approximation and sometimes will be different depending on the source of information.) Band 1, with a central wavelength of 0.47 um, corresponds with the blue portion of the visible spectrum. The other channels are beyond the visible part of the spectrum that our eyes can detect. ABI will not have a green band, so we cannot create a true color image (the red/green/blue). Experimental work by Steve Miller at CIRA indicates that we can simulate the green band from correlation between the red, blue, and 0.865 um channels, and we'll see an example of this later on. The channels in this table have very similar counterparts on MODIS, an instrument on polar-orbiting satellites, and most of the channels depicted here have counterparts on the SEVIRI instrument on MSG (Meteosat Second Generation), the European geostationary meteorological satellite. This allows us to see and experiment with what our European counterparts have discovered and also take advantage of what MODIS has available at higher spatial resolution, but lower temporal resolution.
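As a toy illustration of the green-band simulation idea, here is a simple weighted blend in Python. The weights are placeholders for demonstration only; they are not Steve Miller's actual regression coefficients:

```python
import numpy as np

def pseudo_green(red, blue, nir_086, w=(0.45, 0.45, 0.10)):
    """Blend red (0.64 um), blue (0.47 um), and near-IR (0.865 um)
    reflectances into a synthetic green band. Weights are illustrative."""
    wr, wb, wn = w
    return wr * red + wb * blue + wn * nir_086

# Tiny example scene: one vegetated pixel, one bare-soil pixel.
red  = np.array([0.06, 0.25])
blue = np.array([0.04, 0.15])
nir  = np.array([0.45, 0.30])   # vegetation is bright at 0.865 um
green = pseudo_green(red, blue, nir)
print(green)
```

The near-IR term is what lets vegetated pixels pick up a plausible "green" signal even though no green channel exists.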
If you have not seen Scott Bachmeier's VISIT session on MODIS products in AWIPS, I recommend you check it out. The Europeans now use many color composites of the various channels for interpretation, and we'll see a few here. One other item they have commented on is the high resolution visible imagery at 1 km (they went from 3 km to 1 km at nadir). Notice that we will have increased resolution for the GOES heritage channel at 0.64 um (from the current 1 km to half a km). This is one channel that we will not be able to get real-time geo examples of, but from what I see in half-km MODIS imagery, I look forward to half-kilometer geostationary imagery. We'll first view a graphic that shows the various channel locations from different platforms, then we'll see a graphic showing reflectance properties for snow, ocean surface, and water and ice cloud. Following that, we'll see an example from MSG and another one from MODIS.
27. Graphic: Visible/Near-IR channels for various sensors
Frame 1. This is a graph of atmospheric transmittances from a radiative transfer model showing a mid-latitude standard atmosphere with no clouds. The region extends from the ultraviolet into the near infrared and encompasses the visible portion of the spectrum. For reference: if the atmosphere were completely transparent, and we were looking down from an orbiting satellite, we would be able to see the surface features. The transmittance would be represented by a straight line across the top at the value of one. But as we can see, this line is very variable from 0.3 to 2.5 micrometers.
Frame 2. There are gases and aerosols that absorb radiation and if we are in regions between 1.3 and 1.4 or 1.8 and 1.9 for example, water vapor absorption prevents us from seeing surface features. We can also see other smaller regions where there is absorption by other gases.
Frame 3. Current GOES has one channel in this region. This is a general representation of the width of the channel. In reality, the response over this width is variable with a higher signal response near the center and less near the ends. A general representation is being shown here.
Frame 4. GOES-R will have 6 channels in this region. The Visible (red) heritage channel is narrower than the current GOES series.
Frame 5. MSG has 4 channels we can use to get an idea of what we will be able to see with GOES-R. Their high resolution visible covers a broad band. The 0.6, 0.8, and 1.6 provide a good sampling/representation of capabilities we’ll see on GOES-R particularly as it relates to vegetation health, increased capabilities in distinguishing between water and ice particles and particle size. Recall: MSG gives us increased spectral resolution and increased temporal resolution.
Frame 6. MODIS has similar channels for all the GOES-R bands so we can look more closely at which channels are better for particular applications. With MODIS we have increased spectral and spatial resolution, but we lack the geostationary temporal resolution.
28. Graphic: VIS/Near-IR Reflectance for ocean/ice cloud/water cloud
Show Frame 1 and 2. How do we distinguish between ice cloud, water cloud, and ocean surface using channels in the VIS/Near-IR? We can take advantage of their reflectance properties. Low reflectance features will appear dark on a visible image and high reflectance characteristics will appear bright. Values plotted here were derived from directly sampling features on MODIS imagery. So for instance, the low reflectance on water cloud in the 1.3 µm is more the result of water vapor absorption in the atmosphere that prevents the signal of the low level water cloud from reaching the satellite. In GENERAL, ice and water cloud are highly reflective in the visible wavelengths and decrease in reflectance as one moves to the near IR, with water being more reflective than ice cloud. I stress in GENERAL. When ice particles become very small (<10 µm) they become more reflective than water cloud. The small ice particle size has been linked to strong updrafts and severe storms, but it also happens for non severe storms. This is an ongoing area of research at CIRA and other locations. Let me call to your attention as well that the ocean surface has a very low reflectance across this region.
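The kind of reasoning on this slide can be caricatured as a simple decision rule. The reflectance thresholds below are invented for illustration and are far simpler than any operational cloud-phase algorithm:

```python
def classify_pixel(r_vis, r_16):
    """Toy phase discrimination from 0.6 um and 1.6 um reflectance.
    Thresholds are illustrative only."""
    if r_vis < 0.10:
        return "ocean"          # dark across the whole region
    if r_16 > 0.30:
        return "water cloud"    # stays reflective into the near-IR
    return "ice cloud"          # bright in the visible, dark at 1.6 um

print(classify_pixel(0.05, 0.02))  # ocean
print(classify_pixel(0.70, 0.45))  # water cloud
print(classify_pixel(0.65, 0.15))  # ice cloud
```

Real algorithms add many more channels and caveats (the small-ice-particle case above being one), but the contrast in near-IR reflectance is the core of the trick.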
29. MSG RGB (1.6/0.8/0.6 um) snow/cloud street/cirrus/dust/ocean
(Note the link to turn on the overlay to see the placement of the channels on the wavelength spectrum). This image example is from MSG and was put together by trainers at the Hungarian Meteorological Service. This is an RGB image combination with the 1.6 um channel on red, the 0.8 um channel on green and the 0.6 um channel on the blue. As we saw from the previous graph, the ice cloud is highly reflective in the 0.6 and 0.8 um region and less reflective in the 1.6 µm so it appears cyan. Water clouds on the other hand are highly reflective across all the bands used and so appear white in this MSG example. Sea surfaces are dark (low reflection across all bands used.) After this example we'll look at the characteristics of snow in terms of reflectance and we'll see why this mimics the ice cloud and appears cyan. We'll also see why vegetated surfaces appear green and areas that are brown represent either bare ground or senesced vegetation. http://oiswww.eumetsat.org/WEBOPS/iotm/iotm/20070224_streets/20070224_streets.html
30. MSG RGB (1.6/0.8/0.6 um) Loop 24 Feb. 2007
Animation is great. We're seeing half-hour data here. The Europeans call this their natural RGB - likely because some features, such as green vegetation, low-level water cloud which is white, and dark water surfaces, look natural. I don't know about you, but cyan snow and ice clouds do not look natural to me. Animation helps us tell the difference, as does what I'll call textural appearance. Over time, your brain automatically detects the differences in the features. This loop shows another feature. Since these channels directly sense solar energy, they show the effects of varying solar illumination throughout the day - brightening up through the mid portion of the day and then darkening again.
Time for a quiz question: What is this feature in the lower right portion of the imagery? While you think about the answer, it is a good opportunity to say that COMET will be coming out with a module in early 2010 that explains how to create RGB composites. The answer to the quiz question is blowing dust.
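While we wait for that COMET module, the basic mechanics of building an RGB like this one are easy to sketch in Python. The channel arrays and scaling range here are made up for illustration:

```python
import numpy as np

def make_rgb(ch_16, ch_08, ch_06, lo=0.0, hi=1.0):
    """Stack 1.6/0.8/0.6 um reflectances into an RGB array,
    clipping each channel into [0, 1] for display."""
    def scale(band):
        return np.clip((band - lo) / (hi - lo), 0.0, 1.0)
    return np.dstack([scale(ch_16), scale(ch_08), scale(ch_06)])

# One water-cloud pixel (bright in all three bands -> white) and one
# ice-cloud pixel (dark at 1.6 um -> cyan), as a 1x2 "image".
r16 = np.array([[0.55, 0.10]])
r08 = np.array([[0.70, 0.65]])
r06 = np.array([[0.75, 0.70]])
rgb = make_rgb(r16, r08, r06)
print(rgb.shape)
```

Operational recipes add channel-specific scaling ranges and gamma corrections, but the assignment of channels to the red, green, and blue guns is all an RGB composite really is.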
31. Graphic: VIS and Near-IR: Reflectance for snow/grass/soil
How do we distinguish snow, vegetation (here grass) and bare ground (here sandy loam from Texas)? By their reflectance characteristics in different channels. For example, if you focus on grass, there is a dramatic change in reflectance from low at 0.4 and 0.6 to high at 0.8 µm. In contrast to this, snow has a higher reflectance for the 3 lower channels (0.4, 0.6, and 0.8 µm) but drops off for the 1.6 and 2.1 µm channels. This bare sandy loam soil is in between with a lower reflectance at the shorter visible wavelengths and slowly increasing to a mid reflectance in the near IR. Notice that I didn't mention the 1.3 um channel. Recall from a previous graph that low level moisture absorption generally prevents the satellite from sensing surface or low level cloud features.
Dept. of Earth and Planetary Science, John Hopkins University, IR Spectroscopy Lab (Directional (10 Degree) Hemispherical Reflectance Coarse snow (modeled and measured), reddish-brown fine sandy loam soil from Lubbock Co., Texas (measured)
USGS Digital Spectral Library splib06a Clark and others 2007, USGS, Data Series 231. lawn grass
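The drop-off in snow reflectance from the visible to 1.6 µm is the basis of the widely used Normalized Difference Snow Index (NDSI). A minimal sketch, with sample reflectances loosely matching the curves above:

```python
def ndsi(r_vis, r_16):
    """Normalized Difference Snow Index: large and positive for snow,
    which is bright in the visible but dark at 1.6 um."""
    return (r_vis - r_16) / (r_vis + r_16)

print(ndsi(0.85, 0.10))   # snow-like: strongly positive
print(ndsi(0.08, 0.25))   # grass-like: negative (dark visible, brighter near-IR)
```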
32. VIS/Near-IR MODIS Ex: snow /grass /soil /water
(Note the link to turn on the overlay to see the placement of the channels on the wavelength spectrum.) With this MODIS example, we'll first look at individual channels and point out obvious differences between snow, grass, soil, and water features across the spectrum, and then we'll look at a few common RGB combinations. With the recorded version, you can't stop and go back and forth on the images. If you want to examine the images in a little more detail, put on the overlay and take it off; either look at the browser version with notes or download the visitview session and look at the visitlocal version.
Single channels:
Frame 1. We are looking at imagery in southern California and Nevada. This is the 0.4 um or blue channel. Most of the white is snow; these are (mostly) snow-covered mountains with a dendritic pattern. There are some low-level and high-level clouds around the edges of the image which are hard to discern right now. Here we see a region that is grey in appearance. It is a combination of snow on trees and other vegetation. Here it is difficult to tell the difference between the ocean and the land surface color.
Frame 2. At 0.6 µm (or the red visible channel) we see the ocean features darken, and some of the land surface features darken. This is primarily because we have increased transmittance - the scene looks less hazy for low level features. The snow surface remains bright white.
Frame 3. When moving up to 0.8 µm, we see a dramatic brightening across much of the image. This is due to increased reflectance for vegetated surface features. Recall the reflectance curve for grass from a previous graph. Not all vegetation types look like grass. Ocean surfaces have remained dark. Snow and clouds are still a bright white. Some of them are hard to discern from the bright background surfaces.
Frame 4. At 1.3 µm, almost everything goes dark. Why? This is a region of low-level water vapor absorption. We are seeing reflection from snow, particularly in high-elevation areas, because the atmosphere above is dry. Note that we do not pick up the lower-level cloud at the south end of the San Joaquin Valley.
Frame 5. Looking at 1.6 µm, we see that the snow is darkening in other words becoming less reflective, and many vegetated regions are also darkening (from 0.8 µm).
Frame 6. At 2.25 µm, we see more darkening (from 1.6 µm) in the snow and vegetation.
RGB color combinations:
Frame 7. This next image is an RGB (2.2, 0.8, 0.6 µm). It is similar to the MSG combination we already saw; the difference is that the 2.1 µm channel replaces the 1.6 µm channel, which in many instances gives the color image more contrast. This combination is used on the MODIS Rapid Response web page. Bright cyan (a roughly equal mix of blue and green, with no red) is snow; darker cyan is vegetation (likely evergreen trees) with snow on or around it. The different shades of green represent different types of vegetation. Tan, brown, and reddish brown indicate ground with sparse vegetation. Navy blue is the ocean. Off-white is cloud.
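The basic mechanics of building an RGB like this one can be sketched in a few lines of Python. This is a minimal, generic compositing sketch, not the MODIS Rapid Response processing chain; the array names and the 0-1 reflectance scaling are assumptions for illustration:

```python
import numpy as np

def make_rgb(band_r, band_g, band_b, lo=0.0, hi=1.0):
    """Stack three reflectance arrays into an (rows, cols, 3) RGB image.

    Each band is clipped to [lo, hi] and rescaled to 0-1, so all three
    color components share the same display range.
    """
    def rescale(band):
        return np.clip((band - lo) / (hi - lo), 0.0, 1.0)
    return np.dstack([rescale(band_r), rescale(band_g), rescale(band_b)])

# Hypothetical reflectance arrays standing in for the 2.2, 0.8,
# and 0.6 um channels of one small scene
rng = np.random.default_rng(1)
refl_22, refl_08, refl_06 = rng.random((3, 4, 4))
rgb = make_rgb(refl_22, refl_08, refl_06)
print(rgb.shape)  # (4, 4, 3)
```

In a composite like this, a snow pixel (dark at 2.2 µm, bright at 0.8 and 0.6 µm) ends up with low red and high green/blue values, which is exactly the bright cyan described above.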
Frame 8. RGB (0.4, 1.6, 2.2 µm). This is a product that is also created on the MODIS Rapid Response web page and is also being put out onto AWIPS by SPoRT (the Short-term Prediction Research and Transition Center). It is used to highlight differences between snow, vegetation, and cloud. The Great Falls, MT WFO collaborated with SPoRT and used this false color product to monitor snowfall and river ice. They submitted a preprint to the AMS 2009 meeting, so if you would like to know more, you can start with that reference and/or contact them directly.
Frame 9. This last RGB combination mixes the visible, near-IR, and IR. Snow is magenta color, evergreens with snow have a purple hue and other vegetated and ground surfaces have a green/blue green appearance. Clouds are white, and the ocean is a deep blue.
Which RGB is the best, or will be the standard? I don't know. That's why we have the Proving Ground, with research and operational folks looking at the pros and cons of the various combinations and helping to point us in the right direction for usage.
33. ABI IR Bands (3.9 to 13.3 um) table
This is the second set of ABI bands; here we'll look at the infrared region. Again, the information for this table was gleaned from a paper by Tim Schmit and colleagues, and the reference appears in the student guide. For many of these channels, there is heritage from either the GOES imager or sounder. The new channels are at 8.5 and 10.3 µm. The exciting aspect of the 8.5 µm channel is the detection of SO2 and sulfates; it also helps us get a better handle on the detection of ash/dust. Our European colleagues tell us that this channel is good for distinguishing fog, particularly under cold winter conditions when the 3.9 µm temperatures are noisy. Because it is used in place of the 3.9 µm channel, the product they create doesn't have that dramatic change from day to night. In a few slides, we'll see an example of this RGB product.
The 3.9 and 8.5 µm channels, as well as the three channels between 10 and 12 µm, are window channels and can see surface features when clouds are not present. Many of these channels, when combined with other channels, will give improved information for distinguishing cloud type, cloud height, and cloud particle size. When there are no clouds, combined image channels will help distinguish clear moist vs. clear dry conditions, volcanic ash, and dust.
34. Graphic: 3-14 um channels for various sensors
Frame 1. First I want to orient you to what you are looking at. On the y-axis is brightness temperature, going from warm at the bottom to cold at the top. The data that make up this graph come from the hyperspectral AIRS instrument (Atmospheric InfraRed Sounder) on the polar-orbiting satellite Aqua. It measures a couple thousand (2378) narrowly spaced wavelength bands in the shortwave to longwave region (approximately 1 band every 5 nm). We are looking at examples of opaque targets: 1) a surface, which is clear ground in Nebraska during the day, represented by the open black circles, and 2) a deep convective cloud at night, represented by the open purple diamonds. Each of these series of points represents the spectral information for one pixel (~13 km x 13 km) in an AIRS scene. Example pixels from day and night are shown just to point out a few items of interest. Jog your memory and think back to basic radiation principles and the blackbody concept associated with the Planck function. We can use the Planck function to convert from radiance to brightness temperature. If our surface and cloud were blackbodies and the atmosphere between the satellite and the surface did not absorb, we would see a straight line of measurements across the spectrum. But what are we seeing? For the cloud top, we are seeing something close to that, but not exactly.
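The radiance-to-brightness-temperature conversion mentioned above can be sketched with the monochromatic Planck function and its inverse. This is a generic textbook calculation, not the operational AIRS or GOES calibration (which applies per-channel band-correction coefficients):

```python
import math

H = 6.626e-34   # Planck constant, J s
C = 2.998e8     # speed of light, m/s
K = 1.381e-23   # Boltzmann constant, J/K

def planck_radiance(wavelength_m, temp_k):
    """Blackbody spectral radiance (W m^-2 sr^-1 m^-1) at one wavelength."""
    a = 2.0 * H * C**2 / wavelength_m**5
    b = math.exp(H * C / (wavelength_m * K * temp_k)) - 1.0
    return a / b

def brightness_temperature(wavelength_m, radiance):
    """Invert the Planck function: radiance -> equivalent blackbody temperature (K)."""
    a = 2.0 * H * C**2 / wavelength_m**5
    return (H * C / (wavelength_m * K)) / math.log(1.0 + a / radiance)

# Round trip at 11 um: a 280 K blackbody scene comes back as 280 K
wl = 11.0e-6
bt = brightness_temperature(wl, planck_radiance(wl, 280.0))
print(round(bt, 1))  # 280.0
```

This is why an opaque blackbody target with no intervening absorption would plot as a flat line of brightness temperature across the spectrum on the AIRS graph.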
Frame 2. Between 9 and 10 µm, we see the effect of absorption by ozone. This occurs in the stratosphere, which from the satellite perspective is warmer than the cloud tops. When looking through to the surface, this region appears colder than the surface, but warmer than the cloud. In the 3.9 µm region of the cloud, we see the effect of noise in the measurement at cold temperatures: points scattered over a 30 K temperature range. Because the surface measurement is from a day scene, in the shortwave 3.9 µm region the measured temperatures are warmer than those in the longwave IR region. This results from the reflected solar radiation at this wavelength. Where we have absorption in the atmosphere by H2O vapor and CO2, we see different patterns. For the cloud top, the satellite primarily sees the influence of the cloud top. For the clear-sky scene, in both the water vapor region and the CO2 absorption regions, the satellite does not sense all the energy being radiated from the surface. It sees colder temperatures that represent absorption at levels higher in the atmosphere. In this SHyMet series, Scott Bachmeier talks a little bit more about why we have the scatter of observation points in the water vapor region, so pay attention when you are listening to Scott's presentation.
Frame 3. Where are the current GOES channels? On current GOES, we have 5 channels in this region; note that we can only view 4 of them at one time, and which 4 we use depends on the satellite. Point out channel differences between GOES 8-11 and GOES 12+.
Frame 4. On GOES-R, there will be 10 channels in this region. As I mentioned, some of these channels are heritage from the GOES Sounder, so for those channels we will have significantly increased spatial and temporal resolution. The 8.5 and 10.3 µm channels will be new to GOES.
Frame 5. What MODIS channels can we take advantage of to explore capabilities of GOES-R? There is fairly good coverage; lacking only in the water vapor region and at 10.3 µm.
Frame 6. What MSG channels can we take advantage of? MSG has good coverage across the spectrum. The channels are broader due to the nature of their sensor design. I would like to reiterate that information is not sensed equally across what is displayed here as a band for the channels. In general, information close to the center of the band is weighted more than information on the edges.
35. VIS/ Near-IR / IR Ex: Fire/Smoke
(Note the link to turn on the overlay to see the placement of the channels on the wavelength spectrum)
Go to Frame 2 first. (0.64 µm displayed at 0.5 km) In certain parts of the country, being able to better detect fires and areas burned, as well as where smoke disperses, is very important for real-time hazardous situations. Some burn conditions create surfaces conducive to rapid water runoff, and knowing where the burned areas are will be helpful, particularly under future heavy precipitation events. Some of the new channels and increased spatial resolution on the ABI will help in these areas. This is an example from the MODIS Terra satellite of fires in Southern California in October of 2007. We are looking a little off center of the swath and are seeing the bowtie effects of scanning. (This is explained in Scott Bachmeier's MODIS Imagery in AWIPS presentation.) This is what we would see with the full-resolution half-km visible at 0.64 µm (red). We're seeing more detail than we currently see with GOES. It will be great to see something like this every 5 minutes.
Frame 1. (0.47 µm - 1 km displayed at 0.5 km) The whole scene looks hazy. Part of the explanation is that it is hazy: at short wavelengths, there is molecular scattering by atmospheric constituents. Note from the inset graphic that this is represented by decreased transmittance; molecules similar in size to the wavelength of measurement scatter light. The other part of the haziness is light scattering by the small smoke particles. A sample use noted for this channel is daytime aerosol detection over land. Indeed, we can see the smoke over land quite well.
Frame 2. (0.64 µm) This image is being shown again in sequence to demonstrate there is less background molecular scattering in the atmosphere and we can see surface features and smoke cloud features more clearly.
Frame 3. (0.8 µm - 1 km displayed at 0.5 km) Notice that the background surface has lightened up. Most of this corresponds to vegetation. Remember from the graph (slide 31) that vegetation becomes considerably more reflective (or brighter) once you cross 0.7 µm. The smoke becomes less visible as the wavelength of observation increases (the small smoke particles become less effective scatterers at longer wavelengths). With these two effects (vegetation more reflective and smoke less visible), it is more difficult to see the smoke over land. Because the ocean surface has very low reflectance, the smoke is still visible over it. A sample use for this channel is aerosol detection over water.
Frame 4. (1.61 µm - 1 km displayed at 0.5 km) As we move out to longer wavelengths, the smaller smoke particles are less effective at scattering the light and the image appears clearer. Some of the background surface reflectance characteristics have also changed. We are also starting to see another effect. Can anyone guess what these little white dots are? Yes, they are associated with intense fires. Moving along to the next slide...
Frame 5. (2.25 µm - 2 km displayed at 0.5 km) At 2.2 µm, our spatial resolution degrades. Even with spatial degradation, white spots are more noticeable. (The spots have not saturated.) We are getting out into the tail region of solar energy and into the tail region of the infrared emitted energy. The satellite sensor is just measuring energy at a particular wavelength range and cannot tell whether it is solar or emitted. When we do see an increase of energy here from fires, it means that the fires are hot.
Frame 6. RGB [2.2, 0.8, 0.6 µm] This combination is the same one we saw for snow. It is from the MODIS Rapid Response Team and is similar to the MSG combination. It shows smoke, green vegetation regions, reddish/brown burned areas. We are viewing many reddish brown areas. Are they all from this burn? We can look at other imagery to determine where the hot spots are.
Frame 7. (3.9 µm 2km displayed at 0.5 km). Here we are seeing a number of hot regions that stand out. This MODIS channel saturates at 564K; ABI specifies saturation at 400K; current GOES saturation temperature is 330K. This image shows many fires that saturated at 564K. We will see better characterization of fires with the ABI in terms of intensity and spatial representation.
Observed warmest fire temperatures: upper-left fire 409 K; upper-right fire 564 K; other big fire 564 K; fire lines 355-462 K. Background temperatures: over water 288 K; mountain tops 287-301 K (avg. 292 K); warm areas without fire 309-319 K (avg. 315 K).
Frame 8. (11.2 µm - 2 km displayed at 0.5 km) The 11 um channel doesn't have the temperature sensitivity of the 3.9 um channel, but it will give us improved information because it will have higher spatial resolution.
Frame 9. (Brightness temperature difference [3.9 - 11.0 µm]) Doing a simple brightness temperature difference between the 3.9 and 11.0 µm channels results in a good view of where there are significant hot spots.
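The brightness temperature difference idea is simple enough to sketch directly. This is an illustrative difference-and-threshold sketch, not an operational fire algorithm; the 10 K threshold and the tiny example arrays are assumptions:

```python
import numpy as np

def hot_spot_mask(bt_39, bt_110, threshold_k=10.0):
    """Flag pixels where the 3.9 um brightness temperature exceeds the
    11.0 um value by more than threshold_k kelvin.

    The 3.9 um channel responds much more strongly to subpixel hot
    sources than 11.0 um does, so a large positive difference suggests
    a fire.  The threshold here is illustrative, not operational.
    """
    return (bt_39 - bt_110) > threshold_k

# Tiny example scene: each row has one quiet pixel and one hot spot
bt_39 = np.array([[300.0, 340.0], [295.0, 310.0]])
bt_110 = np.array([[298.0, 300.0], [294.0, 299.0]])
mask = hot_spot_mask(bt_39, bt_110)  # flags only the two hot pixels
```

Real algorithms also account for solar reflection at 3.9 µm during the day, which is why background land surfaces show a modest positive difference even without fires.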
Frame 10. (False RGB [BT3.9-11.0, 0.8, 0.6]) If we take the brightness temperature difference and combine it with the visible channels that highlight the vegetative/burned component, we obtain this False RGB view. This particular scene shows many of the aspects of the fire that we looked at in individual channels (smoke, green vegetation, active fire areas, burned areas). Will this be a GOES-R product? I don’t know. Products like these will be introduced into offices via the proving ground and we expect to get feedback on their effectiveness and see what other areas they may or may not be beneficial for.
The AWG is testing algorithms for fire detection, size, and intensity determination using the improved spatial resolution of the 3.9 and 11.2 µm channels and the greater dynamic range of the 3.9 µm channel. They are also checking out the utility of information from new channels such as the 2.25 µm.
There won’t be a green (0.55 µm) component on GOES-R, but perhaps we can simulate it. 'True' color satellite imagery is produced from MODIS because it has the red (0.65 µm), green (0.55 µm), and blue (0.47 µm) components. Many believe that simulated true color imagery is preferred by analysts over panchromatic visible because it is visually intuitive, less ambiguous, and has higher information content (feature recognition). It is produced by synthesizing the missing green band: the green is created from the correlation of the red (VIS 0.65), blue (VIS 0.47), and near-IR 0.86 channels with the green (VIS 0.55) on MODIS. The example depicts a wildfire scene in Southern California (October 2007). A MODIS image is on the left, a simulated true color image (approximate image) is in the center, and a difference image is on the right. Most of the differences are below 10%; only areas under heavy smoke cover show differences up to 25%.
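One simple way to realize the correlation approach described above is a linear regression: fit green reflectance as a combination of red, blue, and near-IR on a sensor that has a true green band, then apply the coefficients to a sensor that doesn't. This is a minimal sketch of that idea, not the actual published method; the linear model form and the synthetic data are assumptions:

```python
import numpy as np

def fit_green_model(red, blue, nir, green):
    """Least-squares fit of green ~ a*red + b*blue + c*nir + d,
    trained on pixels where a true 0.55 um green band exists."""
    X = np.column_stack([red, blue, nir, np.ones_like(red)])
    coeffs, *_ = np.linalg.lstsq(X, green, rcond=None)
    return coeffs

def simulate_green(red, blue, nir, coeffs):
    """Apply the fitted coefficients to a sensor lacking a green channel."""
    a, b, c, d = coeffs
    return a * red + b * blue + c * nir + d

# Synthetic demo: the "true" green here is an exact linear combination,
# so the fit recovers it; real regressions are only approximate.
rng = np.random.default_rng(0)
red, blue, nir = rng.random(100), rng.random(100), rng.random(100)
green = 0.50 * red + 0.30 * blue + 0.10 * nir + 0.05
sim = simulate_green(red, blue, nir, fit_green_model(red, blue, nir, green))
```

The residual differences quoted in the text (mostly under 10%) reflect how well a fit like this transfers between scenes, with smoke-covered areas being the hardest cases.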
37. MSG RGB Ex: low clouds in Winter IR channels
The regular fog product (BT 3.9-10.7) does not work as well in the winter when there are very cold temperatures (3.9 µm has more noise at cold temperatures). Our European counterparts have found that the 8.7 µm channel, when combined with the 10.8 µm channel, detects clouds in a similar manner as the 3.9 µm differenced with the 10.8 µm channel, but without the noise at cold temperatures. Here is an example of the product they create. (Are you noticing that they use a lot of RGB imagery?) This example is from February 2008 and was put together by Mária Putsay of the Hungarian Met Service. This product is the 24-hour cloud microphysics RGB image. (Note the link to turn on the overlay to see the placement of the channels on the wavelength spectrum.) It uses the brightness temperature difference products 12.0-10.8 and 10.8-8.7 along with the BT for 10.8 µm. The new channel is the 8.7 µm; and even though the 12.0 µm channel is not new to us, it disappeared from GOES starting with GOES-12 and will not return until GOES-R. Light yellow/green color indicates fog or low clouds; pink indicates clear land. High-level ice clouds appear reddish brown (thick cloud) or black (thin cloud). We see clear areas in France and Southern Europe, and fog/low cloud over the UK, Germany, Hungary, and Romania. Because this RGB doesn't have the 3.9 µm component and can be made for a 24-hour period, it is particularly useful for creating night-day loops. Our European counterparts tell us this is the best combination to use for fog/low cloud detection during twilight.
38. IR channels: MSG RGB Ex: low clouds in Winter loop
In the animation one can nicely see the spreading valley fog in Germany and the eastern part of France. One can also notice the formation of high-level lee clouds over the Southern Carpathian Mountains (persistent for several hours) and later over the mountains of Bulgaria and the northern part of the Czech Republic.
39. Aerosol / Dust Optical Thickness Retrieval
Here is another application of MSG SEVIRI data that we can benefit from with GOES-R data. On the right is an RGB image using the same combinations as the previous fog/low cloud example. It uses the three channels 8.7, 10.8, and 12.0 µm in brightness temperature difference combinations and as an individual component (12.0-10.8, 10.8-8.7, 10.8). It looks similar to the previous example for some features (dark red for opaque cloud tops), but different for others (here the background land surface is whitish blue, not pink, and the dust is pink). When using channels linked to temperatures, the colors and interpretation will change to reflect the season, location, and in some cases, the emissivity of the background surface. But after looking at the imagery over time, you will become accustomed to what it represents. On the left part of this example is a retrieval of the dust loading associated with the event. This is another example of algorithms being developed/adapted and tested through the AWG.
40. IR channels: Dust RGB from MSG 3 March 2004
credits (note the link to turn on the overlay to see the placement of the channels on the wavelength spectrum)
41. IR channels: Dust RGB from MSG 3 March 2004 - Loop
42. IR channels: Ash and SO2 RGB from MSG - May 2008
Chaitén eruption in Chile. Here is a third example of the same RGB combination: different application, different location. The BT difference 12.0-10.8 is well known as an aid in distinguishing ash from other meteorological clouds, and the 8.7 µm channel is used to detect sulfur dioxide from volcanic eruptions as well. The SO2 plume appears as a yellow-green color and the ash takes on a pinkish appearance.
43. IR channels: Ash and SO2 RGB from MSG - May 2008 - Loop
This is a 24-hour period showing imagery every 2 hours. The volcano erupted a number of times over several months. We see ash and SO2 in the South Atlantic going up and over the ridge (this is the Southern Hemisphere!).
What is the benefit of 3 water vapor channels on GOES-R? One application depicted here relates to detection of mountain wave amplitude, based on detection of the cloud signal at different heights in the atmosphere. The first image is the observed WV band at 6.5 µm on GOES-12. (Note the link to turn on the overlay to see the placement of the channels on the wavelength spectrum.) This image is at 4 km resolution and the band is broad. The next 3 images are simulated ABI bands from MODIS imagery and are displayed at 2 km resolution. We immediately notice that the wave patterns are much clearer than on GOES-12. These figures come to us courtesy of K. Bedka and W. Feltz at CIMSS.
45. IR channels: RGB Air Mass Product with Lightning
This example combines lightning investigations with what the Europeans call their RGB Air Mass product. It has been provided by John Knaff at CIRA. For this example, GOES-R channels have been simulated using MSG imagery modified by algorithms developed under the AWG. The RGB composite uses information from 4 channels: 2 in the water vapor region at 6.2 and 7.3 µm, one in the ozone region at 9.7 µm, and one longwave infrared channel at 12.2 µm. These are used as differences, as noted on the left. Red shows the brightness temperature difference between 6.2 and 7.3 µm, scaled from -29 C to -4 C; green shows the brightness temperature difference between 9.7 and 12.2 µm; blue is the 6.2 µm channel scaled from 243 K to 208 K. In the image, cloud is white, green areas are tropical air masses characterized by low ozone levels, red colors are associated with drier regions, and blue colors indicate a cooler air mass. As we loop the imagery on the next slide, you will be better able to see this.

Hourly lightning from the World Wide Lightning Location Network (WWLLN) is shown by the red dots. John and colleagues are looking at these data now to get a better idea of how they can be used in the future, and one of the questions they are addressing is: How is lightning density related to hurricane intensity change? In the tropics, among other environmental factors, a low-shear environment is favorable for development or intensification of a storm. John has stratified the lightning data into low-shear vs. high-shear groups. Preliminary results for the high-shear regime suggest that higher observed lightning rates may be an indicator of extratropical transition at higher latitudes, or dissipation if the storm is at low latitudes.
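The channel-to-color mapping just described can be sketched as code. The red and blue scaling ranges come from the text; the green range used here (-40 to +5 K) is an assumed placeholder, since the text does not give one. This is an illustrative sketch, not the CIRA/EUMETSAT implementation:

```python
import numpy as np

def scale(value, lo, hi):
    """Linearly map [lo, hi] onto [0, 1]; lo > hi gives an inverted ramp."""
    return np.clip((value - lo) / (hi - lo), 0.0, 1.0)

def airmass_rgb(bt_62, bt_73, bt_97, bt_122):
    """Air mass RGB from brightness temperatures (K) of four channels.

    A difference of temperatures is the same number in C and K, so the
    -29 C to -4 C range from the text is applied directly to the BT
    difference.  The green range is an assumption for illustration.
    """
    r = scale(bt_62 - bt_73, -29.0, -4.0)
    g = scale(bt_97 - bt_122, -40.0, 5.0)   # assumed range
    b = scale(bt_62, 243.0, 208.0)          # inverted: colder 6.2 um scene -> brighter blue
    return np.dstack([r, g, b])

# Uniform demo scene with hypothetical brightness temperatures
rgb = airmass_rgb(np.full((2, 2), 230.0), np.full((2, 2), 245.0),
                  np.full((2, 2), 250.0), np.full((2, 2), 270.0))
```

Because the blue component uses an inverted ramp, cold upper-level moisture pushes pixels toward blue, which is how the cooler air mass wrapping around the storm shows up in the loop.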
46. IR channels: RGB Air Mass Product with Lightning - Loop
In the upper right region of the loop, we see a drier air mass. In the upper left region, we see cooler air being wrapped around the system. On the lower half of the image, we basically see the tropical air mass characterized by low ozone levels.
This loop has been provided courtesy of Li and Jin at CIMSS. Total column ozone was derived from the MSG SEVIRI 9.6 µm channel and other IR channels. This is an AWG activity: they took an algorithm developed for the GOES Sounder channels and modified it for MSG, demonstrating the flexibility of the code. The data have been applied to studies of atmospheric dynamics, reflecting the relationship between ozone and potential vorticity in the stratosphere: high levels of ozone indicate the position of the upper-level low. For air quality, this information will be used primarily as a source function in air quality and ozone prediction models.
If you recall from the satellite winds module, we can derive wind motion from the visible, water vapor, and infrared regions. With increased channels in all these areas, and increased spatial and temporal resolution, we WILL see improvements in representative winds. Better navigation will certainly have a positive impact. The higher spatial resolution will improve target selection. With multiple bands, there will be improved information at more atmospheric levels and result in better target height assignment. Increased temporal resolution will allow for better target tracking. (NEdT = noise equivalent delta temperature - refers to instrument induced noise level.)
49. No Dedicated Sounder on GOES R-S
There will not be a dedicated sounder on GOES-R and S. As mentioned earlier in this presentation, the sounder component was removed from GOES-R and S because of budget constraints; it has not been completely excluded from consideration for inclusion on later satellites in the series. The question that often comes up is: Will the products based on the ABI provide an adequate substitute for legacy sounder products? The short answer is yes. Adequate substitute products can be generated from ABI data in conjunction with information from short-term numerical model forecasts. This was written up by Tim Schmit and colleagues in a recent journal article, and you can check that out for more information. You can also obtain more information on the benefits of a hyperspectral sounder by referring to the COMET modules listed at the end of this session; one has already been produced and another is scheduled to be released in early 2010.
What 3 points do we want you to remember? There is more to GOES-R than the Advanced Baseline Imager. There IS a new imager with many new channels. If you want to keep abreast of what is happening with GOES-R, one of the first places to start is the web page. You don’t even have to remember the address - just google GOESR.
If you want another graphical representation to represent the number and types of ABI channels, refer to this.
(probably the more important ones)
More links directly related to imagery and imagery products
58. Acknowledgements/Contributors
As mentioned on the title page, there are many contributors. We’ve likely left some out, but not intentionally.
Thank you for looking through this presentation. Feedback is always appreciated. We like to hear what you liked and what you think could use improvement.