All right, welcome back, everyone, to part two of this training on NASA atmospheric composition ground networks supporting air quality and climate applications. Today we're going to continue with AERONET and do some hands-on analysis of data. Again, I'm Dr. Carl Malings, joined again by Dr. Pawan Gupta, one of the AERONET co-leads, and today we're also joined by Peter Gorov, a scientific programmer from Science Systems and Applications, Inc. here at Goddard who supports the AERONET program. In this part we'll show you how to access relevant AERONET data for given locations and different application purposes in a variety of ways, and also how to compare and jointly analyze AERONET data together with relevant satellite aerosol optical depth data products from the VIIRS instrument for a given location and time.

Just as a review: in the last session we learned about AERONET. It's a passive remote sensing network measuring aerosol optical, microphysical, and radiative properties from more than 600 active sites around the world, giving total-column aerosol information. If you have any questions during this training, please put them in the questions box in the WebEx tool, and we'll address them at the end of the webinar and also in a document we'll post to the training web page after today's session.
Now let's hand it over to Dr. Pawan Gupta to talk about hands-on analysis of AERONET.

Okay, great. Welcome back, everyone, to part two of this webinar series, and thanks to Carl for the introduction. In part one we learned a lot about the AERONET network: how it makes measurements, what kind of instruments we use, what different geophysical parameters on atmospheric aerosols are produced by AERONET, and how the international community comes together to run this great network, which has been providing data for the last 30 years. In part two we're going to start looking at some of the AERONET data using some of the online tools we have in-house through AERONET. We'll go over specific websites and walk you through some of the tools, and I'd recommend everyone have a web browser open so you can actually walk through with me. In between, I'll also ask you to do a small exercise as we move through part two.

Let me quickly start with the slide deck we've created as reference slides for this part of the presentation. We won't use a lot of the slides; instead we'll do an online demo and walk through the different tools, but first I want to quickly go over
this slide deck, which we've prepared as reference slides. When I say reference slides, I mean that if you miss something during the demo and want to come back later to grab a link or check what a particular part of a website or tool does, you can come back and look at these slides.

This is the list of things we're going to do today: data display, data download, and playing with some Jupyter notebooks, Python codes to visualize AERONET data in different ways. Quickly: this is the main page of the AERONET website, and I'll walk you through its different components as we go. This slide shows where the data section is located on the AERONET website, another good reference. This is how the station data file looks, and I'll show you how to access it. This is the site information page; we'll walk through it in the live demo. This is the data display page; it shows very quickly what to look for, but when we do the demo we'll walk through and understand each of these plots more carefully. This is the data download page, showing how to download the data. This is the AERONET Data Synergy tool, where we try to bring AERONET data together with other satellite data, real-time imagery, and some model output, to look at them jointly and understand a specific atmospheric phenomenon or atmospheric aerosol properties.
This is a very new, still-developing tool for mapping through the API, where we can map the aerosol data from AERONET sites all around the world. More recently, AERONET data has also been incorporated into the NASA Worldview tool. Anyone who uses satellite data for applications is familiar with NASA Worldview; it's one of the most popular tools for visualizing near-real-time satellite data, and AERONET has just been added to it. These slides give a bit more on the data API: for people who are more data savvy and want to access the data in automated ways, the API is the way to go. And this is the part where we have some Jupyter notebooks in Python that we've put out on GitHub; they perform different operations, and my colleague Peter will go over this part with examples of what you can do with those Python codes. You can map in different projections, plot time series, make tile plots over different time windows, do some trend analysis using simple or annual mean numbers, and make calendar plots; we have several Python codes that do these tasks. Finally, there's how we collocate AERONET data in space and time with satellite data; in this case we have an example code that does this for the VIIRS data, which we'll demonstrate using the Jupyter notebook. These are some specific details on how to connect with our code, and so on. We've also included a couple of
slides to guide you through getting the aerosol data from the satellite side, because our last code uses satellite data and we want to make sure people know where to get it, although we assume that anybody comparing AERONET and satellite data already has a good understanding of where to get the satellite data. Okay, I think that's all for the slides. Before I move on to the actual demo, let me pause for about ten seconds so people can get into their browsers and get ready to walk along with me.

Okay, great. Before we start on the web demos, let's find the AERONET site. The AERONET website is easy to remember: it's aeronet.gsfc.nasa.gov. But if you can't remember it, the best way is to just search "AERONET NASA" in Google or any other search engine you like; the first result will come up as the AErosol RObotic NETwork homepage, and if you click on that you will reach this
page. Now, let me make the fonts a little bigger so that everybody can see more clearly. When you go to this website, this is our homepage. At the top we have the different components of AERONET we talked about in part one; this gives a little introduction to AERONET and describes what the different components do. This is the map showing all the stations, and down at the bottom we keep this section updated with news and announcements. For example, NASA AERONET data recently being added to Worldview, which I mentioned earlier, is a news item here. There's information about maintenance issues for people who use the data on a regular basis, an announcement for a big meeting coming up in September, and the newsletter. Sometimes we also put out quick analyses of interesting aerosol events that are happening; this one is from last year, when the Canadian wildfires happened and a lot of smoke was transported here to the East Coast of the US. We put together how AERONET actually measured those smoke aerosols in the atmosphere and how it impacted air quality on the East Coast. So there's a lot we post here from time to time, and anybody who's working with AERONET data should watch this space for news and updates.

One more thing I want to point out: on the homepage you can see this yellow-highlighted part. If you're interested in receiving regular updates about AERONET and anything
specific to AERONET, you can click on that link and join the AERONET listserv, and you'll get regular updates from the team. On the left you'll see the different parts: networks, campaigns, collaborators, data, logistics, all listed here. I'll go over some of these in more detail, but let me go through the homepage first. This section is the AERONET data section, so anybody interested in accessing or displaying data should watch this vertical panel on the left side, where you have the visualization tools and the data display and download in different ways. These are divided into the different types of data sets we talked about in part one: the direct-sun measurements of aerosol optical depth; the inversion data products like single scattering albedo and others; some other things like solar flux; the ocean color data we talked about, AERONET-OC; the nighttime data using the lunar measurements; and then some other things like cloud mode, which we haven't talked about in
great detail, but those are more experimental data sets. Okay, before we move on, let me show you one thing that people who use a lot of data often rely on: what we call the site list. If you click on the text format, you'll get a list like this, which combines all the sites AERONET has ever created. It has the site name first, then the longitude, then the latitude, then the elevation in meters. This list is regenerated, I think, every day, so it stays updated; if you look at the counts, I think there are 1,600-some sites that have ever been created. This can be a good source for people who want to analyze global data sets. The site list is also available in Google Earth format, so if you download that and display it in Google Earth you'll see all those dots all over the world, and if you click on the all-sites list you can get the same list in Google Earth format.

There's also a text-format list by year: you can generate this list for any year, so let's click on 2024, and you'll get a list like this. It's very similar to what we saw earlier, but in addition to the station information it also has information on data availability. The way data availability is displayed, we set up a flag of one or zero for each month: if there is data available for that particular month it says one; if there's no data available it says zero. This flag for each month can be very useful for people who want to analyze and understand how much data is available from a particular site in a particular year. There are many lists here that people can browse through, and if you'd rather work with them programmatically, a minimal sketch of reading the text-format site list into Python follows below.
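Here is a minimal sketch of loading the site list with pandas. The file URL and the number of header rows are assumptions based on the text-format page described above; check them against what your browser shows.

```python
# Minimal sketch: read the AERONET text-format site list into pandas.
# The URL and the skiprows value are assumptions -- verify against the page.
import pandas as pd

SITE_LIST_URL = "https://aeronet.gsfc.nasa.gov/aeronet_locations_v3.txt"  # assumed path

# Columns: Site_Name, Longitude(decimal_degrees), Latitude(decimal_degrees), Elevation(meters)
sites = pd.read_csv(SITE_LIST_URL, skiprows=1)

print(len(sites), "sites ever created")
print(sites.head())
```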
Okay, after browsing those lists, let's go back to the homepage. The first thing I want to go over is what we call site information; if you click Site Information in the left-side panel, you will reach this page. You'll see a notice here: if anyone is interested in establishing a new AERONET site near them, they can contact us and we can see whether it's feasible. Just remember that we cannot fulfill every single request; there are many factors that determine where we'll put an AERONET station, but this is the way to start the process.

Now, we have a map similar to the one on the homepage, but this map is very interactive: you can zoom in and out to any part of the world, and there are four buttons at the top. By default all stations are displayed, but you can also display only active stations. Remember, in part one we talked about how, although 1,700 to 1,800 stations have been deployed over the entire history of AERONET, currently about 600 stations are active. So if I click Active, these are the sites that have been active in the last year; the way we define active sites is sites that have produced any data in the last 365 days. Then there are inactive sites, which have not produced any data in the last year, and then there are some sites that are limited. These limited sites are sometimes created to test an instrument or to make measurements for just a few days; sometimes they're specific to a field campaign or something similar, but they're very, very rare. For limited sites, I think the definition is that if the data is available for less than
10 days, we don't call them a permanent or campaign site; they go onto the limited-site page. The most important one is the active-site page. Now, at the bottom of this map you'll see a list of all the sites, and as you start zooming into the map you'll notice that the station list shrinks: it only shows the sites displayed on the map. I'm going to go to the NASA Goddard area, where we're located and from where we're doing this webinar series. I've already selected all the active sites, so this is our area, and if I zoom in on the map you'll see there are only nine stations displayed. This particular site is the one where we're located; it's called GSFC. There are actually two sites at the same location: GSFC, which is the older one, and more recently GSFC_Polar, a polarized instrument co-located at the same site. If I click on GSFC, I get to this page; let me zoom out a little so you can see what's on it. You have the site name, the coordinates, and the elevation, the same information we saw earlier in the file list. Here's the picture; typically we have two or three pictures. Since this is our calibration center, we have many, many instruments mounted here for calibration purposes, which get swapped out on a daily basis. There's a site description, and there's contact information if you
want to contact the PI or site manager. We have this information for each site. So let's go back to the map. Now, as an exercise, I want you to take two minutes of your time, zoom in to the area where you are physically located, find the site nearest to your location, and note down its name on a piece of paper or in a notepad on your computer. That is the site we'll use throughout the rest of this presentation to evaluate or download data. So I'm giving everyone two minutes to browse this map, find the site closest to you, and note down the name. Thank you.

Okay, great. I hope everyone with an internet connection has explored the site information page and found the nearest AERONET site. Remember that site name; we'll use it later. Now let me also go through some of the other pages on this website. We have
a link that says Campaigns. Again, I talked about this in part one: AERONET supports several NASA and non-NASA field campaigns specifically designed to make measurements, sometimes for air quality applications, sometimes for climate applications, sometimes to understand more about the behavior of atmospheric aerosols in chemistry and the energy budget. AERONET has been taking measurements in support of field campaigns since the beginning of the program, as you can see, in 1993, and since then it has participated in many, many field campaigns. Starting in 2011 we began deploying the high-density network called DRAGON, and the most recent deployment was for ASIA-AQ.

I'll give you a flavor of how this campaign information looks on our web page. Once you click on a campaign page, you get some background information about the campaign itself, and then we display and provide the near-real-time data, not only to the people doing the campaign on the ground but to the community as well. This data helps with flight planning: if they're flying aircraft, they can change their flight plans based on where the aerosol events are happening, using the AERONET data, which becomes available almost in real time. During ASIA-AQ we deployed in South Korea, the Philippines, Taiwan, and Thailand; these locations were set up specifically for the field campaign, and there are additional permanent sites in those regions of the world that you don't see here. From the same page you can also download the data, and you can get the list of locations. There's a map where the new sites are displayed in blue and the permanent sites in green, very similar to what we saw earlier; if I click on the campaign, you'll see just the blue sites that fall specifically within it. And then there are more details and contact information. So if you're interested in field campaign data sets, this is the AERONET contribution to those campaigns. Okay, now
let me go to Operations. Operations is a specific page that may not be so useful for data users; it's more for the site managers and principal investigators of each site, the people who actually manage the instruments at remote locations, universities, institutions, or individual houses. What it does is provide a lot of information about the health of the instruments. If I click on, for example, Instrument Status, these are the different AERONET partner networks; remember, we talked about AERONET being a network of networks. If I click on AERONET, you'll see all the instruments that are deployed, and you'll see many, many stations listed with GSFC, because these are all deployed here for calibration purposes. You'll see a lot of specific details about those instruments; again, not so useful for data users, more for site managers and PIs. Okay, one of the most important things is the system description. We talked
very briefly in part one about what kind of measurements and what kind of sun photometer instrument AERONET uses; here you'll get a lot more detail. You can click on Measurement System and learn all about the CIMEL instrument: what each number in its name means, the different wavelength combinations, all the specifics including communication. In the calibration section you can read about the entire calibration process, which I went through briefly in part one; if you're interested, you'll find a lot more detail here. There's an operations page and a data transfer page with lots of specifics if you're transferring data through GSM, internet, or satellite. There's a description of the data processing setup, how the data is processed once we receive it and how it moves through the different levels, and finally the data distribution, which I'll go over a little more. So that's what System Description offers. Now let me take you to one more important topic, Logistics. Again, this is not so much for data users
but for the people who manage the sites; still, it has a lot of important information. We have the manuals for all the different models of sun photometer we've ever used, and several instructional videos on troubleshooting and installing the instrument. We have some tools for doing data transfer. We also do a lot of shipping: as I mentioned in part one, the system has to come back every 12 to 18 months to NASA Goddard or another calibration center, so a lot of shipping happens back and forth between the sites and the calibration centers, and there's a lot of information about that here. Again, not so useful for data users, but useful for site managers and PIs.

One more thing: Publications. Since the beginning of the program there have been a lot of research articles published by the AERONET team and AERONET partners from all around the world, and we've tried to archive them here. For most of them you can go directly to the links; some very old ones you can find through the search. We also link to the quarterly newsletters, and we have several reports on the different AERONET data sets, with details about specific versions and specific products. There's a lot of reading material available for people who want to use the AERONET data. So those are the main parts I've gone through; there are some other things you can go over on your own, and they're very clearly written. Now
I also want to quickly show Ocean Color, the AERONET-OC component we talked about in part one. This is its specific page; it has its own news. Although it's a component of AERONET, it operates only over the ocean, and its sites, data downloads, and all the details are there. Similarly, we have the maritime aerosols component I discussed earlier, which uses handheld Microtops instruments; this is an old paper, but it describes the Maritime Aerosol Network very clearly. We provide the data sets here; they're named by the ship cruise and the time window when it happened, and you can download the data. You'll see a long list of ship cruises from which we collect the data, and the data are reported here.

Okay, that more or less completes the first part. Now I want to get into the data part: AERONET data access. Let's start with aerosol optical depth, because that is the most commonly used parameter from AERONET. The first tool is called Data Display. I'll click on Data
Display, and it takes me to this page. On the data display page, the map is here again, and you can filter it in many different ways: how many years of data are available, what level of data you want, individual years, month, day, active or inactive sites, all kinds of things. I'm going to look at the GSFC site; as you walk through with me, you can select the site you identified earlier as the nearest to your location, so instead of GSFC, pick that site. So I pick the GSFC site and click on it; the map shows the location of the site, and when I click on GSFC it takes me to the data display page for that site. Once you click, the page looks like this.

Let me walk you through it very quickly. This is the link to the site information page we saw earlier; here are some details about the people who manage the site; and here are some statistics on data availability. From this particular site we have a data record that started back in 1993 and continues today, so almost 30 years of data are available. The data are arranged by year, and you can select any year. These are the parameters you can display; I've selected aerosol optical depth first. And these are the different levels: level 1.0 is
the raw data without cloud clearing, level 1.5 is automatically cloud-cleared data, and level 2.0 is quality assured with the final calibration applied. For quantitative purposes we recommend that users use level 2.0 data. You can display daily averages or all points. Let me go to level 1.5 data, because that's available in near real time; level 2.0 data takes one to two years. This is August 2024, and today is August 2nd, so this is showing August 1st and August 2nd data, and then this is just August 2nd, with the times in UTC. You can see these different lines showing aerosol optical depth at the different wavelengths represented here. The AOD at around 500 nanometers is 0.344, which is actually pretty high for this location; there's a wildfire going on on the West Coast that may have some transported influence, but also, during summertime, AOD can sometimes be high because of the hygroscopic growth of aerosols. At the bottom you have several links; you can download the data directly from here, or there are other ways to download,
which I'll show you. So that's what the first visualization tool looks like. Now let me go to the download tool. The interface is very similar: you can select any site, so since I'm going directly to GSFC, I'll put "gsfc" in the search box and click on the station, and it takes me to the download page, where you can see various options. You can select the date range; I'm going to select 2024 for a sample download, January to December. Then there are the different types of measurements you can download, which we talked about earlier. One thing you can do here is click on the data description page, and you'll get a description of everything that's provided: how the daily averages are calculated, how the monthly data are calculated, what the different parameters mean, all the raw data and inversions. Very specific and detailed information is provided on that page. The units page gives you the units of all the parameters we provide, so you have specific units for each parameter.

Just for the sake of the demo, I'm going to choose level 1.5 aerosol optical depth, precipitable water, and the Ångström parameter. "All points" means all the measurements that were taken, and the daily average is just that, the daily average. To keep things fast, I'll choose daily average, level 1.5, January 1st to December 31st, 2024, and click on the data download. Once I click, you get to this
page, which is basically the AERONET data policy. Since AERONET is a collaboration among many countries and many individuals, we try to make sure that the people who manage the individual sites get credit when their data is used for publication or research purposes, and that's exactly what this data policy page describes: how to reference the data, and how to give credit to the people who are managing and collecting the data at that particular station before you use it in research and publishing. Believe me, it takes a lot of effort to maintain these instruments; they often develop problems, and the site managers have to watch them continuously to make sure the data continuity remains. That is the whole purpose of this data policy page.

You click Accept, and it immediately downloads the file. The files are named in this form: the initial date, the ending date of the data, the site name, and the level, 1.5 here. They're downloaded as ASCII files; I can open one with Excel or a text editor. I'll open it in a text editor just to show you how it looks, but these are basically CSV files, so you can easily open them in an Excel spreadsheet. This is the header of the file, where you'll see all the details about the data set and the different columns; you'll notice there are many, many columns, but this provides everything you need to use the data for various scientific research or application purposes. It's a very simple format, very easy to use, and very easy to download. If you prefer to work with the downloaded file in Python, a minimal sketch follows below.
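As a quick illustration of that last point, here is a minimal sketch of opening one of these downloaded files with pandas. The file name is the hypothetical product of the example download above, and the number of header lines to skip varies by product, so treat skiprows as an assumption and count the header lines in your own file first.

```python
# Minimal sketch: load a downloaded AERONET file (CSV under the hood) into pandas.
import pandas as pd

path = "20240101_20241231_GSFC.lev15"  # hypothetical name from the example download

# The metadata block above the column-name row varies by product;
# adjust skiprows after inspecting the file in a text editor.
df = pd.read_csv(path, skiprows=6, encoding="latin-1")

print(df.columns[:10])  # first few column names
print(df.head())
```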
Okay, now let's say you want to download all the global data, for the entire time frame, everything together. We have that as well; it's called downloading all the data, for all sites and all times, and it's provided in two formats, tar.gz and zip, so depending on your requirements you can download either. They're again separated into all-points data, daily averages, or monthly averages, and these are the different parameters, SDA, aerosol optical depth, whatever you need, and users' guides are also given here. So this is another way to download; it's a huge file. When you click download you'll get several gigabytes, so it's a bit bigger, but you get everything in a single file.

Climatological tables: this is another piece of information we provide, with a very similar interface to what you saw earlier. What it does is provide climatological values for each site. Again, this is the data policy; I'll click Accept, and you get to these tables. What these tables provide is the climatological
mean for each month and year. Over the entire time window, for each month, it shows the aerosol optical depth; this is the yearly mean, this is one standard deviation, this is the Ångström exponent and its standard deviation, this is precipitable water vapor and its standard deviation, these are the number of days used to compute each average, and these are the number of months used in the annual average. This is for the entire time window, but you can also get into each year; this is 1993. And remember, whenever we don't have data we don't provide it; we don't try to fill gaps. If any month is missing, yearly means are not calculated; users can calculate their own annual mean, but we don't provide it here. You can download the data using the files here, overall or combined downloads. This climatology is very, very useful; the values are calculated with a lot of precaution and quality control, and people interested in trend analysis can get these quantities easily. Okay, so now
the next thing I want to do is click on Web Services. Through web services we have data access through an API, again for different types of data; we're going to look at the AOD and SDA products. These are the different parameters and their explanations in the API, and at the bottom are different examples of how the API is called. Say I want to get level 1.5 data from all available sites for AOD, or level 1.0 data for one site: let's click on this particular link. This is a web link you can actually define yourself. To get the data for the Goddard site, I'll change the site parameter in the link: it says site=Cart_Site, and instead of Cart_Site my site name is GSFC; you can change it to the site name you noted down earlier. Once I change the site name and re-run, the site is GSFC. This example is for year 2000 data, so I'll change it to get data for 2024. Then, this AOD10 parameter means level 1.0, so I change it to AOD15=1, which means I'm getting level 1.5 data. So this is level 1.5 data, all the points given for that site; this is GSFC, this is the day and month, and since we set AVG=20, it gives daily averages; if I set AVG=10, I think it gives all points, and indeed it does. This API is very useful for people who want to access data in more automated ways, and we'll show an example of how we use this API in Python code when we get to that part. As a preview, a minimal sketch of calling the API follows below.
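Here is a minimal sketch of that API call from Python, following the URL pattern shown on the web services page. The endpoint and parameter names mirror the example link we just edited, but double-check them against the API documentation; the if_no_html parameter and the number of header lines to skip are assumptions.

```python
# Minimal sketch: call the AERONET web-services API and load the result into pandas.
from io import StringIO

import pandas as pd
import requests

url = (
    "https://aeronet.gsfc.nasa.gov/cgi-bin/print_web_data_v3"
    "?site=GSFC"                      # replace with the site name you noted down
    "&year=2024&month=1&day=1"        # start date
    "&year2=2024&month2=12&day2=31"   # end date
    "&AOD15=1"                        # level 1.5 aerosol optical depth
    "&AVG=20"                         # 20 = daily averages, 10 = all points
    "&if_no_html=1"                   # plain text instead of an HTML page (assumed)
)

resp = requests.get(url, timeout=60)
resp.raise_for_status()

# A few descriptive lines precede the CSV header; skiprows is an assumption.
df = pd.read_csv(StringIO(resp.text), skiprows=5)
print(df.head())
```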
So that was the data display and data download part. Notice that although I've only gone through aerosol optical depth, similar capabilities are built for each type of data set, and you can explore them on your own time. In the next few minutes I'll quickly walk you through two more tools. One is called Map Explorer; it visualizes the data, either near-real-time data or daily mean data. You select a mode; it starts in near real time, and if I select daily, the day/night flag goes away. Let me select yesterday, so you can actually map the aerosol optical depth over the entire globe. Since we know there have been a lot of fires going on in the US, I'll go back a few days and look at July 26, 2024, for example. You can see some of the stations; let me zoom into this area. There are really high aerosol optical depths in this particular part. The gray stations show sites that were inactive on that particular day, and I can turn them on and off if I want. Now, if you mouse over any of these sites, you'll see specific details about the site, and it can take you to the data page from here too. But if you click on one, you'll get a time series of the data from the last 30 days. You can see how at this particular site the aerosol optical depth was very low earlier in July, with AOD values less than 0.1, very clean conditions; as soon as the fires started, the AOD started jumping up,
and then it reached all the way to 1.24. These are daily mean values; if you look at hourly values, they can be much higher. So this is a nice tool to visualize the data. If I switch to near-real-time mode, it takes the current time and displays the data available in the last hour or two, and you'll notice it also displays a day/night separator: the gray-shaded areas are nighttime and the rest is daytime. AERONET is an optical measurement, so most of our sun data are taken during the daytime; there are some nighttime data which we don't display here, but we do have nighttime data as well. So again, a very nice tool if you want to visualize data on a map.

Okay, now let me go to the Synergy tool: go to data visualization and click on the Synergy tool. I don't want to spend a lot of time here, but you can do similar things. You can choose the date, time, and site information, and then you can
select different things to visualize. Some of them may not be readily available, and some will be. Take the AOD inversion products: you can select which parameter to display; you can choose water vapor or the Ångström parameter, and there are the size distribution, refractive index, and other parameters. For this particular date we probably don't have the data available, so let me switch to level 1.5 here as well and pick another date. Now, for January 14, you can see some other parameters: this is the refractive index, and this is the size distribution. If you click on that, you'll get a size distribution showing two modes in January, one dominating in the smaller-size particles and one in the coarser-size particles, and depending on the time of year these size modes can change. So let's look at the size distribution right now, during the fires, when we have some fire impact: the size distribution looks a bit different; the dominating mode is now the fine mode, the smaller particles, whereas the coarse mode is not nearly as dominant. So this is also a good tool for looking at several data sets together.

Okay, great. That was the AERONET website, its tools, and data download. Now I'm going to pass it along to my colleague Peter. He's also part of the AERONET team, and he has developed some of the Python tools to visualize and map data, do time series, and do
intercomparison with satellite data. He'll show you how to access these codes and how to run them using Google Colab. For this part I'm hoping everybody has a Google account; please sign in to your Google account if you haven't, because we're going to use it. Peter, take over.

Thank you, Pawan. Hi, everyone. My name is Peter Gorov, and I'm AERONET's new scientific programmer. I developed the following Google Colab codes early in my career, and we think these codes are important because they give people outside our network the capability of interacting with AERONET data and producing some useful plots. There are five codes we're going to go over. The first two read and map AERONET aerosol data onto a global map: one produces a series of static plots that can be saved onto your machine, while the other produces more interactive visuals with some widgets. Another script accomplishes a similar task, except it reads AERONET inversion data. Then another script reads aerosol data once again, but instead produces a variety of time series plots. And the final script collocates AERONET aerosol data with the VIIRS satellite data and then does a comparison.

The first code we're going to open is Read and Map AERONET AOD. Note that these codes are on the AERONET GitHub, they're open, and the link to the repository is, I think, on slide 17. Okay, now that we have this code open: the purpose of this code is to read
and map AERONET data directly from our web API. This data includes our AOD measurements taken at different wavelengths as well as Ångström exponent parameters. Different parameters can be specified by the user, such as the geographical bounding box of latitude/longitude coordinates, the type of average, whether daily or hourly, the time frame, the channel, etc. The output of this code is geographical maps with circle markers and a color bar, which represent the average daily or hourly values of the products. The main use of this tool is to track the movement and density of aerosols over time; it was first deployed in June of 2023, mapping the aerosols produced by the Canadian wildfires. This time we'll focus on the time frame from July 25th to August 1st, where we'll track some of the wildfires that are affecting the West Coast of the US and spreading from there. This module makes use of certain libraries, and the first thing we have to do once we run the script is mount our
Google Drive onto this Colab notebook; if we don't do that, we won't be able to save the produced plots. You just have to click Continue; make sure you have a Google account, as Pawan mentioned. These are some of the libraries we make use of: the NumPy library is for array manipulation; pandas is for data cleaning and processing; some of our map features, such as coastlines and state boundaries, are drawn with Cartopy, and the projections we use, such as the orthographic or the Plate Carrée projection, are imported from the Cartopy library as well. BeautifulSoup, kind of an odd name, is the most popular web-scraping library in Python: it takes the response from a website, requested using the requests module, and reads the data on the page as a text file. And GeoPandas is similar to pandas, except it's for geographical data.

So now that we've gone over the relevant libraries and mounted our Google Drive, the next part is the input parameters. The way the web API works is that parameters such as the date, the data level, the features, and the coordinates are used to construct a URL, and depending on those parameters it produces a different URL that contains the relevant data. Our initial and final dates are in four-digit year, two-digit month, and two-digit day format. As for the data level, as Pawan
mentioned, AERONET has level 1.0 data; level 1.5, which is cloud-screened and quality-controlled; and level 2.0, which in addition to being cloud-screened and quality-assured also has the final calibration applied. For the average type we have three options: we can produce daily plots, which average over each day; we can bin by hour; or we can take the time average, the total average over the entire selected time frame, which produces only one plot. The two variables viz_min and viz_max are the color bar bounds. We've defined things so that an AOD value greater than 1.0 shows as magenta on the color map, anything less than zero shows up as gray (unavailable), and anything in between follows the color ramp: dark green for low AODs, dark red for high AODs, and everything in between. A minimal sketch of this color-bar setup in Matplotlib follows below.
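Here is a minimal sketch of that color-bar behavior using standard Matplotlib calls; the exact colormap name in the notebook may differ, so treat "RdYlGn_r" as an illustrative choice.

```python
# Minimal sketch: a colormap where values above viz_max render magenta,
# values below viz_min render gray, and everything in between runs
# from dark green (low AOD) to dark red (high AOD).
import matplotlib as mpl
from matplotlib.colors import Normalize

viz_min, viz_max = 0.0, 1.0               # color bar bounds

cmap = mpl.colormaps["RdYlGn_r"].copy()   # green-to-red ramp (illustrative choice)
cmap.set_over("magenta")                  # AOD > viz_max
cmap.set_under("gray")                    # AOD < viz_min (no data)
norm = Normalize(vmin=viz_min, vmax=viz_max)
# Pass cmap and norm to scatter(), and use extend="both" on the colorbar
# so the magenta/gray arrows appear at the ends.
```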
After defining the color bar, the final parameters for data aggregation are the feature and channel selections. To select AOD or Ångström exponent, we set the feature to one or two, and after that we select our specific channel. We have dozens of different wavelengths, but the most popular AOD channel, which almost always has data, is 500 nanometers. And then here's the bounding box; the coordinates have to be in decimal format, because the web API only accepts decimal format when constructing the links. So we can run that, and since everyone can download a copy of this code, you're free to change the parameters as you wish.

The next cell validates some of the inputs and prepares them for constructing the link. First, this block takes the initial and final dates and extracts their year, month, and day components. In this block we take the user input of the data level and convert it to an integer: if it's level 1.0 it becomes 10, and if
it's level 1.5 it becomes 15; the purpose of that is that the web API reads it that way. Then there's some data verification for viz_min and viz_max, making sure neither is negative and that they're not flipped, and a last verification check that prevents users from selecting the current year together with level 2.0 data, because, as Pawan mentioned earlier, it takes a couple of months, maybe up to a year, for level 2.0 data to show up. After that is done, we construct the URL, as seen here, from all of the input parameters, and we pass this URL to the requests module, which sends an HTTP request, and use the BeautifulSoup package to convert the response to a text file that is saved; after it's saved, the file shows up in your Google Drive. Once that's done, we read that text file and assign it to a pandas DataFrame, which resembles an Excel file with rows and columns. A minimal sketch of this fetch step follows below.
This is where some of the data manipulation begins, because the date and time format of AERONET data is a little different from the conventional one. This block of code converts it to Python's datetime format, which follows four-digit year, two-digit month, two-digit day; AERONET data uses two-digit day, colon, two-digit month, colon, four-digit year. Then, depending on the average type we specify, we group the rows: by AERONET site and date if it's daily; by site, date, and hour if it's hourly; and just by site for the total average. The numeric-columns argument makes sure that string columns don't get aggregated; otherwise that would cause an error, because we cannot average non-numeric columns. And there's a check that if, for example, our DataFrame doesn't exist or there is no data, it prints a message that there's no data to parse and prompts the user to retry with different parameters. A minimal sketch of the date handling and grouping follows below.
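Here is a minimal sketch of the date handling and grouping just described. The column names follow the AERONET version 3 header, but verify them against your own file; the tiny DataFrame stands in for the one read from the API.

```python
# Minimal sketch: convert AERONET's dd:mm:yyyy dates to pandas datetimes,
# then group by site (and date/hour) depending on the average type.
import pandas as pd

# Stand-in for the DataFrame read from the web API.
df = pd.DataFrame({
    "AERONET_Site": ["GSFC", "GSFC"],
    "Date(dd:mm:yyyy)": ["25:07:2024", "25:07:2024"],
    "Time(hh:mm:ss)": ["14:00:00", "15:00:00"],
    "AOD_500nm": [0.31, 0.35],
})

df["datetime"] = pd.to_datetime(
    df["Date(dd:mm:yyyy)"] + " " + df["Time(hh:mm:ss)"],
    format="%d:%m:%Y %H:%M:%S",
)
df["date"] = df["datetime"].dt.date
df["hour"] = df["datetime"].dt.hour

# Daily averages; numeric_only keeps string columns out of the mean.
daily = df.groupby(["AERONET_Site", "date"]).mean(numeric_only=True)

if daily.empty:
    print("No data to parse -- retry with different parameters.")
else:
    print(daily)
```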
Now that we have our data processed, it's time to select the data we want to work with, and this particular cell does exactly that. It reads all the AOD columns or the Ångström exponent columns, then matches the proper column based on the user input: depending on whether your feature choice is AOD or Ångström, and which AOD channel you chose, it parses through the file and matches your input with the proper column. At the end it isolates the DataFrame down to just the site name, site latitude, site longitude, and the AOD or Ångström exponent column. After that, there are two important steps: drop all undefined values and reset the index. Resetting the index matters because, remember, we grouped by AERONET site and date for daily averages; to do further manipulation we have to reset the index so the site and date come back into our data set as regular columns.
So now we have a data set that's ready for plotting. Here we make use of all of those plotting packages, Cartopy and Matplotlib, and produce a plot for each day; a minimal sketch of this kind of map is shown below. The code generates a plot for the first day, July 25th, and keeps loading until it reaches the last date, August 2nd.
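Here is a minimal sketch of one of these daily maps: AOD values drawn as colored circle markers over a Plate Carrée map with coastlines and borders. The two-row DataFrame is a stand-in for one day of averaged data.

```python
# Minimal sketch: one day's AOD values as circle markers on a Plate Carree map.
import cartopy.crs as ccrs
import cartopy.feature as cfeature
import matplotlib.pyplot as plt
import pandas as pd

day_df = pd.DataFrame({  # stand-in for one day's site averages
    "lat": [39.0, 45.5],
    "lon": [-76.8, -111.0],
    "aod": [0.12, 0.85],
})

fig = plt.figure(figsize=(10, 5))
ax = fig.add_subplot(projection=ccrs.PlateCarree())
ax.set_extent([-130, -60, 24, 50], crs=ccrs.PlateCarree())
ax.coastlines()
ax.add_feature(cfeature.BORDERS)
ax.add_feature(cfeature.STATES, linewidth=0.3)

sc = ax.scatter(day_df["lon"], day_df["lat"], c=day_df["aod"],
                cmap="RdYlGn_r", vmin=0.0, vmax=1.0, s=60,
                edgecolor="black", transform=ccrs.PlateCarree())
fig.colorbar(sc, ax=ax, extend="both", label="AOD 500 nm")
ax.set_title("2024-07-25 (illustrative)")
plt.show()
```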
For July 25th you can see some high AOD values in the Midwest, and some forest fires on the western side of the US around Idaho and Montana, and some more for that day. Here, on the 28th, they're spreading to the northern Great Plains, the same for the 29th and 30th, and here's where they're somewhat subdued and have moved a little eastward; you can see some higher AOD values around the Great Lakes region, and by now the AOD values are pretty much stable. That was the Plate Carrée projection, and a similar thing can be done in the orthographic projection, which some people may prefer because it's like looking at a globe instead of a flat map. The computational expense is a little greater for this type of projection, so the plots take a little longer to produce, but it still shouldn't take that much time. Let me zoom out a little so everyone can see: we have the 27th, 28th, 29th, 30th, 31st. After the code finishes running, everyone has the opportunity to decide whether they want to download all the files that were produced as a zip file or not. I'll say yes, so it compresses them; I could have chosen a more creative name, but feel free to modify the code and put in a different name for the output. I'll just put it in my Downloads folder, and there it is. Okay, so that's the end of the Read and Map AERONET code. Now
we'll go back and launch a different one: let's try the more interactive code for reading and mapping AERONET data. The only difference between this interactive code and the previous code is that this one makes use of widgets from Python's ipywidgets library, as well as Matplotlib's animation package. It's the same procedure as always: we define the same input parameters, produce the link the exact same way, and process the data the exact same way, along with the Ångström exponent and AOD selections. The only difference is in the plotting at the end: we get this map, which is one plot but works like an image carousel, where you move a slider; let's put it on the 30th. There's a few-second delay depending on how big the data is, but it goes directly to the specified date, and similarly this is done for the orthographic projection. And if you define an hourly average, it actually produces two sliders, one for date and one for hour, but for demonstration purposes we're only going to stick to date. So our plot has been produced, and we can select, say, the 28th, and here's the plot.

The second piece of interactive widgetry I decided to use here is an animation, similar to a GIF except there are some control buttons such as pause and play; we can run it through once, or loop it, and we can also adjust the speed of how fast the images switch, or go in reverse; I think "reflect" is where it goes in reverse. The same thing works for the orthographic projection. I did not put anything in the script to save these; I think this interactive code is better just for displaying and viewing the data, which is why there's no save step, but otherwise it works the same way. A minimal sketch of the slider idea follows below.
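Here is a minimal sketch of the slider-driven "image carousel" idea using ipywidgets; the real notebook redraws a Cartopy map for the selected date, while this placeholder just demonstrates the widget wiring.

```python
# Minimal sketch: a date slider that redraws a plot when moved (Colab/Jupyter).
import ipywidgets as widgets
import matplotlib.pyplot as plt
from ipywidgets import interact

dates = ["2024-07-25", "2024-07-26", "2024-07-27", "2024-07-28"]

def plot_for_date(date):
    # The real code would redraw the AOD map for `date`;
    # here we just confirm which date was selected.
    plt.figure(figsize=(4, 1))
    plt.text(0.1, 0.5, f"map for {date}")
    plt.axis("off")
    plt.show()

interact(plot_for_date,
         date=widgets.SelectionSlider(options=dates, description="Date"))
```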
Okay, moving on to one of the more recent developments: this script reads and maps inversion products. This is no longer AOD; it plots inversion products like single scattering albedo, AOD coincident inputs, or extinction and absorption, etc. We can select our inversion type, whether almucantar or hybrid, and we can again specify our wavelength of interest. Similarly to before, there's some data validation, the conversion of the time stamps into individual year, month, and day components, constructing the hyperlink, and then reading it through BeautifulSoup. Here I've combined all the data processing and plotting into one big cell, so it should start plotting. Because I combined everything in one cell, this particular script takes a few minutes to run, maybe between three and four, but at the end it produces everything for you. We have a map for each hourly bin; I guess that's why it took a little longer this time, because instead of daily averages I did hourly averages for each day. And there it is: this plots the single scattering albedo at 440 nanometers for each hour, and I think it eventually also does it for the orthographic projection, but you can just take one plot tab and look at it in detail. So this script runs in a very similar fashion to the original Read and Map AERONET
aerosol data code, except there's some more data validation and I've combined all the cells together. The next script we're going to run is Read and Map AERONET Time Series, so we can close out of that and go back to our Google Colab. Okay, now that everybody has that open: the purpose of this script is to read AERONET data, again from the AERONET web API, and plot time series graphs to visualize changes in AOD or Ångström exponent parameters, except this time it doesn't do it for a whole geographical domain; it does it for a specific AERONET site. Again we can specify the site name, average type, data level, wavelength, and time frame, and it produces three plots: a standard time series graph; a calendar/tile plot; and, at the end, an annual variability plot with standard deviations superimposed on the graph as
a shaded region. We use pretty much the same packages, except that calplot is another package we have to install; that one's different from the previous script, while all the others come pre-installed. We have math for some more complex equations, and some other Matplotlib components like mdates, which converts datetimes to numeric values, and then again we have to mount our Drive onto the Colab notebook, so just click Continue. The only difference in the input parameters is that we also have to specify a site name; the AERONET web API lets you pass the site as a parameter, and it generates a link with data just for that specific site. In this case we choose NASA Goddard, GSFC, and we're doing it for a period of 10 years. For averages we can choose either daily or monthly; daily averages are preferable here, because you're also going to get the calendar plots and the tile plots, and unfortunately you cannot produce a calendar
plot with monthly averages, because the purpose of a calendar plot is to show the value for each day. So now we read our URL, with the same data validation as before, the same data processing, and the same feature selection. At the end, for our specific site, notice that the site name column is no longer there, because it's now redundant: we're doing this for one specific site. We have the date, the day of the year, and the AOD measurement. The day of the year is important to have here because the date is an object, not an integer value, so it's difficult to manipulate when we're doing computations. The first plot is a time series graph; depending on the average type, it shows either monthly averages or daily averages. It's a little cluttered here because, again, it's a daily average, so there's a value for each date, and this is over the course of 10 years, but we can spot some seasonality in some of the peaks, and obviously here we know this is the period
when there were a lot of wildfires in northeast Canada; that looks like around the summer of 2023. And this is where we produce our tile map. The tile map, let me zoom out a little bit, resembles a grid of tiles: the months are on the x axis and the years are on the y axis. You can actually specify a monthly average, except then the month of January, instead of having about 30 values, will have one value, so it just looks like a less cluttered tile plot. You can spot specific days where the AOD is higher. And the calendar plot is, again, for daily averages only; it won't be produced if the average type isn't daily. We wait for it to render, and it looks like it's done. I can't zoom out far enough to see the whole thing because it's a very large frame, but here it is, site GSFC, and not only does it show the month and the year, it also shows the day of the week. A minimal sketch of this kind of calendar plot follows below.
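Here is a minimal sketch of this kind of calendar heat map using the calplot package (pip install calplot), with random values standing in for daily AOD; the notebook's actual calendar code may be built differently.

```python
# Minimal sketch: a calendar heat map from a daily pandas Series with calplot.
import calplot
import numpy as np
import pandas as pd

days = pd.date_range("2023-01-01", "2023-12-31")
daily_aod = pd.Series(np.random.rand(len(days)) * 0.5, index=days)  # stand-in values

calplot.calplot(daily_aod, cmap="RdYlGn_r",
                suptitle="Daily AOD, GSFC (illustrative values)")
```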
Back to the notebook's calendar plot: it gives us a clearer view, and we can spot particular days where the AOD is really high. Again, for the month of June, the wildfire season, we can see some of those particularly high-AOD days. And finally there's the annual variability plot, which again can only be shown for daily averages. What it does is aggregate the data and show, for each year, what the average AOD was at that site, and it creates a linear regression line along with a standard deviation band, so we can see the long-term trend of AOD. For the GSFC site the average has been pretty consistent, but if we generate plots for other places, where governments have recently implemented measures to reduce pollution levels, you can see that the trends have gone down over the years. This last part again prompts the user to save the plots to a zip folder; in this case I won't. A minimal sketch of the annual-variability idea follows below.
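Here is a minimal sketch of that annual-variability idea: yearly mean AOD with a plus/minus one-standard-deviation band and a least-squares trend line, with random values standing in for a real daily record.

```python
# Minimal sketch: annual mean AOD, +/- 1 std-dev band, and a linear trend.
import matplotlib.pyplot as plt
import numpy as np
import pandas as pd

days = pd.date_range("2014-01-01", "2023-12-31")
daily = pd.Series(np.random.rand(len(days)) * 0.4, index=days)  # stand-in record

annual_mean = daily.groupby(daily.index.year).mean()
annual_std = daily.groupby(daily.index.year).std()
years = annual_mean.index.values

slope, intercept = np.polyfit(years, annual_mean.values, 1)  # linear regression

plt.plot(years, annual_mean, marker="o", label="annual mean AOD")
plt.fill_between(years, annual_mean - annual_std, annual_mean + annual_std,
                 alpha=0.3, label="+/- 1 std dev")
plt.plot(years, slope * years + intercept, "--", label="trend")
plt.xlabel("Year")
plt.ylabel("AOD")
plt.legend()
plt.show()
```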
And that's it for the time series script. Now we can transition to our final script, the one that collocates AERONET data with VIIRS data; let's wait a little bit for everyone to open this one up. Okay, this code is probably the most involved in terms of what the user has to do. First we want to install some of the libraries here, and while that runs I can explain what this code does, after mounting the Google Drive (click Continue). So: this code collocates AERONET data with VIIRS data. It includes some functions for spatial averaging and statistical comparison; it retrieves and processes VIIRS aerosol data from NASA's LAADS data query, and it also retrieves and processes AERONET aerosol data in a similar manner, using the web-scraping capabilities of the web API. But here's the catch: VIIRS uses the 550 nm channel, which AERONET does not have. For a proper comparison we do some interpolation using the 440, 500, and 675 nm channels to infer what the AERONET AOD measurement would be at the 550 nanometer channel; I think we use a cubic spline interpolation, yes, it's a cubic spline, correct. After the collocation is done, we produce two outputs: one is a spreadsheet with the collocated data, and the other is a collocation plot. We use a 5×5 pixel grid to obtain the spatial averages, and this is the statistical function where we compare the actual and the predicted values and obtain some metrics such as the bias, the root mean square error, and the percent error. A minimal sketch of the interpolation and the metrics follows below.
then the predicted values and when we we obtain some uh values some metrics such as the bias uh root mean square error and the percent Error so uh the setting up pass and directory uh this script um what this script does is um so in our case we made it a little a little easier we have put our query file our last query file into our um GitHub so now we can just read the file directly from the GitHub and uh what that file contains is a URL a link to every all the net CDF
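As a hedged sketch of the two pieces just described — the 550 nm interpolation and the comparison metrics — the helpers might look like this; the names are illustrative, and note that AOD spectral interpolation is often done in log-log space, which is omitted here for brevity:

```python
import numpy as np
from scipy.interpolate import CubicSpline

def aeronet_aod_550(aod_440, aod_500, aod_675):
    """Estimate AOD at 550 nm from the 440/500/675 nm channels (sketch).

    With three points SciPy's not-a-knot spline reduces to a smooth
    quadratic through the channels; the production code may differ.
    """
    wavelengths = np.array([440.0, 500.0, 675.0])
    aods = np.array([aod_440, aod_500, aod_675])
    return float(CubicSpline(wavelengths, aods)(550.0))

def comparison_metrics(actual, predicted):
    """Bias, RMSE, and mean percent error between collocated values."""
    actual = np.asarray(actual, dtype=float)      # assumed nonzero for percent error
    predicted = np.asarray(predicted, dtype=float)
    bias = np.mean(predicted - actual)
    rmse = np.sqrt(np.mean((predicted - actual) ** 2))
    pct = 100.0 * np.mean(np.abs(predicted - actual) / actual)
    return bias, rmse, pct
```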
In this case we chose July 15 through July 31 and focused on the region of South and Southeast Asia, so those are all the AOD 550 nm measurements. The script then constructs the file names and sets up some of the working directories, and this URL is the one those measurements are obtained from. Running this code is more involved, as it requires Earthdata credentials and the ability to query the NASA LAADS archive. On the Earthdata website, the user will need to generate a token which serves as authentication to download and access the data. That token is unique to each user and expires every 60 days, and it is specified right here, after the Bearer attribute. Once logged in to Earthdata, the token can be generated by clicking the Generate Token menu, which Pawan went over earlier, and we inserted it here in cell six.
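A minimal sketch of the authenticated download step is shown below; LAADS accepts the Earthdata token in an HTTP Authorization: Bearer header, and the file names and paths here are placeholders:

```python
import os
import requests

TOKEN = "PASTE_YOUR_EARTHDATA_TOKEN_HERE"       # expires every 60 days
headers = {"Authorization": f"Bearer {TOKEN}"}  # LAADS token authentication
out_dir = "viirs_downloads"                     # placeholder output folder
os.makedirs(out_dir, exist_ok=True)

# One netCDF URL per line, as in the query file kept on the GitHub.
with open("laads_query_urls.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

for url in urls:
    fname = os.path.join(out_dir, url.split("/")[-1])
    r = requests.get(url, headers=headers, timeout=120)
    r.raise_for_status()                        # fail loudly on a bad token or URL
    with open(fname, "wb") as out:
        out.write(r.content)
    print("saved", fname)
```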
In this cell we download and process the VIIRS data: the dataset is read, we calculate the moving averages, we obtain the latitude and longitude, we format the date and time, and at the end we concatenate all the datasets together, because this produces one dataset per hour (I think it is per hour, correct), so depending on the length of the daylight there could be anywhere between eight and fourteen files per day. Each of these netCDF files is saved in your Google Drive: under My Drive we can see our ARSET workshop folder and the LAADS query folder, and all those VIIRS files are indeed saved for that time frame. So now we are done with downloading and processing the VIIRS data, and the next step is to download and process the AERONET data in a very similar fashion. We can choose the minimum and maximum AOD; here we only chose standard AOD between zero and one, nothing too high, and we specify our coordinates in a similar way.
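Reading and combining the per-granule files could look roughly like this sketch, assuming the granules share compatible dimensions; the variable and folder names are placeholders:

```python
import glob
import xarray as xr

# Open each hourly granule and combine along a new dimension (sketch).
# Real granules must share compatible coordinates for this concat to work,
# and the folder name is a placeholder.
paths = sorted(glob.glob("viirs_downloads/*.nc"))
granules = [xr.open_dataset(p) for p in paths]
combined = xr.concat(granules, dim="granule")
print(combined)
```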
One thing: make sure that the bounding coordinates you use in your LAADS query match the coordinates you use when querying the AERONET data, because otherwise the code will crash at the end, since there will not be any data to collocate. After processing the AERONET data, this is where the collocation happens: we use the cubic spline to interpolate the 550 nm value based on the three channels, 440, 500, and 675 nm. This section of the code takes the longest time, because it uses arithmetic that is not complex but is computationally expensive. After this code is done, it saves the collocated data as a CSV file, which will also be found in your working directory and can be downloaded from your Google Drive. And if you notice, you do not have to specify a date for the AERONET query. Why not? Because it takes the dates produced from the VIIRS dataset and assigns them to a list called date time, and that list is then fed into a loop: each index of the date time list is used to build the URL and download the AERONET data for that date. So in this case we are not producing one URL for AERONET; we are calling it multiple times, as many times as there are unique date times in the VIIRS dataset. Once that spreadsheet is produced, the final section of the code is plotting.
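That per-date loop might be sketched like this, where viirs_df and build_aeronet_url are hypothetical stand-ins for the script's actual dataset and URL template:

```python
import pandas as pd

# Hypothetical: viirs_df is the combined VIIRS dataset as a DataFrame and
# build_aeronet_url(date) fills in a web-service URL template for one date.
date_time = sorted(pd.to_datetime(viirs_df["time"]).dt.date.unique())

frames = []
for d in date_time:                    # one AERONET request per unique VIIRS date
    url = build_aeronet_url(d)
    frames.append(pd.read_csv(url, skiprows=6))  # header offset varies by product
aeronet_df = pd.concat(frames, ignore_index=True)
```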
To plot it, we isolate the necessary columns, filter the values based on AOD, and produce the following visual. Let me zoom out: we have the measured VIIRS AOD at 550 nm and the interpolated AERONET AOD at 550 nm. The solid line represents the line of best fit, and the dashed line represents what the line would be, theoretically, if there were perfect agreement between the AERONET and VIIRS data. Our R is 0.851, so R squared is roughly 0.72, which means the collocation is fairly reliable. The sample count is 93, which means there are 93 sites this collocation has been done for. More details on how to run the script — how to set up your account, how to query LAADS, how to authorize the LAADS applications on your account, and how to generate a token — can be found in the VIIRS–AERONET comparison README file, which is once again on the AERONET GitHub.
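A sketch of that collocation plot, assuming x and y are the collocated AERONET and VIIRS AOD arrays:

```python
import numpy as np
import matplotlib.pyplot as plt

# x = interpolated AERONET AOD at 550 nm, y = VIIRS AOD at 550 nm (assumed arrays).
slope, intercept = np.polyfit(x, y, 1)
r = np.corrcoef(x, y)[0, 1]

plt.scatter(x, y, s=15, alpha=0.6)
lims = [0.0, float(max(x.max(), y.max()))]
plt.plot(lims, np.polyval([slope, intercept], lims), "b-",
         label=f"best fit (R = {r:.3f})")
plt.plot(lims, lims, "k--", label="1:1 line")
plt.xlabel("AERONET AOD 550 nm (interpolated)")
plt.ylabel("VIIRS AOD 550 nm")
plt.legend()
plt.show()
```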
So, thank you everybody for running these scripts with me; I hope you enjoyed them and I hope they were not too complicated. And now back over to Pawan for the closing remarks. Thank you very much. — Okay, great. Thank you, Peter, very much for an excellent overview of the Python codes. Again, if anybody has trouble running those codes, please contact Peter; all the links and his contact details are on the GitHub and in the individual codes, and we will be happy to help if anybody
has trouble accessing or running them. Quickly, I want to summarize all the things we have covered today. Again, AERONET is a network of ground-based sun photometers. It makes aerosol measurements at around 600 active stations all around the world; these are not surface measurements but total column measurements of the atmosphere. The data have been around for the last 30 years and are extensively used in air quality, air pollution, and climate change research, and to validate satellite and model outputs. In this particular part two, we have gone through different parts of the AERONET website, including the interactive site maps, data visualization in many different forms, and, most importantly, data download, which is free for everyone to access in a simple-to-use ASCII format. We also have the web API for power users, or people who want to automate and display the data in their own systems; the API does that very cleverly and easily for everyone. And then Peter covered some of the Python tools available through the AERONET GitHub page to download and analyze the data along with satellite data for more quantitative analysis. Finally, as I mentioned in part one, AERONET is a collaborative effort: the international community contributes to all the AERONET data we are distributing. So I really want to acknowledge every single person directly or indirectly involved in AERONET, the site managers and PIs from all over the world, and the various teams from partner networks, including the NASA Goddard team, who work very hard to make this data available for the community. I think that's all from me; I thank the ARSET team for organizing this, and I want to hand back over to Carl to continue from here. — All right, thank you very much, Pawan, and thank you, Peter, for that overview of how to access AERONET data. Looking forward to the next part of this training, where we'll be learning about Pandora, another ground-based passive remote sensor that provides information complementary to the AERONET network on trace gases like ozone and nitrogen dioxide in the atmosphere. Just as a reminder, the homework for this training will be posted at the end of our last session, which will be August 22nd, and you'll have about two weeks to complete the homework assignment before the deadline. Here's our contact information if you want to reach us; there's also an AERONET mailing list you can register for if you're interested in using AERONET data, and here's a summary of some of the links we showed you to the different resources. So thank you very much for your attention today, and we'll now move into the question and answer session.
And because we're over our scheduled time, we're going to go through some of the high-priority questions that are really focused on the tools and the data sets we talked about in the training. We'll cover all of the questions, and the answers will be posted to the website after the training, but because of time we'll just focus on a few here. So first we'll start with question six: in the AERONET downloading tool, what is the meaning of the different levels, and what are the implications of using the different AOD levels? — Thanks, Carl. I think we discussed this in part one, but it's nice to refresh it. We have three levels of data: level 1.0, level 1.5, and level 2.0. Level 2.0 is the highest quality of data, produced after the post-field calibration, and we recommend it for scientific research analysis. Level 1.0 is basically raw data without any cloud clearing, so we don't recommend that data unless you're looking for some other piece of information. Level 1.5 data are very similar to level 2.0 data in many ways, except they do not have the post-field calibration, so for many applications, people who need the data in near-real time can use level 1.5; but for quantitative scientific research analysis we always recommend using level 2.0 data. — Okay, great, thank you. I think the next question we prioritized was question 15.
So, on the tools, and specifically I think they were referring to the Data Explorer map tool showing the near-real-time data on a map: what is the time being used in the time selection, GMT or local time? — Yes, so the way the tool works is that, first of all, all the times in all our data sets are expressed in GMT, or UTC, because we want to make sure they are consistent globally. You can always convert that time into your local time, using either standard time or solar time. Now on the map tool, if you're displaying the near-real-time data, it reads the local time from your computer and gathers the data from the past one to two hours, then displays it. So although it is reading your local time from your computer, the data is still displayed in GMT; it is just gathering the local time information to make sure that you get the data closest in time to what you're looking for. — Okay, great, thank you. The next question related to tools was 18: how do we adapt, or port, the Google Colab example notebooks to JupyterHub, and what kind of modifications might be needed? — I don't think you need to modify anything. As long as you have all the dependent packages, you can download the files directly, either as .py or .ipynb, and you should be able to upload them to any JupyterHub, either on your local computer or an online hub.
So I don't think there are any changes you need to make. The only thing you will have to be careful about is that the code often defines directories and file paths, where you're reading data or saving output, and those paths have to be changed depending on where you're running. Currently those paths are defined assuming that everybody is running on Google Colab, but if you're running on some other hub you will have to adjust those paths to your local settings.
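One common pattern for making such notebooks portable is sketched below; the directory names are examples, not the notebooks' actual paths:

```python
import os

# Pick a base directory depending on where the notebook is running.
try:
    from google.colab import drive  # only importable inside Colab
    drive.mount("/content/drive")
    BASE_DIR = "/content/drive/MyDrive/aeronet_workshop"
except ImportError:                 # local machine or JupyterHub
    BASE_DIR = os.path.expanduser("~/aeronet_workshop")

os.makedirs(BASE_DIR, exist_ok=True)
print("Reading and writing data under:", BASE_DIR)
```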
adjust those path according to your local settings okay thank you um so the next question is how do how are missing data or or not a number data values handled um both here specifically and maybe I'll generalize the question to uh if you're taking a long-term average of aod for example uh what fraction of missing dat are there recommendations about what Fraction of missing data is acceptable versus what should be fil out yeah so call that is a very philosophical question in many ways uh yeah because it really depends on your application uh the tolerance
level of missing data right uh if I'm looking for a day-to-day change within a month uh then I would like to have data on every single day uh if I'm looking for dinal cycle then I would like to have data almost every hours or even finer time Scale if I'm looking for long-term trends um then I think having majority of the months of the data for a given year is recommended because then you don't miss certain types of aerosol Optical depth values like let's say you're looking over uh Green Belt Maryland in go where the
Gard is located and if you're looking for long-term Trend and if you're consistently missing aerosol Optical depth values during months of June July where the values are highest then your Trend will be actually biased towards the low aod values so I think that missing how much missing can be there uh or what is recommendation I would say more the data better you have the chance of getting accurate results or Trends or ready forcing calculations or how the changes are happening in the atmosphere uh uh Miss more the missing data you have more uncertainities in your
analysis and Your outcomes uh will be more uncertain in that sense and like I said in the beginning it really depends on specific application so you you will have to make that judgment yourself yeah okay um the next one is maybe another philosophical question but so if you're looking at uh an aod time series this is question 22 um and you see an unexpected single day Spike which you can't easily explain by knowing for example that there was a wildfire nearby On that day what would be some of the next steps or the next tools
Yeah, I think that's again a great question. There can be many things happening. If you see a sudden spike, first of all you want to see how long that spike is sustained. If it is just one single measurement with nothing before or after, it could be cloud contamination, one of the usual suspects. It could also be a very fine air mass, a small part of an air mass passing over the location, which is probably not explained by a big event; it could be a small dust gust or something that passed by. Sometimes, and we have seen this, people are smoking next to the AERONET station for a few minutes, which can easily create a spike. From the data perspective, AERONET provides the data at many different wavelengths, so if you're seeing a spike at one wavelength, I would recommend checking the other wavelengths: is it consistent across them or not? That can give you an indicator of whether it is real or an artifact. And I think the main suspect would be cloud contamination. Typically, AERONET data are very well cloud-masked, except for thin cirrus; if there is thin cirrus, we may sometimes miss it. Although we clear those very cleverly as well, we have noticed sometimes that we may have some cirrus contamination in our data.
Okay, great. So question 23 is: is it necessary to modify the URL to download the lunar, or nighttime, AERONET data? In other words, are the nighttime data accessible through the same API as the regular daytime data? — Yes. If you look, all the APIs are very similar in nature, and there are different flags, or keywords, that you can define in the API; I think there is a keyword which specifies whether you want daytime or nighttime data. We do have a separate API specifically designed to get the nighttime data, but I think there is also a way to get both day and nighttime data using the same API call. I don't remember the specifics at this moment, but I'm sure the examples on the API page show you how to get both together.
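For reference, a daytime call to the v3 web service follows the pattern sketched below; the parameters shown are from the public API help page, while the exact keyword for the lunar retrievals should be checked against the examples mentioned above:

```python
import pandas as pd

# Daytime example: GSFC site, June 2023, level 1.5 AOD, daily averages,
# plain-text output. Lunar/nighttime retrievals use a closely related
# endpoint/keyword -- check the API examples page for the exact form.
url = ("https://aeronet.gsfc.nasa.gov/cgi-bin/print_web_data_v3"
       "?site=GSFC&year=2023&month=6&day=1"
       "&year2=2023&month2=6&day2=30"
       "&AOD15=1&AVG=20&if_no_html=1")
df = pd.read_csv(url, skiprows=6)  # number of header lines varies by product
print(df.head())
```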
Okay, great. Those were the main questions most relevant to what we covered today. Again, we'll go back and post this to the training web page in the next week or so, once we've gone through and written down answers to all the other questions as well. Unless, Pawan, you want to revisit one of the questions now? — Sure, let me make a generic note here, since I read through the questions and it covers several of them. A lot of people have asked about whether
they are interested in a new site, or they want to revive an existing site which is not working. Those are very logistical types of questions, and I would recommend you please email me about those separately, specifying which site you are asking about; my email address is there. Each site is unique and has its own story, and depending on what you're looking for, we can coordinate, work with you, and see what the best solution is we can provide. That's one thing, and it applies to new sites also: if you think your region doesn't have any AERONET site and you or your institution is willing to support one, please contact us and we will work with you to explore that possibility. Like I said in the presentation, it's just not possible for us to fulfill that globally, because it involves a commitment of a lot of resources, but we are open to suggestions and willing to work with anyone who wants to establish sites in any part of the world.
It will just take a discussion at different levels, so please feel free to email me about any new site request. — Okay, that's a good point. All right, we're a little bit over our allotted time, but we did have a lot of nice questions, so we wanted to answer at least some of them live. Again, everything else will be answered in the document that will be posted to the training website in about a week, after we've had a chance to go through and finalize all the answers. So thank you, everyone, for attending, and thank you very much to Pawan and Peter for presenting today. We look forward to seeing you two days from now for part three of the training session. — Thanks, Carl, and thanks everyone for attending; this has been great.