Data "at risk" has been an issue ever since the internet was created, and with every change in administration. This transition has been especially alarming when it comes to environmental data, hence the birth of this DataRescue project.
Today's workshop led by Joshua Dull and Kayleigh Bohemier
Not just saving it - saving it in a way that can be used!
The app limits queries to 250 results at a time. Iterating through the states won't work because California alone has over 250 government employees. A better approach may be to iterate over a list of all the zip codes in the USA and query each one, flagging any zip code that returns exactly 250 results as questionable, since that probably means the result cap was hit and records were dropped.
Note that the full list of zip codes is updated monthly, and you have to buy a product from the USPS to get the most current codes: https://ribbs.usps.gov/index.cfm?page=address_info_systems. There are over 43,000 zip codes!
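The zip-code strategy above can be sketched as follows. `query_by_zip` is a hypothetical helper standing in for whatever wraps the app's search endpoint; only the flagging logic is the point here:

```python
# Sketch of the zip-code iteration strategy. `query_by_zip` is a
# hypothetical callable that hits the app's search endpoint and returns
# a list of employee records for one zip code.

PAGE_LIMIT = 250  # the app caps every query at 250 results


def collect_employees(zip_codes, query_by_zip):
    """Query each zip code, flagging any that hit the 250-result cap."""
    records = []
    suspect_zips = []  # exactly 250 results: the query was likely truncated
    for z in zip_codes:
        results = query_by_zip(z)
        if len(results) == PAGE_LIMIT:
            suspect_zips.append(z)  # needs a finer-grained follow-up query
        records.extend(results)
    return records, suspect_zips
```

Suspect zip codes could then be re-queried with an additional filter (e.g. by agency) to recover the records that fell past the cap.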
Another gotcha: the records returned on the HTML page don't all carry the same metadata (e.g. Anderson, Thomas has a job title, whereas other people don't), so the content parser will have to be flexible.
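A tolerant parser along these lines should handle the uneven metadata; the field names below are assumptions for illustration, not the site's actual schema:

```python
# Sketch of a tolerant record parser: not every scraped record carries
# the same fields (some have a job title, others don't), so every field
# lookup must tolerate absence. Field names are assumptions.

def parse_record(raw):
    """Normalize one scraped record, defaulting missing fields to None."""
    return {
        "name": raw.get("name"),
        "job_title": raw.get("job_title"),  # present for some people only
        "agency": raw.get("agency"),
    }
```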
In your web browser, open the network inspector tab. All of the visualizations are powered by POST form requests to the http://data.rcc-acis.org/GridData endpoint.
Data on the map appears to be returned in a 24-day rolling window. The JSON fields are not named; they are just given integer keys.
It appears that if you widen the rolling window inside params, you can get more than 24 days of data at once. For example, use the body below to get over a year of data from the Bath station. Beware that such a request takes quite a long time to return, so asking for one month at a time may be better.
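A minimal sketch of querying the endpoint directly, assuming the params body follows ACIS web-service conventions (sdate/edate/elems/grid/loc); the exact body, coordinates, and element names should be copied from the request seen in the network inspector, since the values below are placeholders:

```python
import json
import urllib.request

# Sketch of POSTing to the GridData endpoint with a wider date window.
# All field values are placeholders; copy the real body from the
# browser's network inspector.

ENDPOINT = "http://data.rcc-acis.org/GridData"


def build_payload(start, end):
    """Assemble a params body covering [start, end] instead of 24 days."""
    return {
        "loc": "-77.0,42.0",          # placeholder lon,lat near the station
        "grid": "1",                  # grid id as seen in the inspector
        "sdate": start,
        "edate": end,
        "elems": [{"name": "pcpn"}],  # element name is an assumption
    }


def fetch(start, end):
    """POST the form-encoded params= body, as the site's own requests do."""
    body = ("params=" + json.dumps(build_payload(start, end))).encode()
    req = urllib.request.Request(ENDPOINT, data=body)
    with urllib.request.urlopen(req) as resp:  # a 1-year window is slow
        return json.load(resp)
```

Chunking the range into month-long windows and calling `fetch` once per month keeps each request fast while still covering the full period.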
The PLANTS database also has extensive imagery that is important for plant identification. There are 50,000+ images that link to the Plant Profiles for each species. It seems as though the images and individual plant species profiles are only accessible through the Search UI.
There is an interactive map on each plant profile page; this data is being served from an ArcGIS Server REST API. If the REST API URL can be determined, it may be possible to access it directly.
The interactive map is unfortunately just a zoomable picture: it's not a D3 visualization with underlying JavaScript data structures from which the per-state values can be extracted. The URLs for each plant follow this structure; just replace "ACARO2" with the value found by searching the page for an element matching the CSS selector '.input#vSymbol'. Example provided below.
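Extracting the symbol and building the per-plant URL might look like this. The `#vSymbol` lookup comes from the notes above; the URL template is a hypothetical placeholder, so substitute the real structure from the example:

```python
import re

# Sketch of pulling the plant symbol out of a profile page so the
# per-plant URL can be constructed. URL_TEMPLATE is a hypothetical
# placeholder, not the real structure.

URL_TEMPLATE = "https://plants.usda.gov/.../{symbol}"  # placeholder


def extract_symbol(html):
    """Return the value of the #vSymbol input, or None if absent."""
    m = re.search(r'id="vSymbol"[^>]*value="([^"]+)"', html)
    return m.group(1) if m else None


def profile_url(html):
    """Build the per-plant URL from a scraped profile page."""
    sym = extract_symbol(html)
    return URL_TEMPLATE.format(symbol=sym) if sym else None
```

A real scraper would want an HTML parser (e.g. BeautifulSoup with the selector from the notes) rather than a regex, but the regex keeps this sketch dependency-free.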