Converting NWS CLI data to a .CSV file

Hi everyone, I found this forum through a Google search and I'm hoping someone can help me. I'm clueless about coding, so bear with me.

I need to write a script/program to convert the snowfall data to a .CSV file. But I guess it doesn't end there.

I'm looking to grab snowfall totals and departures from the NWS CLI data for all climo sites reporting it and enter them into Excel. Just those numbers.

I did this manually for the climo sites with the CLI reports. I'm looking for a way to get them into Excel automatically.

Here's the manual version so you have an idea what data I'm talking about.

Thanks so much!

---------- Post updated at 10:24 PM ---------- Previous update was at 10:23 PM ----------

I can't post links until I have 5 posts.

I don't know where to find the raw data from NWS; what I use is the climate link for each NWS office, like this one in Buffalo (just add www & eliminate the space):
nws.noaa .gov/climate/index.php?wfo=buf

Just click Go and you'll see the snowfall data.

Or use this link to see all the NWS offices listed on the same page. Maybe that might help? nws.noaa .gov/view/validProds.php?prod=CLI

Not clear. Where does your automated process start? What should the end result be (yes, a .csv file, but with what structure)? And on which system (yes, Excel, but running where)?

Thanks for the reply. Those are the things I'm not sure about. I know what I need, but not how to set it up or what needs to be done. If I can just get them into columns, that would be enough. Structure like the image I posted. Is that what you mean?

As far as running where, are you asking what web page I'm trying to set up? I don't have one; that's why I figured on just getting the data into Excel for now.

I noticed you can refresh data from the web using a link in Excel (just learned this), but each link requires a new tab, I believe. I was hoping to get all the data on one page.

I don't know, maybe I'm making this too complex. Let me sip some more coffee and see if I can explain it better.

Yes, it's still difficult for me to understand. Let's try this: please post a sample of the input and the output you are looking for, then we can take it from there.

Thanks for replying; give me a few minutes, working on a post now...

---------- Post updated at 07:49 AM ---------- Previous update was at 07:43 AM ----------

OK, so I exported the CLI data from this link into Excel.

Then I used "text to columns" to separate the data.

Then I used a formula you can see at top right that grabs the specific snow total data I'm looking for.

I'm hoping that when I refresh, the new data gets put into the same cells and my formula updates (it should).

---------- Post updated at 07:52 AM ---------- Previous update was at 07:49 AM ----------

This is all great and easy for me to look at and work with; however, how do I get this onto the web and into a better graphic table? I assume I need a webpage first, right?

Here is an example of what I would love to do... This guy grabbed all the data and had it available like this on his site (which has since disappeared because he got sick).

Not sure if this is off topic; I thought it had to do with getting a script to convert to a CSV file. No idea.

Let me try to understand.

First, the link: it looks like a gov link. Please make sure that it is not confidential in view of any terms and conditions / contract; if it is confidential, posting it in public forums might be problematic for you.

Please correct me if I am wrong.

Are you trying to extract/convert data from Excel into a txt file?
Are you trying to programmatically fetch the data from the link and transform it to some other format?

Please scale down the input and output, then you can scale up as needed!

---------- Post updated at 06:28 PM ---------- Previous update was at 06:28 PM ----------

Ah! That's airport data; again, make sure it's OK to post these data in public forums.


Thanks. Yes, that's publicly available data, no problems there. I think the answer to both of those questions is yes. But isn't converting to a text file as simple as copy-and-paste, values only?

Yes, I want to fetch the data from the link and have it available on a webpage, but not as a text file, because I need it to update automatically. I'm sorry if this is all confusing. Like I said, I have in my mind what I want; I just don't know how to explain it better or where to start.

Just to confirm again, those links are fine. It's public data; nothing hidden about weather for climo sites. Here's IEM, which has a similar setup to what I'm looking to do. Notice they have the locations and all the data in columns next to them, which refresh with each update.

---------- Post updated at 08:10 AM ---------- Previous update was at 08:09 AM ----------

I see some threads that might be useful. I'll try to check them out later on to see if I can get more info which would be helpful for you guys.

---------- Post updated at 08:13 AM ---------- Previous update was at 08:10 AM ----------

Here's an image from the IEM site I was talking about. See how everything is in that one table on that page, using up-to-date info?

---------- Post updated at 08:15 AM ---------- Previous update was at 08:13 AM ----------

That's what I would like done and I believe I have to convert the script.

If you have authorized access, there may be web services with properly defined API contracts, which would make accessing the data easier. Otherwise you will have to take the other route: using Perl with LWP to access the page, scrape the contents, and restructure the data.

Perl and LWP

If you mean crawling and scraping the data from one source and providing a modified view in another, you need to fetch the data and restructure it.
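To make the scrape-and-restructure step concrete, here is a rough sketch (in Python rather than Perl/LWP, since that's what I have handy). The sample text and the column layout are assumptions for illustration only, not the exact NWS CLI product format, so the regex would need adjusting against a real report:

```python
import csv
import io
import re

# Assumed sample of a CLI-style snowfall section; the real NWS
# product layout may differ from this.
SAMPLE_CLI = """\
SNOWFALL (IN)
  TODAY              2.1    1.2   0.9
  MONTH TO DATE     10.4    8.0   2.4
  SINCE DEC 1       15.7   12.3   3.4
"""

def parse_snowfall(text):
    """Extract [label, observed, normal, departure] rows from CLI-like text."""
    rows = []
    for line in text.splitlines():
        # Indented label followed by three decimal numbers.
        m = re.match(
            r"\s+([A-Z][A-Z 0-9]*?)\s+(-?\d+\.\d+)\s+(-?\d+\.\d+)\s+(-?\d+\.\d+)\s*$",
            line,
        )
        if m:
            rows.append([m.group(1).strip(), m.group(2), m.group(3), m.group(4)])
    return rows

def to_csv(rows):
    """Restructure the parsed rows into CSV text ready for Excel."""
    buf = io.StringIO()
    w = csv.writer(buf)
    w.writerow(["period", "observed", "normal", "departure"])
    w.writerows(rows)
    return buf.getvalue()

print(to_csv(parse_snowfall(SAMPLE_CLI)))
```

The fetch part (replacing the hard-coded sample with the live page) would be `LWP::UserAgent->get` in Perl or `urllib.request.urlopen` in Python; the parsing idea stays the same.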

Automatic updating can be handled in two ways:

  1. When some latency in updates is acceptable, you can configure a job that does the above and keeps refreshing every 'n' units of time.
  2. When updates must be low-latency (or near-zero latency), you need a listener that watches for changes in the source, receives them, reprocesses them as required, and updates the secondary source/destination.
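Option 1 is the simpler route for this use case. A minimal sketch, where `fetch_and_update` is a placeholder for the real fetch/scrape/rewrite step (in practice you would schedule this with cron rather than a Python loop):

```python
import time

def fetch_and_update():
    """Placeholder for: fetch the CLI page, scrape it, and rewrite the CSV."""
    return "refreshed"

def poll(interval_seconds, max_runs):
    """Option 1: tolerate some latency; refresh every interval_seconds."""
    results = []
    for _ in range(max_runs):
        results.append(fetch_and_update())
        time.sleep(interval_seconds)
    return results

# For an hourly refresh you would use interval_seconds=3600 and loop forever;
# max_runs=2 here just demonstrates the mechanism.
print(poll(interval_seconds=0, max_runs=2))
```

Option 2 only pays off if the source offers change notifications; for periodic NWS products, polling at the product's issuance cadence is usually enough.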

Hope this gives some clarity!
