For Christmas, I got one of those inexpensive home weather station things. It collects data from sensors in the yard, then stores that information in a .csv file. I can pull this data onto my computer and do stuff with it, if I want. I could:
plot the data in charts to show trends over time
analyze the data for statistical information (averages, max, min, etc.)
or whatever else I want and can figure out how to do (a rough sketch of the first two is below).
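For the first two items, I picture something along the lines of this rough Python/pandas sketch. The file name and the "timestamp" and "outdoor_temp" column names are just placeholders; the real headers will be whatever the station writes.

    import pandas as pd
    import matplotlib.pyplot as plt

    # Load the station's .csv export (column names here are guesses at what it writes).
    df = pd.read_csv("weather_station.csv", parse_dates=["timestamp"])
    df = df.set_index("timestamp")

    # Trend over time for one sensor.
    df["outdoor_temp"].plot(title="Outdoor temperature over time")
    plt.show()

    # Basic statistics: count, mean, min, max, quartiles.
    print(df["outdoor_temp"].describe())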
My question for the group at this stage is this: how should I handle this data for maximum future flexibility and interest?
The unit stores a reading every 12 minutes (120 readings/day, or roughly 43,800 readings/year), so I will be well within Excel's row limit for a long time if I choose to import the data into Excel. But is Excel really the best place to store this information for future access?
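Just to put a number on "a long time", here is a quick back-of-the-envelope check against Excel's 1,048,576-row worksheet limit:

    # Back-of-the-envelope: how long until the data outgrows one Excel worksheet?
    readings_per_day = 24 * 60 // 12            # one reading every 12 minutes -> 120
    readings_per_year = readings_per_day * 365  # about 43,800
    excel_row_limit = 1_048_576                 # rows per worksheet in current Excel
    print(readings_per_year)                    # 43800
    print(excel_row_limit / readings_per_year)  # roughly 24 years of readings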
I'm not very proficient with pivot tables, but a pivot table seems like one good way to extract information from the broader data set. Pivot tables do have limitations, though (for example, when you want to use a scatter plot to visualize the data).
Filters can be used to extract subsets of the data.
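To make the comparison concrete, here is a rough sketch of how the same pivot-style summary, scatter plot, and filtering could be done outside Excel with pandas. Again, "outdoor_temp" and "humidity" are made-up column names, and the freezing threshold assumes the station reports Fahrenheit.

    import pandas as pd
    import matplotlib.pyplot as plt

    df = pd.read_csv("weather_station.csv", parse_dates=["timestamp"])

    # Pivot-table-style summary: average outdoor temperature by month.
    df["month"] = df["timestamp"].dt.to_period("M")
    print(df.pivot_table(index="month", values="outdoor_temp", aggfunc="mean"))

    # Scatter plot of two raw columns (the case where an Excel pivot table gets awkward).
    df.plot.scatter(x="outdoor_temp", y="humidity")
    plt.show()

    # Filtering: only the readings below freezing (32 F, assuming Fahrenheit).
    print(df[df["outdoor_temp"] < 32])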
I could also download an open-source database program (MySQL or something similar) and use that as the main point of access for storing and extracting the data.
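As I understand it, the database route would look roughly like the sketch below. I've used SQLite from Python's standard library rather than MySQL simply because it needs no server, and the table and column names are placeholders.

    import csv
    import sqlite3

    # One-time import of the station's .csv into a local SQLite database file.
    conn = sqlite3.connect("weather.db")
    conn.execute("""
        CREATE TABLE IF NOT EXISTS readings (
            timestamp    TEXT,
            outdoor_temp REAL,
            humidity     REAL
        )
    """)

    with open("weather_station.csv", newline="") as f:
        rows = [(r["timestamp"], r["outdoor_temp"], r["humidity"])
                for r in csv.DictReader(f)]
    conn.executemany("INSERT INTO readings VALUES (?, ?, ?)", rows)
    conn.commit()

    # After that, any subset or summary is one SQL query away.
    for row in conn.execute(
            "SELECT MIN(outdoor_temp), MAX(outdoor_temp) FROM readings"):
        print(row)
    conn.close()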
I guess I am looking for some input from those who are more proficient with these "database type" tools as I try to set something up to access this data, before I commit to any specific approach to storing and retrieving it. How would you set this up for accessing and analyzing a .csv data file like this? What considerations would you take into account? Would you use Excel exclusively, or would you prefer a separate database program?