How Chicago Solved Its Open Data Dilemma, Loraine Lawson – “In New York City, obtaining a public data set required an open records request and the researcher toting in a hard drive. So grab a notepad, Big Apple, and let the Windy City show you how to do open data.

A recent GCN article describes how Chicago simplified the release and updating of open data by building an OpenData ETL Utility Kit. Before the kit, the process was onerous: open data sets required manual updates made mostly with custom-written Java code. That updating process is now automated with the OpenData ETL Utility Kit. Pentaho’s Data Integration ETL tool is embedded in the kit, along with pre-built and custom components that can process Big Data sets, GCN reports.

“What’s different now is we have a framework that can be easily used by a lot of people,” Tom Schenk, the city’s chief data officer, told GCN. “I could also give that tool to a number of users around the city of Chicago and they’d be able to program ETLs that are going to be easier for them to understand, easier for them to create. It allows us to be more nimble.”

In a particularly compelling use case, the city tapped into an application programming interface (API) that monitors water quality at Lake Michigan beaches and used the ETL to push out information hourly. If you’re curious about the OpenData ETL Utility Kit — and I’m looking at you, New York City — you can download it from GitHub.”
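For readers who want a feel for the pattern the kit automates (extract from a source API, transform the readings, load them to the open data portal on a schedule), here is a minimal Python sketch. It is not the OpenData ETL Utility Kit itself, which is built on Pentaho Data Integration; the sensor API URL, dataset ID, and field names below are hypothetical placeholders.

```python
"""Minimal sketch of an hourly extract-transform-load job in the spirit of
Chicago's OpenData ETL Utility Kit. NOT the kit itself (the kit embeds
Pentaho Data Integration); the sensor API URL, dataset ID, and field names
are hypothetical placeholders."""
import os
import requests

SENSOR_API = "https://example.org/beach-sensors/latest"  # placeholder source API
PORTAL_URL = "https://data.cityofchicago.org/resource/xxxx-xxxx.json"  # placeholder dataset
APP_TOKEN = os.environ.get("SOCRATA_APP_TOKEN", "")
USER = os.environ.get("SOCRATA_USER", "")
PASSWORD = os.environ.get("SOCRATA_PASSWORD", "")


def extract():
    """Pull the latest readings from the (hypothetical) beach sensor API."""
    resp = requests.get(SENSOR_API, timeout=30)
    resp.raise_for_status()
    return resp.json()


def transform(readings):
    """Keep only the fields the public dataset publishes and normalize names."""
    return [
        {
            "beach": r["beach_name"],
            "timestamp": r["measured_at"],
            "water_temp_c": r["water_temperature"],
            "turbidity_ntu": r["turbidity"],
        }
        for r in readings
    ]


def load(rows):
    """Upsert the rows into the portal dataset via the Socrata SODA API."""
    resp = requests.post(
        PORTAL_URL,
        json=rows,
        auth=(USER, PASSWORD),
        headers={"X-App-Token": APP_TOKEN},
        timeout=60,
    )
    resp.raise_for_status()
    return resp.json()


if __name__ == "__main__":
    # Run once; a scheduler (cron or similar) would invoke this hourly.
    print(load(transform(extract())))
```

The point of the sketch is the shape of the job, not the specifics: a reusable extract/transform/load skeleton that a scheduler triggers hourly is what lets a team publish fresh data without hand-written, one-off update code.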