5 Products Built During Our Hackathon That We Actually Used
When we run company hackathons, we end up with quite a variety of ideas. The teams working on these projects always get something out of them, regardless of whether the projects continue past the weekend.
Some projects, though, go further than weekend fun: these are the products we actually end up using. Below are a few examples, along with how each ended up being used after its hackathon wrapped up.
Ollert

Ollert started as a way to help some of our engineers who used Trello to manage tasks. They had grown accustomed to the cycle and flow metrics we often track for our client projects and wanted the same kind of data for their Trello boards.
The hackathon team built a web-based service that could live alongside Trello boards, collecting metrics and metadata about each card. A collection of visual reports then surfaced insights from that data.
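To give a feel for the kind of metric involved, here is a minimal sketch of computing a card's cycle time. The data shape is hypothetical (not Ollert's actual model), loosely resembling the list-transition timestamps Trello's actions API can provide per card.

```python
from datetime import datetime

def cycle_time_days(card):
    """Days between a card entering 'Doing' and reaching 'Done'.

    `card` is a hypothetical dict of ISO-8601 transition timestamps.
    """
    started = datetime.fromisoformat(card["doing_at"])
    finished = datetime.fromisoformat(card["done_at"])
    return (finished - started).total_seconds() / 86400

card = {"doing_at": "2014-03-03T09:00:00", "done_at": "2014-03-07T09:00:00"}
print(cycle_time_days(card))  # 4.0
```

Aggregating this value across every card on a board is what turns raw Trello data into the flow reports described above.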
As a product, Ollert grew into a life outside SEP’s walls. What started as a hackathon project became a more officially supported and developed product. Eventually we open-sourced Ollert and released it to the community.
VLAT

VLAT was built to answer a particular question we’ve encountered on projects over the years:
While you may have coverage metrics for your unit tests, do you know for certain that your code is actually tested?
The team built a product that could analyze an application and its test suite. At the end of an analysis run, a team could see which portions of their application weren’t tested in any meaningful way (despite being “covered” by a test). In other words, it flagged areas where the test suite would continue to pass even if the code in question were removed.
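A toy illustration of the idea (not VLAT itself): if the tests still pass after a branch is disabled, that branch isn’t meaningfully tested, even though a coverage tool would report it as covered. All names below are invented.

```python
def apply_discount(price, is_member):
    if is_member:           # the branch under suspicion
        return price * 0.9
    return price

def neutered(price, is_member):
    # Same function with the member branch removed.
    return price

def run_tests(fn):
    # A weak test suite: it executes the member branch (so coverage
    # reports it "covered") but never asserts on its result.
    fn(100, True)                 # exercises the branch, checks nothing
    return fn(100, False) == 100  # only the non-member path is verified

print(run_tests(apply_discount), run_tests(neutered))  # True True
```

Both versions pass, so the tool would flag the member-discount branch as effectively untested despite its 100% line coverage.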
After the weekend, we kept developing the utility toward becoming a market-facing product. We used VLAT on some of our projects for a couple of rounds of feasibility testing. Further market research led us to conclude the market for the tool wasn’t big enough to pursue, but the product did provide value while we were developing and using it.
The Interview Scheduler

The scheduling utility met its main objective of algorithmically building “good” schedules. Those are schedules that:
- Generally allowed companies to interview students they were interested in
- Matched students to companies based on shared roles and technologies
- Ensured each company had enough students to interview
- Ensured each student had enough interviews (but not too many)
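The criteria above can be sketched as a greedy matching pass. This is a hypothetical toy version, not the real algorithm; the function names, caps, and company data are invented.

```python
def build_schedule(interests, max_per_student=4, min_per_company=2):
    """interests: {company: [students it wants to see]} -> (company, student) pairs."""
    load = {}       # interviews assigned per student so far
    schedule = []
    # Serve the companies with the shortest interest lists first,
    # so picky companies aren't starved of candidates.
    for company in sorted(interests, key=lambda c: len(interests[c])):
        granted = 0
        for student in interests[company]:
            if load.get(student, 0) < max_per_student:  # not overbooked
                schedule.append((company, student))
                load[student] = load.get(student, 0) + 1
                granted += 1
        if granted < min_per_company:
            print(f"warning: {company} got only {granted} interviews")
    return schedule

pairs = build_schedule({
    "Acme":    ["ana", "ben", "carl"],
    "Initech": ["ana", "ben"],
})
```

A real scheduler also has to place these pairs into non-conflicting time slots, which is where most of the actual difficulty lives.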
The team also managed to put in some delighters such as generating highly styled PDFs of schedules to send out to companies.
The scheduling tool has helped TechPoint set up hundreds of interviews between employers and students. We’ve used it for the past two Xtern finalist days, saving a lot of time and manual effort for our friends over at TechPoint.
SEP the App
SEP the App was born as a project to build a mobile app experience across multiple platforms (iOS, Android, Windows Phone). The team picked a collection of data about SEP and its employees as its underlying data to explore.
Over the weekend, the team succeeded in building native apps for each platform. Beyond that, though, the team put together a collection of API services written in Ruby. Structurally, these formed a microservice layer (even if that term wasn’t in wide use at the time). The APIs exposed all sorts of data (employee start dates, schools attended, technologies used on projects, etc.) for internal applications to consume.
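As a sketch of the microservice idea, each endpoint simply maps a path to one slice of company data rendered as JSON. The real services were written in Ruby; the routes and data below are invented for illustration.

```python
import json

# Hypothetical sample data; the real APIs served actual SEP records.
EMPLOYEES = [
    {"name": "Ada",   "start_year": 2009, "school": "Purdue"},
    {"name": "Grace", "start_year": 2012, "school": "Rose-Hulman"},
]

ROUTES = {
    "/employees/start-dates": lambda: {e["name"]: e["start_year"] for e in EMPLOYEES},
    "/employees/schools":     lambda: sorted({e["school"] for e in EMPLOYEES}),
}

def handle(path):
    """Return the JSON body for a known path, or None for an unknown route."""
    view = ROUTES.get(path)
    return json.dumps(view()) if view else None

print(handle("/employees/schools"))  # ["Purdue", "Rose-Hulman"]
```

Keeping each route a small, independent view over shared data is what lets later applications mash up the same data in new ways.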
While the app itself may not have lasted all that long, the APIs the team built to serve it have lived on. Along the way they acquired a continuous deployment pipeline: from dev, to staging with smoke tests, and then to production. The data from those APIs has served as the underpinning for a number of other applications we’ve written to mash up company data in new ways.
Tx2Go

The Tx2Go app was built to solve a particular problem. After an injury, many people are assigned physical therapy exercises to do at home. Adherence to that therapy regimen is pretty low, however: lots of patients go home, set the paper with the exercises down on a counter, and forget about them.
To help with this problem, we worked with faculty and students from Huntington University to design a mobile app as a digital aid to therapy. The app allowed therapy regimens to be assigned to patients (e.g. “Do the ‘foot pull’ exercise 10 times, twice per day”). For each exercise, the app would send notifications reminding patients to perform the activities. It also included instructions for the exercises and links to instructional videos. In short, it aimed to replace the paper handouts that were so often forgotten.
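The reminder logic can be sketched as expanding an assignment like “twice per day” into concrete notification times. This is a hypothetical simplification; the slot times and function names are invented, not Tx2Go’s actual scheduling code.

```python
from datetime import time

# Candidate daily notification slots (invented for this sketch).
DAY_SLOTS = [time(9, 0), time(13, 0), time(17, 0), time(20, 0)]

def reminder_times(times_per_day):
    """Pick evenly spread slots for the requested number of daily reminders."""
    n = min(times_per_day, len(DAY_SLOTS))
    step = len(DAY_SLOTS) / n
    return [DAY_SLOTS[int(i * step)] for i in range(n)]

# "Do the 'foot pull' exercise 10 times, twice per day"
print(reminder_times(2))  # [datetime.time(9, 0), datetime.time(17, 0)]
```

Each selected slot would then be registered with the platform’s local-notification service on the patient’s phone.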
We never pushed this one to public app stores, but it was used, at least by one patient. A couple of months after we wrote the app, I ended up getting injured. Following weeks in an air cast, I was assigned some physical therapy activities to do at home. I took the paper home, set it down on the counter, and then promptly forgot about it (just like everyone else). After about a week of forgetting my therapy assignments, I remembered the app we’d built. Once I got it pushed to my phone, it did its job of reminding me to do the at-home exercises and, for me, definitely increased adherence.