In our tech expedition series, we talk with RocketBuild’s Flight Crew to share a behind-the-scenes look at software development — from start to finish.
We interview our own team of product managers, designers, and engineers on all things related to planning, creating, and coding applications. In this edition, Jason, Connor, John, and Gwen recap the expedition the team took to build custom school management software for Paramount Schools.
Q: Why did Paramount engage RocketBuild?
Jason Ward: RocketBuild has an extensive history of working with nonprofits and educational organizations. Jess Monk, Paramount’s operations head, found us online and we had a great series of conversations about Paramount’s needs and our abilities. It was a great fit from the start, and we are proud to be working with such a great local school.
Q: What does the new application accomplish for Paramount?
Jason Ward: The new app allows Paramount and its team to effectively and efficiently create evaluations for staff and faculty, then send those evaluations electronically for completion. The evaluations are tailored by the team, and can be completed entirely online, making the process seamless and easy.
Connor Hess: The new application allows Paramount to move away from Excel spreadsheet-based scoring of teacher evaluations. The app is a custom evaluation system, digitized to match Paramount's unique teacher-evaluation process, with all the benefits of moving from a spreadsheet-based workflow to a full-fledged app.
Q: What are the main features of the Paramount app?
Connor Hess: The app is a tool used to evaluate teacher and staff performance and provides metrics on overall location performance. Evaluations can be created at any time, and assigned to anyone in a school building. A wide array of custom analytical data is available to track the progress over time of teachers, buildings, and districts.
Q: Can you explain the new technology architecture?
Connor Hess: It is a full-stack Django application with a few React components to help with the front end.
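To make that concrete, here is a hedged sketch of the kind of domain objects a backend like this might model. The class and field names are hypothetical illustrations, not Paramount's actual Django schema; plain dataclasses stand in for Django models:

```python
from dataclasses import dataclass, field

@dataclass
class Criterion:
    """One scored item on an evaluation form."""
    prompt: str
    max_points: int = 4  # illustrative scale

@dataclass
class Evaluation:
    """An evaluation built from a list of weighted criteria."""
    title: str
    evaluation_type: str  # e.g. "on-site" or "drive-through"
    criteria: list = field(default_factory=list)

# Building an evaluation the way the creator tool might:
ev = Evaluation("Fall Observation", "on-site")
ev.criteria.append(Criterion("Classroom management"))
ev.criteria.append(Criterion("Lesson planning"))
```

In a real Django app these would be `models.Model` subclasses persisted to a database, with the React components rendering the creator UI on top.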
Q: How important is understanding where and how the software will be used and who the users will be for the design?
John Justice: For the MVP, our main focus was to create an evaluation creator tool for the education sector, though we wanted to make sure the product was flexible enough to be used across multiple industries. Two evaluation types, On-Site Assessments and Drive-Through Rubrics, were provided to us as Excel files to build the evaluation creator tool from. We had to take into account several parameters that differentiate the two evaluation types and ultimately result in differing report scales.
Q: What screens were designed first and why?
John Justice: After defining the roles and studying the provided reports for each evaluation type, we formed wireframes of both the evaluation creator and the resulting reports. Dashboards were later designed to define what each role would have permission to see and do within the application. Through the entire design process, we provided clickable examples of the app along with documentation detailing the interactions.
Q: What’s the best way for a designer and developer to work together on a project like this?
John Justice: We spent much of the design process facilitating conversations, from requirements gathering to identifying the pain points that had to be addressed to accomplish the objective. Once we were close to finalizing the designs, we gathered internally and made the iterations needed for the developers to understand how the application would function for the MVP build.
Q: How did you ensure that the project requirements were met?
Connor Hess: Constant communication with the client. We were very thorough in our discovery and requirements gathering at the start, and our design process fine-tuned those initial requirements. With the skilled designers we have on staff, we were able to hold multiple working sessions to further refine v1 requirements before moving into development. The strategy and creative thinking done early in the process served as a guide to a successful end product.
Q: How did the team conduct code reviews?
Connor Hess: Every line of code is reviewed by another RocketBuild developer before it makes it onto the staging server. On staging, the project manager reviews the code changes, ensuring each one functions as intended and stress testing it when possible. Once the work is confirmed to meet the acceptance criteria, the code is pushed to the production server for the client's use.
Q: What made this project unique?
Connor Hess: There was a unique weighting and scoring system for the exams that we had to break down, understand, and eventually convert into code and workable software. We also had to ensure that data from the old pen-and-paper exams would remain usable as a comparison baseline against scores in the new tool.
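As a rough illustration of the weighting logic Connor describes, a weighted-average score over rubric criteria can be computed like this. The criterion names and weights below are invented for the example; Paramount's actual rubric and scales are more involved:

```python
def weighted_score(responses, weights):
    """Weighted average of per-criterion scores.

    responses: dict mapping criterion name -> raw score
    weights:   dict mapping criterion name -> weight
    """
    total_weight = sum(weights.values())
    if total_weight == 0:
        raise ValueError("weights must not sum to zero")
    return sum(responses[c] * weights[c] for c in weights) / total_weight

# Hypothetical rubric: "instruction" counts double
weights = {"instruction": 2.0, "environment": 1.0, "planning": 1.0}
responses = {"instruction": 3, "environment": 4, "planning": 2}
score = weighted_score(responses, weights)  # (3*2 + 4*1 + 2*1) / 4 = 3.0
```

Keeping the formula in one place like this also makes it straightforward to re-score legacy pen-and-paper results for comparison, as long as they are entered against the same criteria.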
John Justice: While Paramount's initial focus was the education sector, the long-term goal was to build in enough flexibility that the tool could be used in other industries. So there was a bit of a challenge in that we needed to focus on accomplishing an objective while not being too industry-specific at the same time.
Q: What surprised you or challenged you about this project?
John Justice: I was surprised by the amount of role delegation required in order to segment privileges. The team at Paramount was very clear in their definitions of what boundaries were required for each role. To keep the project within scope, we were able to simplify these roles, which allowed for a more streamlined assessment creation process.