Cortisol Tracking App Case
An AI-powered mobile app for salivary cortisol measurement, built with custom computer vision, personalized baseline tracking, and real-world field testing.
“We had tried multiple times to solve an extremely complex computer vision challenge in healthcare with no luck. The It-Jim team not only cracked it; they exceeded our expectations. The solution was precise, scalable, and built with deep technical insight. If you’re facing a problem that feels nearly impossible, this is the team you want.”

Pardigm Inc. is a US-based biotech startup focused on helping people better understand and manage stress. Its core product is a cortisol tracking app built with AI and computer vision: the solution analyzes salivary cortisol levels using LFA (Lateral Flow Assay) test strips and delivers personalized insights.
Designed for a wide range of users – from Olympic athletes to women experiencing hormonal changes – the app aimed to bring lab-level cortisol monitoring into everyday life. Pardigm’s vision was to turn smartphones into powerful tools for tracking stress through accurate measurement and meaningful interpretation of cortisol rhythms.
To bring this vision to life, Pardigm partnered with It-Jim to develop an AI-powered cortisol tracking mobile app with computer vision, enabling accurate cortisol measurement and personalized baseline tracking.
Pardigm set out to build an AI-powered mobile app that could analyze cortisol levels using LFA test strips – a task that required solving several complex challenges:
Building a CV model to detect and quantify cortisol levels based on subtle color intensity differences on saliva test strips.
Enabling the app to learn each user’s natural hormonal pattern and track deviations over time.
Maintaining consistent accuracy despite variable lighting, shadow, and background conditions.
Compensating for minor inconsistencies across different batches of test kits.
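As a rough illustration of the first challenge, a strip reading can be reduced to the relative intensity of the test line versus the control line. The sketch below is a minimal assumption of how such a measurement could look in NumPy; the function and zone names are hypothetical and not part of Pardigm's actual pipeline:

```python
import numpy as np

def strip_signal_ratio(strip_gray: np.ndarray,
                       test_zone: tuple, control_zone: tuple) -> float:
    """Relative darkness of the test line vs. the control line.

    strip_gray   -- grayscale crop of the strip window, values 0-255
    test_zone    -- (row_start, row_end, col_start, col_end) of the test line
    control_zone -- same bounds for the control line
    """
    def line_darkness(zone: tuple) -> float:
        r0, r1, c0, c1 = zone
        roi = strip_gray[r0:r1, c0:c1].astype(float)
        # A darker line means a stronger colorimetric signal, so invert.
        return 255.0 - roi.mean()

    # Dividing by the control line partially cancels global illumination
    # changes, since both lines sit on the same strip under the same light.
    return line_darkness(test_zone) / line_darkness(control_zone)
```

Normalizing the test line by the control line is a standard trick in LFA readout, because any lighting shift affects both zones roughly equally.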
We followed a staged development process – from an initial proof of concept, through real-world testing, to an AI-powered cortisol tracking app with personalized features. Each phase addressed a specific technical challenge by combining computer vision, AI model development, and mobile optimization, ensuring the final solution was accurate, user-friendly, and adaptable to real-world conditions.
When Pardigm first approached It-Jim, their objective was to measure cortisol levels by analyzing the color intensity on salivary LFA test strips. The science behind it was straightforward in theory – but making it work on a smartphone was a different story. Achieving accurate results required custom computer vision algorithms, calibration against variable lighting, and a reliable way to process subtle color differences between test and control zones.
This first milestone demonstrated that real-time cortisol detection via smartphone was technically feasible and laid the groundwork for further development.
With the initial version of the app complete, Pardigm moved into field testing. This phase was an important reality check: how would the prototype perform in uncontrolled conditions, outside the lab?
Pardigm arranged testing with Olympic-level runners – a high-stakes environment where precision mattered. Athletes used the mobile app to analyze their salivary cortisol at different times, and the resulting data was shared with our team via Dropbox for evaluation.
This stage validated the model’s foundational accuracy and uncovered the next major requirement: creating a personalized baseline for every user.
The field testing phase made one thing clear: isolated cortisol readings weren’t enough. To provide actionable insights, the app had to account for each user’s natural cortisol rhythm, which varies significantly during the day.
To solve this, we introduced a baseline tracking feature. Drawing on medical research and feedback from healthcare professionals, we designed a system that prompted users to log their cortisol levels multiple times a day – morning, afternoon, and evening. This data was then used to create a personalized “diurnal curve” that reflected the user’s typical hormonal pattern.
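The logic of such a personalized baseline can be sketched as a per-slot history (morning, afternoon, evening) with new readings scored against the user's own past data. Class and method names below are illustrative assumptions, not the app's real model:

```python
import statistics
from collections import defaultdict

class DiurnalBaseline:
    """Personal cortisol baseline kept separately per time-of-day slot."""

    def __init__(self):
        # slot name -> list of past readings, e.g. "morning" -> [10.2, 11.8]
        self.readings = defaultdict(list)

    def log(self, slot: str, value: float) -> None:
        """Record a reading for a slot of the user's diurnal curve."""
        self.readings[slot].append(value)

    def deviation(self, slot: str, value: float) -> float:
        """Z-score of a new reading against the user's history for that slot."""
        history = self.readings[slot]
        mean = statistics.fmean(history)
        sd = statistics.pstdev(history) or 1.0  # avoid dividing by zero
        return (value - mean) / sd
```

Comparing each slot against its own history is what turns a raw number into context: a reading that is normal in the morning may be a meaningful deviation in the evening.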
The result was a major leap in user value: instead of isolated numbers, users received context-aware feedback tailored to their own biology.
The project culminated in a fully functional prototype of a mobile app that accurately measured and interpreted cortisol levels in real time directly from salivary LFA test strips.
By combining advanced computer vision, personalized baselines, and real-world testing, the app delivered a unique value proposition: stress tracking that adapts to each user’s biology and works in everyday environments.
This collaboration demonstrated how AI and computer vision can bring lab-level insights into users’ pockets, redefining how personal health can be tracked and understood.