My Career

Professional Experience

A Journey of Innovation & Impact

Explore my diverse roles across AI, cloud computing, and full-stack development, where I have delivered transformative solutions for global organizations.

Cancer Institute NSW

Data Scientist Intern (Jul 2025 - Nov 2025)

Delivered a privacy-preserving, on-prem AI Knowledge Assistant that transforms how clinicians and policy teams discover answers across internal PDFs, policies, web pages, and spreadsheets without relying on any external APIs.

Key Contributions

At Cancer Institute NSW, I built a hybrid Retrieval-Augmented Generation system combining FAISS vector search with an ontology-driven Knowledge Graph to reduce ambiguity and improve explainability in cancer-related answers. I orchestrated multi-step queries using LangChain and LangGraph to plan retrieval, knowledge graph lookups, and summarisation, enforcing groundedness checks and safe fallbacks with consistent citations. I also operationalised evaluation and observability using RAGAS and LangSmith/OpenTelemetry, and deployed the stack fully on-prem with Docker, Kubernetes, and Terraform, enabling reproducible releases, scalable inference, and healthcare-grade governance.
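The hybrid retrieval idea can be sketched in a few lines of Python. This is an illustrative toy, not the Institute's code: the document vectors, entity names, and `kg_boost` weight are invented stand-ins for the FAISS index and the ontology-backed knowledge graph.

```python
from math import sqrt

# Toy in-memory stand-ins for the vector index and knowledge graph;
# all documents, vectors, and entities here are illustrative.
DOC_VECTORS = {
    "screening_policy.pdf": [0.9, 0.1, 0.0],
    "treatment_guideline.pdf": [0.2, 0.8, 0.1],
}
KNOWLEDGE_GRAPH = {
    "breast cancer": {"relates_to": ["screening_policy.pdf"]},
}

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = sqrt(sum(x * x for x in a)) * sqrt(sum(x * x for x in b))
    return dot / norm if norm else 0.0

def hybrid_retrieve(query_vec, query_entities, kg_boost=0.3):
    """Score documents by vector similarity, then boost any document
    the knowledge graph links to an entity found in the query."""
    scores = {doc: cosine(query_vec, vec) for doc, vec in DOC_VECTORS.items()}
    for entity in query_entities:
        for doc in KNOWLEDGE_GRAPH.get(entity, {}).get("relates_to", []):
            scores[doc] = scores.get(doc, 0.0) + kg_boost
    # Rank best-first; scores are retained so answers can cite sources.
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

ranked = hybrid_retrieve([0.85, 0.15, 0.0], ["breast cancer"])
```

Fusing the two signals is what reduces ambiguity: when several documents are semantically close, the graph edge for a recognised clinical entity breaks the tie and makes the ranking explainable.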

AirLabOne

Machine Learning Engineer – Digital Twin (Jun 2025 - Jul 2025)

Contributed to a patented Digital Twin platform that turns high-value lab equipment into a secure, real-time, cloud-connected fleet available for remote access, rental, and collaboration.

Key Contributions

At AirLabOne, I built high-frequency sensor telemetry ingestion for lab devices using AWS IoT Core and Kinesis, landing per-second multi-sensor streams into S3 and curating traceable datasets with AWS Glue and the Glue Data Catalog. I implemented lakehouse-style analytics to investigate device health and compliance signals using Athena and Redshift for operational dashboards and reporting. I deployed the Digital Twin application and anomaly-detection services on AWS with Docker and EKS, codified infrastructure with Terraform, and used MLflow, EventBridge, Lambda, and SNS to monitor model performance, automate retraining, and maintain strong audit trails aligned with healthcare compliance expectations.
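As a rough illustration of the anomaly-detection side, a rolling z-score check over a per-second sensor stream can be sketched in pure Python. The window size, threshold, and readings below are hypothetical, and the production services ran on AWS rather than in-process like this.

```python
from collections import deque
from statistics import mean, stdev

def detect_anomalies(readings, window=5, threshold=3.0):
    """Flag readings that deviate more than `threshold` standard
    deviations from a rolling window of recent values."""
    recent = deque(maxlen=window)
    anomalies = []
    for i, value in enumerate(readings):
        if len(recent) == window:
            mu, sigma = mean(recent), stdev(recent)
            if sigma and abs(value - mu) / sigma > threshold:
                anomalies.append((i, value))
        recent.append(value)
    return anomalies

# Per-second temperature stream with one spike injected at index 5.
stream = [21.0, 21.1, 20.9, 21.0, 21.2, 35.0, 21.1]
flagged = detect_anomalies(stream)
```

In a streaming deployment the same logic would consume records from Kinesis and publish alerts via SNS; the statistical core stays this simple.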

Turing

LLM - Python Engineer (Aug 2024 - Aug 2025)

Specialised in fine-tuning and evaluating Large Language Models, focusing on Google Gemini to improve reasoning quality, instruction adherence, and agentic behaviour.

Key Contributions

At Turing, I evaluated Google Gemini across supervised fine-tuning and human-feedback-driven reinforcement learning workflows, measuring prompt accuracy, reasoning depth, and safety. I fed these insights into training decisions to iteratively refine datasets and model behaviour, and built automated quality gates that validated formatting and policy criteria at scale, significantly reducing manual review effort. I engineered a high-throughput AWS data pipeline using S3, Glue, Spark, Athena, Redshift, and dbt to ingest, validate, and query 15,000 to 20,000 supervised fine-tuning samples per day across distributed contributors. The full evaluation and automation stack was containerised with Docker, enabling reproducible runs and scalable execution across engineering teams.
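The quality-gate idea can be sketched as a small validator over serialized samples. The required fields and length limit below are hypothetical examples, not Turing's actual criteria.

```python
import json

REQUIRED_FIELDS = {"prompt", "response"}
MAX_RESPONSE_CHARS = 4000  # hypothetical policy limit

def passes_quality_gate(raw_sample: str):
    """Return (ok, reason) for one serialized SFT sample, checking the
    kinds of formatting and policy rules an automated gate enforces."""
    try:
        sample = json.loads(raw_sample)
    except json.JSONDecodeError:
        return False, "invalid JSON"
    missing = REQUIRED_FIELDS - sample.keys()
    if missing:
        return False, f"missing fields: {sorted(missing)}"
    if not sample["response"].strip():
        return False, "empty response"
    if len(sample["response"]) > MAX_RESPONSE_CHARS:
        return False, "response exceeds length policy"
    return True, "ok"

verdict = passes_quality_gate('{"prompt": "2+2?", "response": "4"}')
```

Running checks like these before human review means annotators only see samples that already satisfy the mechanical criteria, which is where the reduction in manual effort comes from.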

SAS

Cloud and Information Services Intern (Nov 2024 - Dec 2024)

Developed and evaluated machine learning models on the SAS Viya platform to demonstrate big data analytics and high-speed decision support.

Key Contributions

At SAS, I built machine learning workflows using SAS Viya, leveraging the Compute Server for execution and the CAS Server for in-memory, parallel processing over large datasets. I used CAS to accelerate feature engineering, training, and validation, and produced visual insights with SAS Visual Analytics to support business stakeholders. I presented weekly progress updates to technical and non-technical audiences, incorporated feedback to refine models and dashboards, and gained exposure to how large enterprises use big data and analytics to drive operational and strategic decisions.

Deneka IT

Security Software Engineer Intern (May 2024 - Nov 2024)

Helped engineer an ERP and CRM data platform that unified security, multi-tenant onboarding, and analytics across customer and technician workflows.

Key Contributions

At Deneka IT, I built an ERP and CRM platform with multi-tenant onboarding and role-based access, implementing secure authentication and authorisation through JWT-based single sign-on and RBAC across services. I integrated CRM contact pipelines with Google Contacts and Microsoft Outlook, using Snowflake as the central data hub to consolidate and standardise customer activity data. I implemented data quality and synchronisation logic to handle deduplication, conflict resolution, and incremental updates so records stayed consistent across systems, and delivered analytics-ready datasets and operational reports from Snowflake to support engagement, onboarding, and productivity insights. I packaged APIs and services with Node.js and Express, following CI/CD practices to keep deployments reliable and maintainable.
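A last-writer-wins merge, simplified from the deduplication and conflict-resolution logic described, might look like the following. The field names and the timestamp-based policy are illustrative assumptions, not the platform's actual schema.

```python
def merge_contacts(records):
    """Deduplicate contact records by email, keeping the most recently
    updated copy and unioning source systems (a simplified
    last-writer-wins policy; field names are hypothetical)."""
    merged = {}
    for rec in records:
        key = rec["email"].lower()          # case-insensitive dedup key
        current = merged.get(key)
        if current is None or rec["updated_at"] > current["updated_at"]:
            sources = (current["sources"] if current else set()) | {rec["source"]}
            merged[key] = {**rec, "sources": sources}
        else:
            current["sources"].add(rec["source"])  # keep provenance anyway
    return list(merged.values())

# The same person synced from two CRMs with different capitalisation.
records = [
    {"email": "A@x.com", "updated_at": "2024-06-01", "name": "A. Lee", "source": "google"},
    {"email": "a@x.com", "updated_at": "2024-07-15", "name": "Ana Lee", "source": "outlook"},
]
merged = merge_contacts(records)
```

ISO-8601 timestamps compare correctly as strings, which keeps the newest-wins rule trivial; tracking the union of sources preserves where each record originated for downstream analytics.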

Jio Platforms Limited

Software Development Engineer (Nov 2022 - Jan 2024)

Built and operated core modules of Jio SecureID, a multi-tenant digital identity and MFA platform serving enterprise clients at massive scale.

Key Contributions

At Jio, I engineered Java and Spring Boot microservices and REST APIs that powered secure authentication for Jio SecureID, focusing on low-latency responses and reliability under high concurrency. I implemented secure client onboarding using mutual TLS, API key exchange, and API Gateway allowlists so only approved systems could call authentication endpoints. I designed event and telemetry flows to feed audit trails, operational analytics, and security monitoring, and delivered production deployments by containerising services with Docker, running them on Kubernetes, and provisioning infrastructure with Terraform. I used Prometheus and Datadog to monitor latency, error rates, throughput, and client activity, and worked within agile teams to refine system design, enforce strong security practices, and improve resilience over time.
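The allowlist side of client onboarding can be illustrated with a minimal check (shown in Python for consistency with the other sketches; the production services were Java and Spring Boot, and the client IDs and keys here are invented). The real system paired a check like this with mutual TLS at the API Gateway.

```python
import hmac

# Hypothetical registry of onboarded clients and their issued API keys;
# in production these would live in a secure store, not in code.
ONBOARDED_CLIENTS = {"client-acme": "key-acme-123"}

def is_authorised(client_id: str, api_key: str) -> bool:
    """Allowlist-style gate: unknown clients are rejected outright, and
    keys are compared in constant time to avoid timing side channels."""
    expected = ONBOARDED_CLIENTS.get(client_id)
    return expected is not None and hmac.compare_digest(expected, api_key)

allowed = is_authorised("client-acme", "key-acme-123")
denied = is_authorised("client-unknown", "key-acme-123")
```

`hmac.compare_digest` is the standard-library way to avoid leaking how many leading characters of a key matched, which matters on authentication endpoints under high concurrency.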

Infosys

Specialist Programmer (Nov 2020 - Oct 2022)

Helped build and enhance AutoData, a full-stack asset platform for managing vehicle and compliance data across an automotive client’s operations.

Key Contributions

At Infosys, I worked on AutoData, delivering end-to-end workflows from ingestion to operational UI using Java, Spring Boot, microservices, REST APIs, and React. I implemented persistence and performance layers on Oracle DB with JPA and Hibernate, improved responsiveness using Redis caching, and integrated event-driven flows with Kafka across services. I later extended the solution with scalable data engineering pipelines on AWS using S3, Kinesis, Glue, Athena, and Redshift so high-volume vehicle and policy data could be transformed and queried for analytics. I strengthened production readiness with JUnit test suites, hardened microservice contracts, and automated deployments on Docker and Kubernetes, using CloudFormation or Terraform plus Prometheus and Datadog to monitor health, performance, and reliability.
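The read-through caching pattern used to improve responsiveness can be sketched in Python with an in-memory stand-in for Redis (the AutoData services themselves were Java; the loader, TTL, and keys below are illustrative).

```python
import time

class ReadThroughCache:
    """Minimal read-through cache with TTL, using a dict in place of
    Redis to keep the sketch self-contained."""
    def __init__(self, loader, ttl_seconds=300):
        self.loader = loader      # fetches from the backing store on a miss
        self.ttl = ttl_seconds
        self.store = {}           # key -> (value, expiry_timestamp)
        self.misses = 0

    def get(self, key):
        entry = self.store.get(key)
        if entry and entry[1] > time.time():
            return entry[0]       # cache hit: skip the database entirely
        self.misses += 1
        value = self.loader(key)
        self.store[key] = (value, time.time() + self.ttl)
        return value

# Hypothetical loader standing in for an Oracle/JPA lookup by vehicle ID.
cache = ReadThroughCache(loader=lambda vin: {"vin": vin, "status": "active"})
first = cache.get("VIN123")   # populates the cache
second = cache.get("VIN123")  # served from the cache
```

The point of the pattern is that callers never manage the cache explicitly: repeated reads of hot vehicle records hit memory, and the TTL bounds how stale a cached record can get.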

Zoho Corporation

Summer Intern (Jun 2019)

Built a Java-based e-commerce console application with clear role separation and efficient database interactions during an intensive summer internship.

Key Contributions

At Zoho, I developed a feature-rich console application implementing core e-commerce workflows for customers, vendors, and premium users using Java and clean object-oriented design. I integrated MySQL via JDBC for reliable persistence and queries, applied role-based access controls to separate user capabilities, and focused on performance and maintainability through modular design and straightforward logging and error-handling practices. This experience grounded my understanding of back-end design patterns and database-driven application development.

Contact

Get In Touch!

+61 481 700 945
tayalarajan45@gmail.com