Kartik Singhal Shares How He Built a Cutting-Edge ML System from Scratch
Developing a machine learning (ML) system from the ground up requires immense expertise, given the complexity of each phase. First, defining the problem requires an in-depth understanding of the specific business objectives the ML model aims to address. Acquiring high-quality, relevant data is another challenge, as it often involves consolidating disparate sources and formats. Kartik Singhal, a seasoned machine learning engineer, remarks, "Understanding the business objectives and data behind the model is paramount. Without a clear vision of the problem and data available at hand, even the most advanced algorithms can fall short."
In addition, the raw data must be meticulously preprocessed and cleaned to be suitable for analysis. This stage demands extensive domain knowledge to make informed decisions about which features are relevant and how they should be represented. Even the deployment of the model presents intricacies, such as scalability, latency, and integration with existing systems.
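To make this stage concrete, the sketch below shows, in broad strokes, what preprocessing of this kind can look like in Python. It is purely illustrative rather than drawn from any project Kartik describes; the pandas and scikit-learn libraries, the column names, and the sample values are all assumptions made for the sake of the example.

```python
# Illustrative preprocessing sketch: impute missing values, encode a categorical
# feature, and scale numeric columns. All data and column names are hypothetical.
import pandas as pd
from sklearn.preprocessing import StandardScaler

# Hypothetical raw product records with gaps and mixed types.
raw = pd.DataFrame({
    "price": [19.99, None, 42.50, 8.75],
    "units_sold": [120, 340, None, 55],
    "category": ["toys", "books", "toys", "garden"],
})

# Fill missing numeric values with each column's median.
numeric_cols = ["price", "units_sold"]
raw[numeric_cols] = raw[numeric_cols].fillna(raw[numeric_cols].median())

# One-hot encode the categorical feature so a model can consume it.
features = pd.get_dummies(raw, columns=["category"])

# Standardize numeric features so no single column dominates training.
features[numeric_cols] = StandardScaler().fit_transform(features[numeric_cols])

print(features)
```

Even a toy pipeline like this reflects the judgment calls the stage demands: which values to impute, which features to encode, and how to represent them for the model.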
Kartik shares an impactful project he spearheaded at one of the world's largest online retailers and leading cloud service providers, where he was a key player in developing a state-of-the-art ML system that serves millions of users worldwide. When Kartik joined the company, his expertise centered on scalable systems and backend infrastructure. Nevertheless, he and three other engineers took on the challenge of developing an end-to-end ML system for optimizing product pricing, a project that required building a solution from the ground up to handle the massive scale and complexity of a global e-commerce platform.
Kartik assumed a multifaceted role in this venture. Responsible for designing and implementing the real-time ML inference system as well as defining the project's success metrics, he ensured that the model's predictions could scale efficiently to millions of products. The work was exceptionally complex because it involved extensive research and comparison of technologies to determine the best approach for the team's specific needs.
In other words, achieving excellent outcomes demands a comprehensive understanding of both the training and inference sides of an ML system. Kartik, deeply passionate about software engineering since he was 16 and about ML since his undergraduate years, was able to navigate the complexities of offline training and real-time inference. This dual focus ensured the system could handle live data inputs and produce accurate pricing updates that significantly improved revenue and profit margins for sellers.
"We experienced a lot of challenges," Kartik shares. "Personally, the most difficult part was making the inference system scalable. It's because this involves setting up infrastructure that could adjust to the volume of product queries. Our goal was to create a system that could manage the ML computational load efficiently and remain cost-effective."
Kartik, along with the team, delivered a minimum viable product (MVP) within three to four months, and he acknowledges the project's significant impact on the organization's ability to handle dynamic pricing. "It was a significant milestone within the organization because it is its first major system overhaul in several years," he states. Essentially, the implementation of a true ML pricing system, leveraging cutting-edge technologies, represented a shift away from previous rule-based approaches.
Kartik recalls experiencing mixed emotions after completing the project. He felt a sense of accomplishment and pride in breaking into the ML field and collaborating with top-tier professionals. At the same time, he became personally invested in its success. He, therefore, remained actively involved in discussions and decision-making even after handing it over to the next generation of developers.
It's not an exaggeration to say that this project was a defining moment in Kartik's career. After all, it marked his transition from a software engineer to a full-fledged machine learning engineer. Besides serving as a springboard for his professional growth, the venture refueled his passion for ML and modeling.
After the model's successful launch, Kartik worked on search advertising systems within the company. Collaborating with applied scientists helped him gain insights into model selection and fine-tuning, and he enhanced the offline training infrastructure for the team. He then set up an efficient workflow for training models, streamlining the process from development to deployment.
Kartik, determined to build on his growing expertise in the search domain, sought to expand his horizons and take on more modeling responsibilities. This motivation led him to work at one of the largest American information technology companies, where he's responsible for the end-to-end search ads workflow.
The numerous challenges and opportunities in search technology have only deepened Kartik's passion for innovation and improvement. He remains excited about the possibilities ahead, recognizing the immense potential for growth and innovation in the field with the advent of large language models and the latest developments in natural language processing.