
259 Data Pipelines Jobs - Page 2

JobPe aggregates job listings for easy access, but applications are submitted directly on the original job portal.

7.0 - 11.0 years

0 Lacs

Karnataka

On-site

As the Lead AI Strategist & Architect at DivyaSree's AI Division, you will drive early-stage AI vision and system design to establish a foundation for scalable AI adoption. The AI Division at DivyaSree is a dynamic startup within the organization, focused on reshaping real estate, education, and hospitality through AI innovation. You will play a pivotal role in defining the high-level AI roadmap, architecting secure and scalable AI systems, and leading the development of AI solutions for internal workflows and customer-facing tools. Your responsibilities will include setting up data pipelines, cloud infrastructure, and MLOps practices while also focusing on research, IP creation, team building, and ecosystem partnerships. To excel in this role, you should have at least 7 years of experience in developing and deploying AI/ML solutions in production environments. Proficiency in multiple AI domains such as NLP, computer vision, and generative AI is essential, along with a strong software engineering background and expertise in Python and relevant AI frameworks. Experience with cloud platforms and MLOps practices is also required. Preferred qualifications include an advanced degree in AI, Machine Learning, or a related field, hands-on experience with generative AI, LLMs, and agent-based systems, as well as contributions to open-source projects or research publications in AI-related fields. Knowledge of data privacy, security, and ethical considerations in AI implementation is also beneficial. In return, DivyaSree offers a unique opportunity to lead the AI vision from day zero, influence both business and tech decisions, work on cutting-edge projects, receive full support from leadership, and have the chance to make a significant impact on the AI ecosystem in India. Join us in building a bold future in AI innovation and be part of a team committed to creating lasting change through technology.,

Posted 3 days ago

Apply

3.0 - 7.0 years

0 Lacs

Karnataka

On-site

At PwC, our data and analytics engineering team is dedicated to using advanced technologies and techniques to create robust data solutions for our clients. Your role will involve transforming raw data into valuable insights, facilitating informed decision-making, and fostering business growth. As part of the intelligent automation team at PwC, you will focus on process mining, designing automation solutions, and implementing various automation technologies to enhance operational efficiency and reduce costs for our clients. You will be responsible for building meaningful client relationships, managing and motivating others, and expanding your technical expertise. As you navigate complex situations, you will develop your personal brand, deepen your knowledge, and enhance your strengths. Anticipating the needs of your team and clients, delivering high-quality work, and embracing ambiguity will be key aspects of your role. The ideal candidate for the position of GenAI Data Engineer - Senior Associate at PwC US Acceleration Center will have a strong background in data engineering, with a specific focus on GenAI technologies. You will be involved in developing and maintaining data pipelines, implementing machine learning models, and optimizing data infrastructure for GenAI projects. Collaboration with cross-functional teams, staying updated on advancements in GenAI technologies, and recommending innovative solutions will be central to your responsibilities.

**Responsibilities:**

- Design, develop, and maintain data pipelines and ETL processes for GenAI projects.
- Collaborate with data scientists and software engineers to implement machine learning models.
- Optimize data infrastructure and storage solutions for efficient data processing.
- Implement event-driven architectures for real-time data processing.
- Utilize containerization technologies like Kubernetes and Docker.
- Develop and maintain data lakes for storing structured and unstructured data.
- Implement LLM frameworks for advanced language processing.
- Collaborate with cross-functional teams to design solution architectures for GenAI projects.
- Utilize cloud computing platforms for data processing, storage, and deployment.
- Monitor and troubleshoot data pipelines and systems to ensure smooth data flow.
- Stay updated on the latest GenAI technologies and recommend innovative solutions.
- Document data engineering processes, methodologies, and best practices.

**Requirements:**

- Proficiency in Python with a minimum of 3 years of hands-on experience.
- Solid understanding of scalable system design for GenAI use cases.
- Familiarity with Python web frameworks like Flask and FastAPI.
- Ability to design applications with modularity, reusability, and security best practices.
- Experience with cloud-native development patterns and tools.
- Knowledge of deploying containerized applications on cloud platforms.
- Strong proficiency in Git for effective code collaboration.
- Experience with data processing frameworks like Apache Spark.
- Proficiency in SQL and database management systems.
- Experience in ensuring effective Agile practices using Azure DevOps.
- Strong programming skills and technical experience, with a focus on GenAI projects.

**Preferred Skills:**

- Experience with LLM frameworks such as LangChain and Semantic Kernel.
- Experience in setting up data pipelines for model training and real-time inference.

If you are enthusiastic about GenAI technologies and have a proven track record in data engineering, consider joining PwC US Acceleration Center. You will have the opportunity to be part of a dynamic team shaping the future of GenAI solutions in a collaborative and innovative work environment.
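For illustration only, the snippet below sketches the kind of PySpark ETL step such a pipeline role typically involves: read raw documents, apply a simple cleaning transform, and write a curated table. The bucket paths and column names are hypothetical and are not taken from the posting.

```python
# Minimal PySpark ETL sketch: raw -> cleaned -> curated (illustrative only).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("genai-etl-sketch").getOrCreate()

# Hypothetical source: raw documents landed as Parquet.
raw = spark.read.parquet("s3://example-bucket/raw/documents/")

cleaned = (
    raw
    .dropDuplicates(["doc_id"])                       # remove duplicate documents
    .filter(F.col("body").isNotNull())                # drop empty records
    .withColumn("body", F.trim(F.col("body")))        # normalize whitespace
    .withColumn("ingested_at", F.current_timestamp()) # audit column
)

# Hypothetical curated zone consumed by downstream GenAI feature jobs.
cleaned.write.mode("overwrite").parquet("s3://example-bucket/curated/documents/")
```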

Posted 3 days ago

Apply

1.0 - 3.0 years

0 Lacs

India

On-site

Qualification: Education: Bachelor's degree in any field. Experience: Minimum 1.5-2 years of experience in data engineering support or a related role, with hands-on exposure to AWS. Technical Skills: Strong understanding of AWS services, including but not limited to S3, EC2, CloudWatch, and IAM. Proficiency in SQL, with the ability to write, optimize, and debug queries for data analysis and issue resolution. Hands-on experience with Python for scripting and automation; familiarity with Shell scripting is a plus. Good understanding of ETL processes and data pipelines. Exposure to data warehousing concepts; experience with Amazon Redshift or similar platforms preferred. Working knowledge of orchestration tools, especially Apache Airflow, including monitoring and basic troubleshooting. Soft Skills: Strong communication and interpersonal skills for effective collaboration with cross-functional and multicultural teams. A problem-solving attitude with an eagerness to learn and adapt quickly. Willingness to work in a 24x7 support environment on a 6-day working schedule, with rotational shifts as required. Language Requirements: Must be able to read and write English proficiently.
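As a purely illustrative sketch of the Airflow work this support role mentions, here is a minimal DAG with retries; the DAG id, schedule, and validation logic are hypothetical, and the Airflow 2.4+ `schedule` argument is assumed.

```python
# Minimal Airflow DAG sketch (illustrative; names and schedule are hypothetical).
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def check_daily_load():
    # Placeholder for real validation logic (e.g., row counts in Redshift).
    print("validating yesterday's load")


with DAG(
    dag_id="daily_load_check",
    start_date=datetime(2024, 1, 1),
    schedule="0 6 * * *",          # run every day at 06:00
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=10)},
) as dag:
    PythonOperator(task_id="validate_load", python_callable=check_daily_load)
```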

Posted 3 days ago

Apply

3.0 - 5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Experience: 3-5 years. We are looking for a skilled backend developer with strong experience in Java, Spring Boot, and Apache Spark. You will be responsible for building scalable microservices and processing large datasets in real-time or batch environments. A solid understanding of REST APIs, distributed systems, and data pipelines is required. Experience with cloud platforms (AWS/GCP) is a plus.

Posted 3 days ago

Apply

10.0 - 12.0 years

0 Lacs

Gurugram, Haryana, India

On-site

We're seeking a visionary Enterprise Architect to join our CTO Office and shape cross-portfolio solutions at the intersection of AI, Customer Experience (CX), Cybersecurity, and Digital Skilling technologies. You'll architect scalable, standardized solutions for global clients, govern complex deals, and collaborate with diverse stakeholders to translate business needs into future-ready technical strategies. As a trusted advisor, you will evangelize solution value, articulating how the right technology mix enables our customers to achieve strategic outcomes. At TeKnowledge, your work makes an impact from day one. We partner with organizations to deliver AI-First Expert Technology Services that drive meaningful impact in AI, Customer Experience, and Cybersecurity. We turn complexity into clarity and potential into progress, in a place where people lead and tech empowers. You'll be part of a diverse and inclusive team where trust, teamwork, and shared success fuel everything we do. We push boundaries, using advanced technologies to solve complex challenges for clients around the world. Here, your work drives real change, and your ideas help shape the future of technology. We invest in you with top-tier training, mentorship, and career development, ensuring you stay ahead in an ever-evolving world. Why You'll Enjoy It Here: Be Part of Something Big: a growing company where your contributions matter. Make an Immediate Impact: support groundbreaking technologies with real-world results. Work on Cutting-Edge Tech: AI, cybersecurity, and next-gen digital solutions. Thrive in an Inclusive Team: a culture built on trust, collaboration, and respect. We Care: integrity, empathy, and purpose guide every decision. We're looking for innovators, problem-solvers, and experts ready to drive change and grow with us. We Are TeKnowledge. Where People Lead and Tech Empowers. Responsibilities: Design enterprise-grade architectures integrating structured/unstructured data, analytics, and advanced AI models (GenAI, LLMs, cognitive services). Build scalable data pipelines and lake-centric architectures to power real-time analytics and machine learning. Architect multi-cloud AI/ML platforms using Azure, including deployment of LLMs (Azure OpenAI and open-source models like LLaMA, Mistral, Falcon). Define infrastructure, data, and app requirements to deploy LLMs in customer private data centers. Lead technical reviews for high-value deals, identifying risks and mitigation strategies. Design integrated solutions across AI, CX, Cybersecurity, and Tech Managed Services portfolios. Develop standard design patterns and reusable blueprints for repeatable, low-risk, and scalable solution delivery. Present architectural solutions to C-suite executives, aligning technical outcomes with business value and ROI. Collaborate with sales and pre-sales to scope complex opportunities and develop compelling proposals. Foster innovation across CTO, Sales, and Solution teams. Identify synergy across offerings (e.g., Microsoft Copilot + AI-first CX + Cybersecurity). Support product teams with market feedback and solution evolution. Define architectural best practices ensuring security, compliance, and scalability. Mentor delivery teams on frameworks and emerging tech adoption. Shape and execute the enterprise architecture strategy aligned with business goals. Champion digital transformation and technology innovation. Leverage expertise in Azure and Microsoft D365 to support solution architecture.
Drive responsible AI adoption and ensure awareness of privacy, bias, and security in deployments. Ensure all solutions meet IT security and compliance standards. Collaborate with Legal and Procurement for contract negotiations and vendor performance. Lead, mentor, and build a high-performing, collaborative CTO team with a customer-first mindset. Qualifications: Education: Bachelor's or Master's degree in Computer Science, Information Technology, Cybersecurity, or a related field. Experience: 10+ years in enterprise architecture, with 5+ years in customer-facing roles. Certifications: TOGAF, Zachman, ITIL, CISSP, Azure certifications, or equivalents preferred. Proven experience architecting and delivering AI/ML platforms, data lakes, and intelligent applications at enterprise scale. Demonstrable experience deploying local LLMs in production environments, including integration with LangChain, databases, and private storage. Strong knowledge of enterprise architecture frameworks and multi-cloud platforms (with a focus on Azure). Ability to design and deliver end-to-end solutions including networks (voice and data), microservices, business applications, resilience, disaster recovery, and security. Understanding of on-prem / private cloud workload migration to public or hybrid cloud environments. Commercial acumen with the ability to articulate the business value of cloud-based solutions to executive stakeholders. Strong problem-solving and critical thinking skills with a proactive, outcome-oriented mindset. Experience with cloud computing, data center technologies, virtualization, and enterprise-grade security policies and processes. Proficiency in AI/ML, cybersecurity frameworks, customer experience platforms, and Microsoft Cloud (Azure, M365, D365). Exceptional communication and storytelling abilities for both technical and non-technical audiences. Experience engaging with large enterprise clients across industries such as government, healthcare, banking & finance, travel, and manufacturing. Empowering Leadership and Innovation: At TeKnowledge, we are committed to fostering a culture of inspiring leadership and innovation. Our core leadership competencies are integral to our success: Inspire: We prioritize creating an inclusive environment, leading with purpose, and acting with integrity and respect. Build: Our leaders own business growth, drive innovation, and continuously strive for excellence. Deliver: We focus on setting clear priorities, embracing agility and change, and fostering collaboration for growth. We are looking for talented individuals who embody these competencies, are ready to grow, and are eager to contribute to our dynamic team. If you are passionate about making a meaningful impact and excel in a collaborative, forward-thinking environment, we invite you to apply and help us shape the future.

Posted 3 days ago

Apply

0.0 years

0 Lacs

Thane, Maharashtra, India

On-site

Company Description: Quantanite is a business process outsourcing (BPO) and customer experience (CX) solutions company that helps fast-growing companies and leading global brands to transform and grow. We do this through a collaborative and consultative approach, rethinking business processes and ensuring our clients employ the optimal mix of automation and human intelligence. We're an ambitious team of professionals spread across four continents, looking to disrupt our industry by delivering seamless customer experiences for our clients, backed up with exceptional results. We have big dreams and are constantly looking for new colleagues to join us who share our values, passion, and appreciation for diversity. Job Description: About the Role: We are seeking an AI Engineer to join our team and play a critical role in the design and development of a cognitive data solution. The broader vision is to develop an AI-based platform that will crawl through unstructured data sources and extract meaningful information. The ideal candidate will possess full-stack development skills along with a strong understanding of database structures, SQL queries, ETL tools, and Azure data technologies. Core Responsibilities: Lead the design, development, and implementation of AI algorithms and models to enhance payment and CX product offerings. Test agent interactions, document behaviors, and help improve reliability and performance. Assist with agent integration, system testing, and deployment. Develop and deploy AI models (including deep learning models) using TensorFlow, PyTorch, and other ML/DL frameworks. Design and maintain the full AI data pipeline, including data crawling, ETL, and building fact tables. Apply statistical and programming expertise (NumPy, Pandas, etc.) for data analysis and modeling. Optimize AI models for on-premise infrastructure with a focus on performance and security compliance. Stay abreast of the latest trends in AI and contribute to continuous improvements. Mentor junior AI/ML engineers and contribute to documentation and process standardization. Contribute to the broader AI agent ecosystem and cross-functional collaboration. Required Experience: Strong Python programming skills and experience with full-stack development. Experience with large language models (LLMs) like ChatGPT, Claude, etc. 6 months to 1 year of experience in Claims and Denial Management within the medical healthcare industry in India. Solid understanding of AI/ML concepts including NLP, ML algorithms, deep learning, and image/speech recognition. Experience with Azure Data Factory, Databricks/Spark, Synapse/SQL DW, and Data Lake Storage. Familiarity with data pipelines, APIs, data exchange mechanisms, and RDBMS/NoSQL databases. Ability to write clear and thorough documentation. Enthusiasm for emerging AI technologies and open-source collaboration. Nice To Have: Experience with agent frameworks (LangChain, AutoGPT, CrewAI, etc.). Basic DevOps knowledge and experience in scalable deployments. Experience with API development and RESTful services. Familiarity with vector databases (e.g., FAISS, ChromaDB, Pinecone, Weaviate). Prior contributions to open-source AI projects or communities. Additional Information: Benefits: At Quantanite, we ask a lot of our associates, which is why we give so much in return. In addition to your compensation, our perks include: Dress: Wear anything you like to the office. We want you to feel as comfortable as when working from home.
Employee Engagement: Experience our family community and embrace our culture where we bring people together to laugh and celebrate our achievements. Professional development: We love giving back and ensure you have opportunities to grow with us and even travel on occasion. Events: Regular team and organisation-wide get-togethers and events. Value orientation: Everything we do at Quantanite is informed by our Purpose and Values. We Build Better. Together. Future development: At Quantanite, you'll have a personal development plan to help you improve in the areas you're looking to develop over the coming years. Your manager will dedicate time and resources to supporting you in getting to the next level. You'll also have the opportunity to progress internally. As a fast-growing organization, our teams are growing, and you'll have the chance to take on more responsibility over time. So, if you're looking for a career full of purpose and potential, we'd love to hear from you!

Posted 3 days ago

Apply

3.0 - 7.0 years

0 Lacs

Karnataka

On-site

You will be joining our Data & AI practice as an Individual Contributor and Functional AI Decision Science Consultant at Level 9, with the opportunity to work in Gurgaon, Bangalore, Mumbai, or Hyderabad. As part of a global team of skilled professionals, you will utilize cutting-edge statistical tools and methodologies to deliver data-driven insights and solutions for our clients. Your primary responsibilities will include defining data requirements for Data Driven Growth Analytics, analyzing and interpreting data, and ensuring data quality. You will work on market sizing, lift ratio estimation, non-linear optimization techniques, statistical time-series models, store clustering algorithms, and descriptive analytics to support merchandising AI capabilities. Additionally, you will develop AI/ML models using the Azure ML tech stack, manage data pipelines, and leverage cloud platforms for deploying and scaling machine learning models. In this role, you will also be expected to manage client relationships, effectively communicate insights and recommendations, and contribute to capability building and thought leadership within the team. Your logical thinking and task management skills will be crucial to analyze data systematically and prioritize tasks efficiently. To be successful in this position, you should have a minimum of 4+ years of experience in marketing analytics and 3+ years of experience in Data Driven Merchandising. Proficiency in econometric/statistical modeling, Azure ML, SQL, R, Python, and PySpark is essential, along with a strong understanding of marketing data and business processes in the Retail and CPG industries. A Bachelor's or Master's degree in Statistics, Economics, Mathematics, Computer Science, or related fields is required, along with excellent academic credentials. Additionally, knowledge of tools like Excel, Word, and PowerPoint for communication and documentation will be beneficial in this role. If you are seeking a challenging opportunity to work with a dynamic team of professionals and make a significant impact in the field of Data & AI, we invite you to apply and join our team at Accenture.
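As an illustration of the store clustering mentioned above, here is a minimal scikit-learn sketch; the per-store features and values are invented for demonstration and are not from the posting.

```python
# Illustrative store-clustering sketch with scikit-learn (feature values are hypothetical).
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Hypothetical per-store features: [weekly_sales, avg_basket_size, footfall]
stores = np.array([
    [120_000, 18.5, 4_200],
    [ 95_000, 22.1, 3_100],
    [210_000, 15.0, 7_800],
    [ 88_000, 19.9, 2_900],
    [198_000, 16.2, 7_200],
])

scaled = StandardScaler().fit_transform(stores)          # put features on one scale
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(scaled)
print(labels)  # cluster assignment per store, e.g. [0 0 1 0 1]
```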

Posted 4 days ago

Apply

2.0 - 8.0 years

0 Lacs

Karnataka

On-site

As a Data Analyst with 2 to 8 years of experience, you will be based in Bangalore and will have the following responsibilities:

- Demonstrating strong proficiency in SQL, Python, and Excel, along with the ability to adapt and learn other analytics tools as required.
- Building and optimizing data pipelines on popular cloud platforms.
- Automating data extraction and insertion into Management Information Systems (MIS) using Python.
- Extracting actionable insights from analysis results.
- Possessing attention to detail, a strong data orientation, and a commitment to creating clean and reproducible code.
- Showing a genuine passion for collaboration, along with exceptional interpersonal and communication skills.

Good to have skills:

- Experience with tools like Google BigQuery, Metabase, CleverTap, Google Data Studio, and Firebase/Google Analytics.
- Proficiency in visualization tools such as Google Data Studio, Tableau, etc.
- Hands-on experience with ETL/ELT pipelines and data flow setups.
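For illustration, the automation point above might look like the following pandas/SQLAlchemy sketch; the connection strings, table names, and the Postgres-specific SQL are hypothetical.

```python
# Illustrative extract-transform-load sketch with pandas and SQLAlchemy
# (connection strings and table names are hypothetical).
import pandas as pd
from sqlalchemy import create_engine

source = create_engine("postgresql://user:pass@source-db/analytics")
target = create_engine("postgresql://user:pass@mis-db/reporting")

# Extract yesterday's orders from the source system.
orders = pd.read_sql(
    "SELECT order_id, amount, created_at FROM orders "
    "WHERE created_at::date = CURRENT_DATE - 1",
    source,
    parse_dates=["created_at"],
)

# Transform: simple daily aggregate for the MIS report.
daily = (orders.assign(order_date=orders["created_at"].dt.date)
               .groupby("order_date", as_index=False)["amount"].sum())

# Load into the MIS table, appending one row per day.
daily.to_sql("daily_sales", target, if_exists="append", index=False)
```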

Posted 4 days ago

Apply

4.0 - 8.0 years

0 Lacs

Ahmedabad, Gujarat

On-site

As a Python AI/ML Developer with 4-5 years of experience, your main responsibility will be to design, develop, and maintain Python applications while ensuring code quality, efficiency, and scalability. You will collaborate with cross-functional teams to understand project requirements and deliver solutions that align with business objectives. Implementing AI/ML algorithms and models to solve complex problems and extract valuable insights from data will be a key part of your role. Additionally, you will be developing and maintaining RESTful APIs to integrate Python applications with other systems. It is essential to stay updated with the latest trends and technologies in Python development and AI/ML. To excel in this role, you should have a strong proficiency in Python programming, including object-oriented programming and design patterns. Experience with popular Python libraries and frameworks such as NumPy, Pandas, Scikit-learn, TensorFlow, and PyTorch is required. Knowledge of AI/ML concepts, algorithms, and techniques, including supervised and unsupervised learning, is crucial. Experience working with data pipelines and ETL processes is beneficial, and hands-on experience with chatbot applications is necessary. Excellent problem-solving and analytical skills are essential, along with the ability to work independently and as part of a team. Strong communication and documentation skills are also important. Preferred qualifications include experience with cloud platforms such as AWS, GCP, or Azure, knowledge of natural language processing (NLP) or computer vision, experience with machine learning deployment and operationalization, and contributions to open-source Python projects. Stay updated with the latest advancements in technology to enhance your skills and contribute effectively to the development of innovative solutions.,
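As a minimal illustration of the supervised-learning work described above, here is a short scikit-learn sketch; the dataset and model choice are arbitrary examples, not requirements from the posting.

```python
# Minimal supervised-learning sketch with scikit-learn (illustrative only).
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# Scaling + classifier in one pipeline keeps preprocessing consistent at predict time.
model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
model.fit(X_train, y_train)
print(f"test accuracy: {model.score(X_test, y_test):.2f}")
```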

Posted 4 days ago

Apply

4.0 - 8.0 years

0 Lacs

Karnataka

On-site

As a Senior Platform Engineer at Kenvue Data Platforms, you will have an exciting opportunity to be part of our growing Data & Analytics product line team. Your role involves collaborating closely with various teams such as Business partners, Product Owners, Data Strategy, Data Platform, Data Science, and Machine Learning (MLOps) to drive innovative data products for end users. You will play a key role in shaping the overall solution and data platforms, ensuring their stability, responsiveness, and alignment with business and cloud computing needs. Your expertise will be crucial in optimizing business outcomes and contributing to the growth and success of the organization. Your responsibilities will include providing leadership for data platforms in partnership with architecture teams, conducting proof of concepts to deliver secure and scalable platforms, staying updated on emerging technologies, mentoring other platform engineers, and focusing on the execution and delivery of reliable data platforms. You will work closely with Business Analytics leaders to understand business needs and create value through technology. Additionally, you will lead data platforms operations, build next-generation data and analytics capabilities, and drive the adoption and scaling of data products within the organization. To be successful in this role, you should have an undergraduate degree in Technology, Computer Science, applied data sciences, or related fields, with an advanced degree being preferred. You should possess strong analytical skills, effective communication abilities, and a proven track record in developing and maintaining data platforms. Experience with cloud platforms such as Azure, GCP, AWS, cloud-based databases, data streaming platforms, and Agile methodology will be essential. Your ability to define platforms tech stack, prioritize work items, and work effectively in a diverse and inclusive company culture will be critical to your success in this role. If you are passionate about leveraging data and technology to drive business growth, make a positive impact on personal health, and shape the future of data platforms, then this role at Kenvue Data Platforms is the perfect opportunity for you. Join us in our mission to empower millions of people every day through insights, innovation, and care. We look forward to welcoming you to our team! Location: Asia Pacific-India-Karnataka-Bangalore Function: Digital Product Development Qualifications: - Undergraduate degree in Technology, Computer Science, applied data sciences or related fields; advanced degree preferred - Strong interpersonal and communication skills, ability to explain digital concepts to business leaders and vice versa - 4 years of data platforms experience in Consumer/Healthcare Goods companies - 6 years of progressive experience in developing and maintaining data platforms - Minimum 5 years hands-on experience with Cloud Platforms and cloud-based databases - Experience with data streaming platforms, microservices, and data integration - Proficiency in Agile methodology within DevSecOps model - Ability to define platforms tech stack to address data challenges - Proven track record of delivering high-profile projects within defined resources - Commitment to diversity, inclusion, and equal opportunity employment,

Posted 4 days ago

Apply

5.0 - 9.0 years

0 Lacs

Hyderabad, Telangana

On-site

As an Engineering Manager - Analytics & Data Engineering at Highspot, you will be responsible for leading and nurturing a team of data analysts, data scientists, and data engineers. Your primary objective will be to drive the development of cutting-edge analytics capabilities within our B2B SaaS product, while also maintaining a company-wide data warehouse to facilitate data-driven decision-making. By leveraging your expertise in statistical analysis, machine learning techniques, and business acumen, you will uncover valuable insights from our data, enabling impactful decisions and fueling our growth. Your role will involve guiding strategic decisions related to data systems, analytics capabilities, team operations, and engineering culture. We are looking for a candidate who is passionate about team building and scaling, values-driven, committed to fostering a positive culture, self-directed, inquisitive, and resourceful. Your key responsibilities will include: - Leading a team of data analysts, data scientists, and data engineers, motivating them to deliver their best work and providing hands-on support when needed. - Analyzing core business topics using Highspot's product and business data to derive insights that drive product development and enhance platform effectiveness. - Applying statistical analysis, machine learning, and operations research techniques to develop solutions that drive impactful business outcomes such as operational efficiency improvements, customer churn rate reduction, and resource allocation optimization. - Driving the team's data & analytics strategy, technical roadmap, and data storage solutions. - Defining top-level business, team, and product metrics, and creating automated reports/dashboards to support strategic decision-making. - Developing and maintaining scalable end-to-end data pipelines and data warehouse systems that are essential for various teams across the company and ensure compliance with global data protection requirements. - Leading the development of custom scorecards and visualizations in the product to provide actionable insights to customers. - Contributing your technical expertise to the evolution of Highspot's software architecture and stack to meet the demands of hyper-growth and ensure high-availability and reliability across multiple data centers. - Collaborating with key partners and stakeholders to deliver high-impact customer value and promote effective communication within and outside the team. To be considered for this role, you should possess: - A Bachelor's or Master's degree in Computer Science, Engineering, or a related field. - 5+ years of experience in designing and building scalable, high-quality customer-facing software. - 5+ years of experience in advanced analytics and cloud data engineering. - Proficiency in statistical analysis, data science models, data pipelines, and deriving actionable insights from complex datasets. - Strong skills in SQL, Python, object-oriented programming, and web technologies. - Experience in presenting to C-level executives and collaborating with various business functions. - A track record of fostering a high-performing team and promoting a positive work culture. - An entrepreneurial spirit and a commitment to delivering high-quality results. At Highspot, we are committed to diversity and inclusion. If this role aligns with your skills and interests, we encourage you to apply, even if you do not meet all the requirements listed above.,

Posted 4 days ago

Apply

5.0 - 9.0 years

0 Lacs

Maharashtra

On-site

As a Data Warehouse (DWH) professional with relevant experience in Google Cloud Platform (GCP), you will be responsible for developing and implementing robust data architectures. This includes designing data lakes, data warehouses, and data marts using GCP services such as BigQuery, Dataflow, Dataproc, and Cloud Storage. Your role will involve designing and implementing data models that meet business requirements while ensuring data integrity, consistency, and accessibility. Your deep understanding of GCP services and best practices for data warehousing, data analytics, and machine learning will be crucial in this role. You will also be tasked with planning and executing data migration strategies from on-premises or other cloud environments to GCP. Optimizing data pipelines and query performance to facilitate efficient data processing and analysis will be a key focus area. Additionally, your proven experience in managing teams and project delivery will be essential for success in this position. Collaborating closely with stakeholders to understand their requirements and deliver effective solutions will be a significant part of your responsibilities. Any experience with Looker will be considered advantageous.
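As a brief illustration of working with BigQuery from Python, here is a sketch using the google-cloud-bigquery client; the project, dataset, and table names are hypothetical.

```python
# Illustrative BigQuery sketch using the google-cloud-bigquery client
# (project, dataset, and table names are hypothetical).
from google.cloud import bigquery

client = bigquery.Client(project="example-project")

sql = """
    SELECT store_id, SUM(sales_amount) AS total_sales
    FROM `example-project.warehouse.fact_sales`
    WHERE sale_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 7 DAY)
    GROUP BY store_id
    ORDER BY total_sales DESC
    LIMIT 10
"""

for row in client.query(sql).result():   # runs the job and waits for completion
    print(row["store_id"], row["total_sales"])
```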

Posted 4 days ago

Apply

0.0 - 4.0 years

0 Lacs

Haryana

On-site

As an IoT & AI Intern at Bharat Ecolabs, you will have the opportunity to support the development and integration of sensor networks, data pipelines, and AI models for our smart, modular Packaged Sewage Treatment Plants (STPs). Working out of the NASSCOM Center of Excellence for IoT & AI in Gurugram, you will collaborate closely with our core engineering team. Your key responsibilities will include assisting in the design and development of IoT-enabled sensor and control systems for STPs, building and testing data acquisition modules using sensors for various parameters, developing cloud-based or edge data pipelines for real-time monitoring, and supporting the development of AI/ML models for fault detection and efficiency optimization. Additionally, you will contribute to the creation of dashboards or mobile interfaces for STP data visualization and alerts, as well as conduct testing and debugging of IoT prototypes on pilot installations. The ideal candidate for this role would be a final year student or recent graduate in Electronics, Computer Science, AI, Mechatronics, or related fields with hands-on experience in microcontrollers like Arduino, ESP32, or Raspberry Pi. You should also have familiarity with sensor integration, basic electronics, and communication protocols such as MQTT, Modbus, or LoRa. Proficiency in Python, basic ML libraries like scikit-learn or TensorFlow, and cloud platforms like AWS, Azure, or GCP is desired. Strong problem-solving skills and a passion for applying technology in real-world clean tech solutions are essential for success in this role. Joining us as an IoT & AI Intern at Bharat Ecolabs will provide you with real-world exposure to applied IoT and AI in environmental engineering, mentorship from a passionate founding team, and access to the NASSCOM CoE ecosystem. Upon successful completion of the internship, you will receive a formal Experience Certificate, and there is a possibility to convert the internship into a full-time role based on performance and mutual interest.,
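For illustration, the fault-detection idea above could start as simple anomaly detection on sensor readings; the parameter names and values below are invented for demonstration.

```python
# Illustrative anomaly-detection sketch for STP sensor readings
# (feature names and values are made up for demonstration).
import numpy as np
from sklearn.ensemble import IsolationForest

# Hypothetical readings: [dissolved_oxygen_mg_l, ph, turbidity_ntu]
readings = np.array([
    [5.8, 7.1, 12.0],
    [6.0, 7.0, 11.5],
    [5.9, 7.2, 12.3],
    [1.2, 9.4, 55.0],   # a clearly abnormal sample
    [6.1, 7.1, 11.9],
])

model = IsolationForest(contamination=0.2, random_state=0).fit(readings)
print(model.predict(readings))   # -1 flags anomalies, 1 marks normal samples
```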

Posted 4 days ago

Apply

6.0 - 10.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Instaworks vision is to create economic opportunities for local businesses and skilled hourly workers in communities around the world. With an AI-first approach, were supercharging the leading online labor marketplace and looking for exceptional talent to help us build the future of hourly work. Backed by world-class investors like Benchmark, Spark Capital, Craft Ventures, Greylock, Y Combinator, and more, we want you to help us continue to scale quickly and make an even greater impact. Instaworks mission is to create economic opportunities for businesses and local professionals globally. We believe AI and machine learning will be key factors across our products and company in accelerating our mission. We need an innovative Senior Machine Learning Engineer to join our growing team. While we have a team in the US, this role will be the founding ML engineer in APAC and thus can help set the direction and vision for ML in the region. Who You Are: 6-10 years of experience building machine learning models for business applications Entrepreneurial mindset with a mix of startup and large scale experience, bonus points for being a co-founder Advanced statistical techniques and concepts (regression, properties of distributions, statistical tests and proper usage, or similar) and experience with applications Deep understanding of machine learning techniques (clustering, decision tree learning, artificial neural networks, or similar) and their real-world advantages/drawbacks Strong communication and presentation skills with a natural executive presence Expert level fluency using languages like Python & SQL to manipulate data and draw insights from large data sets Working knowledge of common web/mobile architectures, data pipelines, CI/CD processes, etc. Strong understanding of DBT or other similar data transformation/organization methodologies Experience with BI tools (e.g. Mode, Tableau, Looker) B.Tech/ B.E or BS/B.Sc in Math, Statistics, Physics, Computer Science, or other quantitative fields. What You&aposll Do: Design, develop, and implement machine learning algorithms to solve complex problems in real-time scheduling and workforce optimization Initially, youll work closely with a US-based team for 1-2 quarters to build knowledge before immersing more in APAC-focused projects. Flexibility to work some mornings or evenings will be necessary during this period. Impact key company objectives by working closely with Product to refine our roadmap Build/maintain reports, dashboards, and metrics to monitor the performance of our products Gather, clean, and prepare large datasets for machine-learning models Build and maintain machine learning pipelines that are robust, scalable, and efficient Collaborate with cross-functional teams to integrate machine learning solutions into production systems Monitor and evaluate the performance of machine learning models and make continuous improvements Our Values Empathy, Trust & Candor We put ourselves in the shoes of our colleagues and customers and dont shy away from uncomfortable conversations, instead building trust through honest and direct feedback. Bias for Action We practice high-velocity decision-making, clear-eyed that we often operate with incomplete information. Growing quickly means its OK to be wrong, so long as we learn from our mistakes and course correct! 
Always Be Learning Were a curious bunch, and with AI transforming our workplace we encourage everyone to learn from each other, compounding our knowledge and experience to help us change an entire industry. Act Like an Owner We work long, hard, and smart, building products that delight our users and drive growth. Your ability to impact Instawork is limited only by your courage and conviction, not your job description. About Instawork Founded in 2015, Instawork is the nations leading online labor marketplace for food services, hospitality, light industry, and logistics, connecting more than 7M skilled workers with local restaurants, hotels, warehouses, stadiums, and more. Our AI-powered platform serves thousands of businesses across more than 50 major markets in the United States and Canada. We&aposre not just helping fill shifts, we&aposre supporting local economiesand we&aposre just getting started! Instawork has been featured by CBS News, The Wall Street Journal, The Washington Post, and the Associated Press. Forbes included us on their Next Billion Dollar Startups list; RetailTech Breakthrough named us Workforce Hiring Solution of the Year for 2025; and Inc. 5000 recognized us as one of the country&aposs top 10% fastest-growing companies two years in a row. But what matters most is our impact. We&aposre solving real problems for real people, and were doing it at scale. Join our team to help us build something that matters! Were looking for superstars who want to help us shape the future of work. With hubs in San Francisco, Bangalore, and Chicago, city offices in New York, Phoenix, and Singapore, we&aposre back to working together in-person five days a week because we believe the best ideas happen when great people collaborate face-to-face. We also value diverse perspectives and encourage applications from candidates of all backgrounds. Ready to make an impact Learn more at www.instawork.com/about. Personnel Privacy Policy Show more Show less

Posted 4 days ago

Apply

6.0 - 10.0 years

3 - 15 Lacs

Hyderabad, Telangana, India

On-site

Job Summary: We are seeking a skilled Databricks Developer to join our Data Lake Modernization project. You will be responsible for migrating on-premises data lake use cases to Databricks on GCP while collaborating with business and product teams to ensure high-quality development aligned with modern data engineering standards. Key Responsibilities: Play a key role in defining and implementing migration patterns for Data Lake Modernization Migrate data and use cases from on-premises systems to Databricks on GCP Collaborate with product managers and business users to gather requirements Follow development lifecycle practices such as testing, code reviews, CI/CD, and documentation Document and showcase feature designs and workflows Participate in agile team discussions and product development planning Stay updated on current industry trends and design standards
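As a rough sketch of one migration step described above, the snippet below reads landed files from Google Cloud Storage on Databricks and persists them as a Delta table; the bucket, columns, and table name are hypothetical.

```python
# Illustrative Databricks-on-GCP migration step: load files from GCS and
# persist them as a Delta table (bucket and table names are hypothetical).
from pyspark.sql import functions as F

# On Databricks, a SparkSession named `spark` is already available in the notebook.
raw = spark.read.parquet("gs://example-landing-bucket/sales/2024/")

curated = (
    raw
    .withColumn("load_date", F.current_date())
    .dropDuplicates(["transaction_id"])
)

# Delta is the default table format on Databricks; overwrite keeps reruns idempotent.
curated.write.format("delta").mode("overwrite").saveAsTable("lakehouse.sales_curated")
```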

Posted 5 days ago

Apply

3.0 - 7.0 years

0 Lacs

Maharashtra

On-site

As an Associate Data Architect at Quantiphi, you will be part of a dynamic team that thrives on innovation and growth. Your role will involve designing and delivering big data pipelines for structured and unstructured data across diverse geographies, particularly focusing on assisting healthcare organizations in achieving their business objectives through the utilization of data ingestion technologies, cloud services, and DevOps practices. Your responsibilities will include collaborating with cloud engineers and clients to address large-scale data challenges by creating tools for migration, storage, and processing on Google Cloud. You will be instrumental in crafting cloud migration strategies for both cloud-based and on-premise applications, as well as diagnosing and resolving complex issues within distributed systems to enhance efficiency at scale. In this role, you will have the opportunity to design and implement cutting-edge solutions for data storage and computation for various clients. You will work closely with experts from different domains such as Cloud engineering, Software engineering, and ML engineering to develop platforms and applications that align with the evolving trends in the healthcare sector, including digital diagnosis, AI marketplace, and software as a medical product. Effective communication with cross-functional teams, including Infrastructure, Network, Engineering, DevOps, SiteOps, and cloud customers, will be essential to drive successful project outcomes. Additionally, you will play a key role in building advanced automation tools, monitoring solutions, and data operations frameworks across multiple cloud environments to streamline processes and enhance operational efficiency. A strong understanding of data modeling and governance principles will be crucial for this role, enabling you to contribute meaningfully to the development of scalable and sustainable data architectures. If you thrive in a fast-paced environment that values innovation, collaboration, and continuous learning, then a career as an Associate Data Architect at Quantiphi is the perfect fit for you. Join us and be part of a team of dedicated professionals who are passionate about driving positive change through technology and teamwork.,

Posted 6 days ago

Apply

12.0 - 16.0 years

0 Lacs

Karnataka

On-site

As a Senior Software Engineer with 12-16 years of experience in supply chain retail technology, you will be joining the Fulfilment Optimization team at Wayfair. The team is responsible for building platforms that determine how customer orders are fulfilled, focusing on optimizing Wayfair profitability and customer satisfaction. Your role will involve enhancing and scaling customer-facing platforms that provide fulfillment information on the website, ensuring the accurate representation of the dynamic supply chain, and surfacing the necessary information for customers, suppliers, and carriers in real-time. In this position, you will partner with business stakeholders to provide transparency, data, and resources for informed decision-making. You will be a technical leader within and across the teams, driving high-impact architectural decisions and hands-on development following best design and coding practices. Your responsibilities will include ensuring production readiness, identifying risks and proposing solutions, contributing to the team's strategy and roadmap, and fostering a culture of continuous learning and innovation. To be successful in this role, you should have a Bachelor's Degree in Computer Science or a related field, along with at least 12 years of experience in a senior engineer or technical lead role. You should have mentored a team of 10-12 people and possess expertise in developing and designing scalable distributed systems. Strong communication skills, the ability to collaborate effectively with cross-functional teams, and a passion for mentoring and leading engineers are essential qualities for this role. Experience in designing APIs and microservices, working with cloud technologies (specifically GCP), data processing, data pipelines, and familiarity with common open-source platforms and tools such as Kafka, Kubernetes, Java microservices, and GraphQL APIs are highly desirable. Additionally, experience in designing and developing recommendation systems, productionalizing ML models, and working with event-driven systems and technologies would be advantageous. Joining Wayfair means being part of one of the world's largest online destinations for home goods. With a commitment to industry-leading technology and creative problem-solving, Wayfair offers rewarding career opportunities for individuals seeking rapid growth, continuous learning, and dynamic challenges. If you are looking to be a part of a team that is reinventing the way people shop for their homes, Wayfair is the place for you. Please review our Candidate Privacy Notice for information on how your personal data is processed. If you have any questions or wish to exercise your privacy rights, please contact us at dataprotectionofficer@wayfair.com.,

Posted 6 days ago

Apply

6.0 - 10.0 years

0 Lacs

Navi Mumbai, Maharashtra

On-site

You should have 6-8 years of experience with a deep understanding of the Spark framework, along with hands-on experience in Spark SQL and PySpark. Your expertise should include Python programming and familiarity with common Python libraries. Strong analytical skills are essential, especially in database management, including writing complex queries, query optimization, debugging, user-defined functions, views, and indexes. Your problem-solving abilities will be crucial in designing, implementing, and maintaining efficient data models and pipelines. Experience with Big Data technologies is a must, while familiarity with any ETL tool would be advantageous. As part of your responsibilities, you will work on projects to deliver, review, and design PySpark and Spark SQL-based data engineering analytics solutions. Your tasks will involve writing clean, efficient, reusable, testable, and scalable Python logic for analytical solutions. Emphasis will be on building solutions for data cleaning, data scraping, and exploratory data analysis, ensuring compatibility with any BI tool. Collaboration with Data Analysts/BI developers to provide clean and processed data will be essential. You will design data processing pipelines using ETL techniques, develop and deliver complex requirements to achieve business goals, and work with unstructured, structured, and semi-structured data and their respective databases. Effective coordination with internal engineering and development teams to understand requirements and develop solutions is critical. Communication with stakeholders to grasp business logic and provide optimal data engineering solutions will also be part of your role. It is important to adhere to best coding practices and standards throughout your work.
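As an illustrative sketch of the data cleaning and Spark SQL work described above, the snippet below deduplicates a raw dataset, fills missing values, and exposes it to SQL; the source path and column names are hypothetical.

```python
# Illustrative PySpark cleaning step combined with Spark SQL
# (column names and the source path are hypothetical).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("cleaning-sketch").getOrCreate()

events = spark.read.json("/data/raw/events/")

cleaned = (
    events
    .dropDuplicates(["event_id"])
    .na.fill({"channel": "unknown"})                 # fill missing categorical values
    .filter(F.col("event_ts").isNotNull())
)

cleaned.createOrReplaceTempView("events_clean")

# Hand the same data to analysts through Spark SQL.
spark.sql("""
    SELECT channel, COUNT(*) AS events
    FROM events_clean
    GROUP BY channel
    ORDER BY events DESC
""").show()
```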

Posted 6 days ago

Apply

5.0 - 9.0 years

0 Lacs

Maharashtra

On-site

As a Senior Specialist in Software Development (Artificial Intelligence) at Accelya, you will lead the design, development, and implementation of AI and machine learning solutions to tackle complex business challenges. Your expertise in AI algorithms, model development, and software engineering best practices will be crucial in working with cross-functional teams to deliver intelligent systems that optimize business operations and decision-making. Your responsibilities will include designing and developing AI-driven applications and platforms using machine learning, deep learning, and NLP techniques. You will lead the implementation of advanced algorithms for supervised and unsupervised learning, reinforcement learning, and computer vision. Additionally, you will develop scalable AI models, integrate them into software applications, and build APIs and microservices for deployment in cloud environments or on-premise systems. Collaboration with data scientists and data engineers will be essential in gathering, preprocessing, and analyzing large datasets. You will also implement feature engineering techniques to enhance the accuracy and performance of machine learning models. Regular evaluation of AI models using performance metrics and fine-tuning them for optimal accuracy will be part of your role. Furthermore, you will collaborate with business stakeholders to identify AI adoption opportunities, provide technical leadership and mentorship to junior team members, and stay updated with the latest AI trends and research to introduce innovative techniques to the team. Ensuring ethical compliance, security, and continuous improvement of AI systems will also be key aspects of your role. You should hold a Bachelor's degree in Computer Science, Data Science, Artificial Intelligence, or a related field, along with at least 5 years of experience in software development focusing on AI and machine learning. Proficiency in AI frameworks and libraries, programming languages such as Python, R, or Java, and cloud platforms for deploying AI models is required. Familiarity with Agile methodologies, data structures, and databases is essential. Preferred qualifications include a Master's or PhD in Artificial Intelligence or Machine Learning, experience with NLP techniques and computer vision technologies, and certifications in AI/ML or cloud platforms. Accelya is looking for individuals who are passionate about shaping the future of the air transport industry through innovative AI solutions. If you are ready to contribute your expertise and drive continuous improvement in AI systems, this role offers you the opportunity to make a significant impact in the industry.,

Posted 6 days ago

Apply

3.0 - 8.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

You should have a total of 8+ years of development/design experience, with a minimum of 3 years experience in Big Data technologies on-prem and on cloud. Proficiency in Snowflake and strong SQL programming skills are required. In addition, you should have strong experience with data modeling and schema design, as well as extensive experience using Data warehousing tools like Snowflake, BigQuery, or RedShift. Experience with BI Tools like Tableau, QuickSight, or PowerBI is a must, with at least one tool being a requirement. You should also have strong experience implementing ETL/ELT processes and building data pipelines, including workflow management, job scheduling, and monitoring. A good understanding of Data Governance, Security and Compliance, Data Quality, Metadata Management, Master Data Management, and Data Catalog is essential. Moreover, you should have a strong understanding of cloud services (AWS or Azure), including IAM and log analytics. Excellent interpersonal and teamwork skills are necessary for this role, as well as experience with leading and mentoring other team members. Good knowledge of Agile Scrum and communication skills are also required. As part of the job responsibilities, you will be expected to perform the same tasks as mentioned above. At GlobalLogic, we prioritize a culture of caring. From day one, you will experience an inclusive culture of acceptance and belonging, where you will have the chance to build meaningful connections with collaborative teammates, supportive managers, and compassionate leaders. We are committed to your continuous learning and development. You will learn and grow daily in an environment with many opportunities to try new things, sharpen your skills, and advance your career at GlobalLogic. GlobalLogic is known for engineering impact for and with clients around the world. As part of our team, you will have the chance to work on projects that matter and engage your curiosity and creative problem-solving skills. We believe in the importance of balance and flexibility. With many functional career areas, roles, and work arrangements, you can explore ways of achieving the perfect balance between your work and life. GlobalLogic is a high-trust organization where integrity is key. By joining us, you are placing your trust in a safe, reliable, and ethical global company. GlobalLogic, a Hitachi Group Company, is a trusted digital engineering partner to the world's largest and most forward-thinking companies. Since 2000, we have been at the forefront of the digital revolution, helping create innovative and widely used digital products and experiences. Today we continue to collaborate with clients in transforming businesses and redefining industries through intelligent products, platforms, and services.,

Posted 1 week ago

Apply

12.0 - 16.0 years

0 Lacs

Delhi

On-site

We are seeking a talented Systems Architect (AVP level) with specialized knowledge in designing and expanding Generative AI solutions for production environments. In this pivotal position, you will collaborate across various teams, including data scientists, ML engineers, and product leaders, to shape enterprise-level GenAI platforms. Your responsibilities will include designing and scaling LLM-based systems such as chatbots, copilots, RAG, and multi-modal AI; architecting data pipelines and training/inference workflows; and integrating MLOps. You will be tasked with ensuring that systems are modular, secure, scalable, and cost-effective. Additionally, you will work on model orchestration, agentic AI, vector DBs, and CI/CD for AI. The ideal candidate should possess 12-15 years of experience in cloud-native and distributed systems, with 2-3 years focusing on GenAI/LLMs using tools like LangChain, HuggingFace, and Kubeflow. Proficiency in cloud platforms such as AWS, GCP, or Azure (SageMaker, Vertex AI, Azure ML) is essential, and experience with RAG, semantic search, agent orchestration, and MLOps is highly valued. Strong architectural acumen and effective stakeholder communication skills are expected; cloud certifications, open-source AI contributions, and knowledge of security and governance are all advantageous.
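As a framework-agnostic illustration of the retrieval step in a RAG system like those mentioned above, here is a toy sketch: embed the query, rank stored chunks by cosine similarity, and build a prompt from the top hits. The embed() function is a stand-in placeholder, not a real embedding model.

```python
# Toy RAG retrieval sketch (framework-agnostic; embed() is a placeholder).
import numpy as np

def embed(text: str) -> np.ndarray:
    # Placeholder: returns a deterministic unit vector; a real system would call
    # an embedding model or service here.
    rng = np.random.default_rng(abs(hash(text)) % (2**32))
    v = rng.standard_normal(8)
    return v / np.linalg.norm(v)

documents = ["refund policy", "shipping times", "warranty terms"]
index = np.stack([embed(d) for d in documents])      # toy vector index

query = "how long does delivery take?"
scores = index @ embed(query)                        # cosine similarity (unit vectors)
top = np.argsort(scores)[::-1][:2]                   # best-matching chunks

context = "\n".join(documents[i] for i in top)
prompt = f"Answer using only this context:\n{context}\n\nQuestion: {query}"
print(prompt)   # this prompt would then be sent to the LLM
```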

Posted 1 week ago

Apply

5.0 - 12.0 years

0 Lacs

Hyderabad, Telangana

On-site

As a Managed Services Provider (MSP), we are looking for an experienced TechOps Lead to take charge of our cloud infrastructure operations team. Your primary responsibility will be ensuring the seamless delivery of high-quality, secure, and scalable managed services across multiple customer environments, predominantly on AWS and Azure. In this pivotal role, you will serve as the main point of contact for customers, offering strategic technical direction, overseeing day-to-day operations, and empowering a team of cloud engineers to address complex technical challenges. Conducting regular governance meetings with customers, you will provide insights and maintain strong, trust-based relationships. As our clients explore AI workloads and modern platforms, you will lead the team in rapidly adopting and integrating new technologies to keep us ahead of evolving industry trends. Your key responsibilities will include: - Acting as the primary technical and operational contact for customer accounts - Leading governance meetings with customers to review SLAs, KPIs, incident metrics, and improvement initiatives - Guiding the team in diagnosing and resolving complex technical problems in AWS, Azure, and hybrid environments - Ensuring adherence to best practices in cloud operations, infrastructure-as-code, security, cost optimization, monitoring, and compliance - Staying updated on emerging cloud, AI, and automation technologies to enhance our service offerings - Overseeing incident, change, and problem management activities to ensure SLA compliance - Identifying trends from incidents and metrics and driving proactive improvements - Establishing runbooks, standard operating procedures, and automation to reduce toil and improve consistency To be successful in this role, you should possess: - 12+ years of overall experience with at least 5 years managing or delivering cloud infrastructure services on Azure and/or AWS - Strong hands-on skills in Terraform, DevOps tools, monitoring, logging, alerting, and exposure to AI workloads - Solid understanding of networking, security, IAM, and cost optimization in cloud environments - Experience leading technical teams in a managed services or consulting environment - Ability to quickly learn new technologies and guide the team in adopting them to solve customer problems Nice to have skills include exposure to container platforms, multi-cloud cost management tools, AI ML Ops services, security frameworks, and relevant certifications like AWS Solutions Architect, Azure Administrator, or Terraform Associate.,

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

karnataka

On-site

As a Data/AWS Engineer at Waters Global Research, you will be part of a dynamic team focused on researching and developing self-diagnosing, self-healing instruments that enhance the user experience of our customers. By leveraging cutting-edge technologies and innovative solutions, you will play a crucial role in advancing our analytical chemistry instruments, which have a direct impact on fields such as laboratory testing, drug discovery, and food safety. Your primary responsibility will be to develop data pipelines for specialty instrument data and Gen AI processes, train machine learning models for error diagnosis, and automate manual processes to optimize instrument procedures. You will work on projects aimed at interpreting raw data results, cleaning anomalous data, and deploying models in AWS to collect and analyze results effectively.

Key Responsibilities:
- Build data pipelines in AWS using services like S3, Lambda, IoT Core, and EC2 (see the sketch below).
- Create and maintain dashboards to monitor data health and performance.
- Containerize models and deploy them in AWS for efficient data processing.
- Develop Python data pipelines to handle data frames and matrices, ensuring smooth data ingestion, transformation, and storage.
- Collaborate with machine learning engineers to evaluate data and models, and present findings to stakeholders.
- Mentor team members and review their code to ensure best coding practices and adherence to standards.

Qualifications:

Required Qualifications:
- Bachelor's degree in computer science or a related field with 5-8 years of relevant work experience.
- Proficiency in AWS services such as S3, EC2, Lambda, and IAM.
- Experience with containerization and deployment of code in AWS.
- Strong Python programming skills for object-oriented and/or functional programming.
- Familiarity with Git, Bash, and the command line.
- Ability to drive new capabilities, solutions, and data best practices from technical documentation.
- Excellent communication skills to convey results effectively to non-data scientists.

Desired Qualifications:
- Experience with C#, C++, and .NET is considered a plus.

What We Offer:
- Hybrid role with competitive compensation and great benefits.
- Continuous professional development opportunities.
- An inclusive environment that encourages contributions from all team members.
- Reasonable adjustments to the interview process based on individual needs.

Join Waters Corporation, a global leader in specialty measurement, and be part of a team that drives innovation in chromatography, mass spectrometry, and thermal analysis. With a focus on creating business advantages for industries including life sciences, materials, and food sciences, we aim to transform healthcare delivery, environmental management, food safety, and water quality. At Waters, we empower our employees to unlock their full potential, learn, grow, and make a tangible impact on human health and well-being. We value collaboration, problem-solving, and innovation to address the challenges of today and tomorrow. Join us to be part of a team that delivers benefits as one and provides insights for a better future.
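As a purely illustrative example of the S3-and-Lambda pipeline work described above, the sketch below shows a Lambda handler that reads an instrument CSV dropped into a raw-data bucket, filters out anomalous rows, and writes the cleaned file to a processed bucket. Bucket names, the column used for filtering, and the threshold are hypothetical placeholders.

```python
import io
import os

import boto3
import pandas as pd

s3 = boto3.client("s3")

# Hypothetical destination bucket; in practice this would come from configuration.
PROCESSED_BUCKET = os.environ.get("PROCESSED_BUCKET", "example-processed-bucket")

def handler(event, context):
    """Triggered by an S3 put event: clean the uploaded instrument CSV."""
    record = event["Records"][0]["s3"]
    bucket = record["bucket"]["name"]
    key = record["object"]["key"]

    # Load the raw instrument readings into a data frame.
    body = s3.get_object(Bucket=bucket, Key=key)["Body"]
    df = pd.read_csv(body)

    # Drop rows with missing values or implausible readings
    # (column name and threshold are illustrative only).
    cleaned = df.dropna()
    if "pressure_psi" in cleaned.columns:
        cleaned = cleaned[cleaned["pressure_psi"].between(0, 10_000)]

    # Write the cleaned file to the processed bucket under the same key.
    buffer = io.StringIO()
    cleaned.to_csv(buffer, index=False)
    s3.put_object(Bucket=PROCESSED_BUCKET, Key=key, Body=buffer.getvalue())

    return {"rows_in": len(df), "rows_out": len(cleaned)}
```

The same handler pattern extends naturally to instrument data arriving through IoT Core rules that land files in S3.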

Posted 1 week ago

Apply

1.0 - 5.0 years

0 Lacs

chennai, tamil nadu

On-site

Are you ready to experiment, learn, and implement? Join us on a new adventure where your RDBMS and Gen AI exposure can change the dynamics of our organization and help redefine what is possible. At OptiSol, we are your answer to a stress-free, balanced lifestyle. Certified as a Great Place To Work for 4 consecutive years, we are known for our culture of open communication, accessible leadership, and celebration of diversity. With flexible policies promoting work-life balance, you can thrive personally and professionally. We are at the forefront of AI and innovation, shaping the future together.

The ideal candidate is proficient in JavaScript and Python, with strong skills in both front-end and back-end development. Experience with React and Angular, combined with Redux for state management, is essential for building efficient, dynamic web applications. Expertise in Node.js for scalable server-side applications, along with serverless architectures for optimized, cost-effective solutions, is highly valued. Familiarity with data pipelines and with managing an RDBMS for structured data storage and efficient querying is a plus (see the sketch below). Exposure to Gen AI technologies and experience with AWS or Azure to deploy, manage, and scale cloud applications is desirable.

Key Requirements:
- Hands-on experience with GenAI projects and LLM frameworks
- Proficiency in Python for various development tasks and projects
- Excellent communication skills for effective collaboration
- Self-starter mindset with a focus on delivering high-quality results

What You'll Bring to the Table:
- Strong skills in JavaScript and TypeScript for building scalable, high-performance apps
- Experience with Node.js and frameworks like Express.js or Sails.js for back-end systems
- Comfort with Angular 14+ or React for the front end, and Redux or NGRX for state management
- Familiarity with Python or Go and adaptability to different tech stacks
- Design experience with microservice architectures and monorepo structures
- Exposure to serverless architectures, data pipelines, and message queues like RabbitMQ or Kafka
- AWS or Azure certification and hands-on experience with cloud services

Core Benefits You'll Gain:
- Hands-on experience with advanced technologies like JavaScript, TypeScript, Node.js, and React
- Flexibility in learning multiple stacks such as Python, Go, and Angular
- Experience with microservices and monorepo structures for clean, scalable systems
- Knowledge of cloud deployment with AWS or Azure
- Exposure to cutting-edge technologies like serverless architectures and data pipelines

Join us at OptiSol and explore why we are the perfect match for each other. Learn more about our culture and discover the opportunities that await you.
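For the RDBMS-plus-data-pipeline familiarity mentioned above, here is a small illustrative Python sketch (not from the posting): a parameterized query against a relational table feeding a simple transformation step. It uses the standard-library sqlite3 module to stay self-contained; the table and column names are invented for the example.

```python
import sqlite3

def load_orders(conn, min_amount):
    """Parameterized query: fetch orders at or above a threshold."""
    cursor = conn.execute(
        "SELECT id, customer, amount FROM orders WHERE amount >= ?",
        (min_amount,),
    )
    return cursor.fetchall()

def summarize(rows):
    """Toy pipeline step: total amount per customer."""
    totals = {}
    for _id, customer, amount in rows:
        totals[customer] = totals.get(customer, 0) + amount
    return totals

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders (id INTEGER, customer TEXT, amount REAL)")
    conn.executemany(
        "INSERT INTO orders VALUES (?, ?, ?)",
        [(1, "acme", 120.0), (2, "acme", 80.0), (3, "globex", 40.0)],
    )
    print(summarize(load_orders(conn, min_amount=50)))
```

The same query-then-transform shape carries over to production databases such as PostgreSQL or MySQL; only the driver and connection details change.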

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

karnataka

On-site

As an AI Engineer at Xurrent, you will play a crucial role in developing the platforms and solutions that drive the transformation of Service and Operations Management through AI technologies. You will collaborate with the R&D team to deliver AI software spanning data engineering, prompt engineering, algorithms, and APIs. Your performance will be measured by your ability to produce production-ready code, prototype quickly for stakeholders, and maintain high code quality as a senior member of the team. You will also conduct code reviews and work closely with DevOps and MLOps to ensure a robust deployment and testing pipeline.

Responsibilities:
- Collaborate with R&D team members to identify functional and non-functional requirements for AI-related software adjustments or extensions and incorporate them into AI RFCs.
- Assist product management in developing high-level product specifications related to artificial intelligence, focusing on system integration and feasibility.
- Develop software adjustments and extensions that align with specifications, requirements, and coding standards.
- Provide technical guidance and coaching on artificial intelligence to fellow R&D members.
- Ensure that AI software used in Xurrent services meets quality, security, and extensibility standards.
- Contribute to the continuous improvement of AI software in Xurrent services through hands-on development.

Requirements:
- Demonstrated proficiency in developing AI features using LLMs (a minimal prompt-engineering sketch follows this listing).
- Proven track record of developing, deploying, maintaining, and enhancing machine learning models.
- Experience creating and managing data systems such as feature stores and data pipelines for AI and machine learning.
- Ability to translate product requirements into technical specifications.
- Background in a production engineering environment.

This role offers an exciting opportunity to be at the forefront of AI innovation in Service and Operations Management. If you are passionate about leveraging AI technologies to drive impactful change, this position is tailored for you.
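The sketch below is a minimal illustration of the prompt-engineering work referenced in the requirements; it is not Xurrent's code. It assumes an OpenAI-compatible API via the official openai Python package, with the model name, ticket categories, and prompt wording as placeholders; any other LLM provider would follow the same pattern of a templated system prompt plus structured user input.

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# A reusable prompt template for classifying incoming service tickets
# (categories and wording are illustrative only).
SYSTEM_PROMPT = (
    "You are a service-management assistant. Classify the ticket into exactly "
    "one category: incident, request, or change. Reply with the category only."
)

def classify_ticket(ticket_text: str, model: str = "gpt-4o-mini") -> str:
    """Send a templated prompt to the LLM and return its one-word category."""
    response = client.chat.completions.create(
        model=model,
        temperature=0,  # deterministic output suits a classification task
        messages=[
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": f"Ticket: {ticket_text}"},
        ],
    )
    return response.choices[0].message.content.strip().lower()

if __name__ == "__main__":
    print(classify_ticket("Email server is down for the whole sales team."))
```

In production, a call like this would sit behind an evaluation harness and feed the deployment and testing pipeline mentioned above.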

Posted 1 week ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies