3.0 - 8.0 years
4 - 8 Lacs
Bengaluru
Work from Office
We are looking for a skilled and motivated Data Engineer to join our dynamic technology team. The ideal candidate will have a strong background in data processing, cloud computing, and software development, with hands-on experience in Python, PySpark, Java, and Microsoft Azure. You will be responsible for designing, building, and maintaining scalable data pipelines and infrastructure to support advanced analytics and data science initiatives.

Key Responsibilities:
- Design, develop, and maintain robust data pipelines using PySpark, Python, and Java.
- Implement and manage data workflows on Microsoft Azure and other public cloud platforms.
- Collaborate with data scientists, analysts, and IT operations to ensure seamless data integration and availability.
- Optimize data systems for performance, scalability, and reliability.
- Ensure data quality, governance, and security across all data platforms.
- Support DevOps practices for continuous integration and deployment of data solutions.
- Monitor and troubleshoot data infrastructure and resolve system issues.
- Document processes and maintain data architecture standards.

Required Skills & Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- 3+ years of experience in data engineering, software development, or IT operations.
- Proficiency in Python, PySpark, and Java.
- Experience with cloud computing platforms, especially Microsoft Azure.
- Strong understanding of data management, data processing, and data analysis.
- Familiarity with multi-paradigm programming and modern software development practices.
- Knowledge of DevOps tools and methodologies.
- Experience with system administration and cloud providers.
- Excellent problem-solving and communication skills.

Preferred Qualifications:
- Certifications in Azure, Python Data Science, or related technologies.
- Experience with public cloud environments such as AWS or GCP.
- Familiarity with big data tools and frameworks.
- Exposure to data science workflows and tools.
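The pipeline work this posting describes can be pictured as a chain of small, testable extract-transform-load stages. The following is a deliberately tiny stdlib sketch of that pattern; in the role itself this would typically be PySpark DataFrame transforms running on Azure, and the record fields ("city", "amount") are invented for illustration:

```python
import json

def extract(raw_lines):
    """Parse raw JSON lines into records, skipping malformed input."""
    records = []
    for line in raw_lines:
        try:
            records.append(json.loads(line))
        except json.JSONDecodeError:
            continue  # a real pipeline would route these to a dead-letter store
    return records

def transform(records):
    """Keep complete records and normalise the (hypothetical) 'city' field."""
    return [
        {**r, "city": r["city"].strip().title()}
        for r in records
        if r.get("city") and r.get("amount") is not None
    ]

def load(records):
    """Aggregate amounts per city; stands in for a write to a warehouse table."""
    totals = {}
    for r in records:
        totals[r["city"]] = totals.get(r["city"], 0) + r["amount"]
    return totals

raw = ['{"city": " bengaluru", "amount": 10}',
       '{"city": "Pune", "amount": 5}',
       'not json',
       '{"city": "bengaluru ", "amount": 2}']
result = load(transform(extract(raw)))
```

Keeping each stage a pure function over records is what makes a pipeline like this easy to test and to port to a distributed engine later.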
Posted 3 days ago
2.0 - 5.0 years
12 - 16 Lacs
Pune
Work from Office
Note: By applying to this position you will have an opportunity to share your preferred working location from the following: Pune, Maharashtra, India; Gurugram, Haryana, India; Bengaluru, Karnataka, India; Hyderabad, Telangana, India.

Minimum qualifications:
- Bachelor's degree in Computer Science or a related technical field, or equivalent practical experience.
- 3 years of experience in building data and Artificial Intelligence (AI) solutions and working with technical customers.
- Experience in designing cloud enterprise solutions and supporting customer projects to completion.

Preferred qualifications:
- Experience working with Large Language Models, data pipelines, data analytics, and data visualization techniques.
- Experience with data Extract, Transform, and Load (ETL) techniques.
- Experience with Large Language Models (LLMs) to deploy multimodal solutions involving text, image, video, and voice.
- Knowledge of data warehousing concepts, including data warehouse technical architectures, infrastructure components, Extract, Transform, and Load / Extract, Load, and Transform, and investigative tools and environments (e.g., Apache Beam, Hadoop, Spark, Pig, Hive, MapReduce, Flume).
- Knowledge of cloud computing, including virtualization, hosted services, multi-tenant cloud infrastructures, storage systems, and content delivery networks.
- Excellent communication skills.

About the job: The Google Cloud Consulting Professional Services team guides customers through the moments that matter most in their cloud journey to help businesses thrive. We help customers transform and evolve their business through the use of Google's global network, web-scale data centers, and software infrastructure. As part of an innovative team in this rapidly growing business, you will help shape the future of businesses of all sizes and use technology to connect with customers, employees, and partners.

In this role, you will play a part in ensuring that customers have a quality experience moving to the Google Cloud Generative AI (GenAI) and Agentic AI suite of products. You will design and implement solutions for customer use cases, leveraging core Google products. You will work with customers to identify opportunities to transform their business with Generative AI (GenAI), and deliver workshops designed to educate and empower customers to realize the potential of Google Cloud. You will have access to Google's technology to monitor application performance, debug and troubleshoot product issues, and address customer and partner needs. You will lead the execution of adopting Google Cloud Platform solutions for the customer.

Google Cloud accelerates every organization's ability to digitally transform its business and industry. We deliver enterprise-grade solutions that leverage Google's cutting-edge technology, and tools that help developers build more sustainably. Customers in more than 200 countries and territories turn to Google Cloud as their trusted partner to enable growth and solve their most critical business problems.

Responsibilities:
- Deliver big data and GenAI solutions and solve technical customer challenges.
- Act as a trusted technical advisor to Google's customers.
- Identify new product features and feature gaps, provide guidance on existing product challenges, and collaborate with Product Managers and Engineers to influence the roadmap of Google Cloud Platform.
- Deliver best-practice recommendations, tutorials, blog articles, and technical presentations, adapting to different levels of business and technical stakeholders.

Google is proud to be an equal opportunity workplace and is an affirmative action employer. We are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender identity or Veteran status. We also consider qualified applicants regardless of criminal histories, consistent with legal requirements. See also Google's EEO Policy and EEO is the Law. If you have a disability or special need that requires accommodation, please let us know by completing our Accommodations for Applicants form.
Posted 3 days ago
2.0 - 5.0 years
12 - 16 Lacs
Gurugram
Work from Office
Note: By applying to this position you will have an opportunity to share your preferred working location from the following: Pune, Maharashtra, India; Gurugram, Haryana, India; Bengaluru, Karnataka, India; Hyderabad, Telangana, India.

Minimum qualifications:
- Bachelor's degree in Computer Science or a related technical field, or equivalent practical experience.
- 3 years of experience in building data and Artificial Intelligence (AI) solutions and working with technical customers.
- Experience in designing cloud enterprise solutions and supporting customer projects to completion.

Preferred qualifications:
- Experience working with Large Language Models, data pipelines, data analytics, and data visualization techniques.
- Experience with data Extract, Transform, and Load (ETL) techniques.
- Experience with Large Language Models (LLMs) to deploy multimodal solutions involving text, image, video, and voice.
- Knowledge of data warehousing concepts, including data warehouse technical architectures, infrastructure components, Extract, Transform, and Load / Extract, Load, and Transform, and investigative tools and environments (e.g., Apache Beam, Hadoop, Spark, Pig, Hive, MapReduce, Flume).
- Knowledge of cloud computing, including virtualization, hosted services, multi-tenant cloud infrastructures, storage systems, and content delivery networks.
- Excellent communication skills.

About the job: The Google Cloud Consulting Professional Services team guides customers through the moments that matter most in their cloud journey to help businesses thrive. We help customers transform and evolve their business through the use of Google's global network, web-scale data centers, and software infrastructure. As part of an innovative team in this rapidly growing business, you will help shape the future of businesses of all sizes and use technology to connect with customers, employees, and partners.

In this role, you will play a part in ensuring that customers have a quality experience moving to the Google Cloud Generative AI (GenAI) and Agentic AI suite of products. You will design and implement solutions for customer use cases, leveraging core Google products. You will work with customers to identify opportunities to transform their business with Generative AI (GenAI), and deliver workshops designed to educate and empower customers to realize the potential of Google Cloud. You will have access to Google's technology to monitor application performance, debug and troubleshoot product issues, and address customer and partner needs. You will lead the execution of adopting Google Cloud Platform solutions for the customer.

Google Cloud accelerates every organization's ability to digitally transform its business and industry. We deliver enterprise-grade solutions that leverage Google's cutting-edge technology, and tools that help developers build more sustainably. Customers in more than 200 countries and territories turn to Google Cloud as their trusted partner to enable growth and solve their most critical business problems.

Responsibilities:
- Deliver big data and GenAI solutions and solve technical customer challenges.
- Act as a trusted technical advisor to Google's customers.
- Identify new product features and feature gaps, provide guidance on existing product challenges, and collaborate with Product Managers and Engineers to influence the roadmap of Google Cloud Platform.
- Deliver best-practice recommendations, tutorials, blog articles, and technical presentations, adapting to different levels of business and technical stakeholders.

Google is proud to be an equal opportunity workplace and is an affirmative action employer. We are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender identity or Veteran status. We also consider qualified applicants regardless of criminal histories, consistent with legal requirements. See also Google's EEO Policy and EEO is the Law. If you have a disability or special need that requires accommodation, please let us know by completing our Accommodations for Applicants form.
Posted 3 days ago
6.0 - 11.0 years
6 - 10 Lacs
Hyderabad
Work from Office
About the Role
In this opportunity, as Senior Data Engineer, you will:
- Develop and maintain data solutions using resources such as dbt, Alteryx, and Python.
- Design and optimize data pipelines, ensuring efficient data flow and processing.
- Work extensively with databases, SQL, and various data formats including JSON, XML, and CSV.
- Tune and optimize queries to enhance performance and reliability.
- Develop high-quality code in SQL, dbt, and Python, adhering to best practices.
- Understand and implement data automation and API integrations.
- Leverage AI capabilities to enhance data engineering practices.
- Understand integration points related to upstream and downstream requirements.
- Proactively manage tasks and work towards completion against tight deadlines.
- Analyze existing processes and offer suggestions for improvement.

About You
You're a fit for the role of Senior Data Engineer if your background includes:
- Strong interest and knowledge in data engineering principles and methods.
- 6+ years of experience developing data solutions or pipelines.
- 6+ years of hands-on experience with databases and SQL.
- 2+ years of experience programming in an additional language.
- 2+ years of experience in query tuning and optimization.
- Experience working with SQL, JSON, XML, and CSV content.
- Understanding of data automation and API integration.
- Familiarity with AI capabilities and their application in data engineering.
- Ability to adhere to best practices for developing programmatic solutions.
- Strong problem-solving skills and ability to work independently.
#LI-SS6

What's in it For You
- Hybrid Work Model: We've adopted a flexible hybrid working environment (2-3 days a week in the office, depending on the role) for our office-based roles while delivering a seamless experience that is digitally and physically connected.
- Flexibility & Work-Life Balance: Flex My Way is a set of supportive workplace policies designed to help manage personal and professional responsibilities, whether caring for family, giving back to the community, or finding time to refresh and reset. This builds upon our flexible work arrangements, including work from anywhere for up to 8 weeks per year, empowering employees to achieve a better work-life balance.
- Career Development and Growth: By fostering a culture of continuous learning and skill development, we prepare our talent to tackle tomorrow's challenges and deliver real-world solutions. Our Grow My Way programming and skills-first approach ensures you have the tools and knowledge to grow, lead, and thrive in an AI-enabled future.
- Industry Competitive Benefits: We offer comprehensive benefit plans including flexible vacation, two company-wide Mental Health Days off, access to the Headspace app, retirement savings, tuition reimbursement, employee incentive programs, and resources for mental, physical, and financial wellbeing.
- Culture: Globally recognized, award-winning reputation for inclusion and belonging, flexibility, work-life balance, and more. We live by our values: Obsess over our Customers, Compete to Win, Challenge (Y)our Thinking, Act Fast / Learn Fast, and Stronger Together.
- Social Impact: Make an impact in your community with our Social Impact Institute. We offer employees two paid volunteer days off annually and opportunities to get involved with pro-bono consulting projects and Environmental, Social, and Governance (ESG) initiatives.
- Making a Real-World Impact: We are one of the few companies globally that helps its customers pursue justice, truth, and transparency. Together, with the professionals and institutions we serve, we help uphold the rule of law, turn the wheels of commerce, catch bad actors, report the facts, and provide trusted, unbiased information to people all over the world.

Thomson Reuters informs the way forward by bringing together the trusted content and technology that people and organizations need to make the right decisions. We serve professionals across legal, tax, accounting, compliance, government, and media. Our products combine highly specialized software and insights to empower professionals with the data, intelligence, and solutions needed to make informed decisions, and to help institutions in their pursuit of justice, truth, and transparency. Reuters, part of Thomson Reuters, is a world-leading provider of trusted journalism and news. We are powered by the talents of 26,000 employees across more than 70 countries, where everyone has a chance to contribute and grow professionally in flexible work environments. At a time when objectivity, accuracy, fairness, and transparency are under attack, we consider it our duty to pursue them. Sound exciting? Join us and help shape the industries that move society forward.

As a global business, we rely on the unique backgrounds, perspectives, and experiences of all employees to deliver on our business goals. To ensure we can do that, we seek talented, qualified employees in all our operations around the world regardless of race, color, sex/gender, including pregnancy, gender identity and expression, national origin, religion, sexual orientation, disability, age, marital status, citizen status, veteran status, or any other protected classification under applicable law. Thomson Reuters is proud to be an Equal Employment Opportunity Employer providing a drug-free workplace. We also make reasonable accommodations for qualified individuals with disabilities and for sincerely held religious beliefs in accordance with applicable law. More information on requesting an accommodation here. Learn more on how to protect yourself from fraudulent job postings here. More information about Thomson Reuters can be found on thomsonreuters.com.
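The query tuning called out in this posting often comes down to making sure a selective predicate can use an index. A small sqlite3 illustration of checking that with EXPLAIN QUERY PLAN (the table and column names are invented; the posting's actual stack is dbt/Alteryx over enterprise databases):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)")
cur.executemany("INSERT INTO orders (customer_id, total) VALUES (?, ?)",
                [(i % 100, float(i)) for i in range(1000)])

# Without this index, the customer_id predicate forces a full table scan.
cur.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")

# EXPLAIN QUERY PLAN shows whether the optimiser actually picked up the index.
plan = cur.execute(
    "EXPLAIN QUERY PLAN SELECT SUM(total) FROM orders WHERE customer_id = ?", (7,)
).fetchall()
plan_text = " ".join(row[-1] for row in plan)

total = cur.execute(
    "SELECT SUM(total) FROM orders WHERE customer_id = ?", (7,)).fetchone()[0]
```

Reading the plan before and after adding an index is the same loop a data engineer runs with `EXPLAIN` on Postgres, SQL Server, or Snowflake, just with vendor-specific output.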
Posted 3 days ago
7.0 - 12.0 years
8 - 13 Lacs
Bengaluru
Work from Office
We are looking for a self-motivated individual with an appetite to learn new skills and be part of a fast-paced team delivering cutting-edge solutions that drive new products and features critical for our customers. Our senior software engineers are responsible for designing and developing key systems that provide critical data and algorithms, and for ensuring their quality, reliability, and availability. Responsibilities of this role include developing new applications and enhancing existing ones; you will work collaboratively with technical leads and architects to design, develop, and test these critical applications.

About the role
- Actively participate in the full life cycle of software delivery, including analysis, design, implementation, and testing of new projects and features using Hadoop, Spark/PySpark, Scala or Java, Hive, SQL, and other open-source tools and design patterns. Python knowledge is a bonus for this role.
- Working experience with Hudi, Snowflake, or similar.
- Must-have technologies: Big Data, AWS services such as EMR, S3, Lambdas, Elastic, Step Functions.
- Actively participate in the development and testing of features for assigned projects with little to no guidance.
- The position holds opportunities to work under technical experts and also to provide guidance and assistance to less experienced team members or new joiners on the project.
- An appetite for learning will be a key attribute for doing well in this role, as the organization is very dynamic and has tremendous scope across various technical landscapes.
- We consider use of AI key to excelling in this role; we want dynamic candidates who use AI tools as build partners and share their experiences across the organization.
- Proactively share knowledge and best practices on using new and emerging technologies across all of the development and testing groups.
- Create, review, and maintain technical documentation of software development and testing artifacts.
- Work collaboratively with others in a team-based environment.
- Identify and participate in the resolution of issues with the appropriate technical and business resources.
- Generate innovative approaches and solutions to technology challenges.
- Effectively balance and prioritize multiple projects concurrently.

About you
- Bachelor's or Master's degree in computer science or a related field.
- 7+ years of experience in the IT industry; product and platform development preferred.
- Strong programming skills in Java or Scala.
- Must-have technologies include Big Data and AWS; exposure to services such as EMR, S3, Lambdas, Elastic, Step Functions.
- Knowledge of Python preferred.
- Experience with Agile methodology, continuous integration, and/or Test-Driven Development.
- Self-motivated with a strong desire for continual learning.
- Takes personal responsibility to impact results and deliver on commitments.
- Effective verbal and written communication skills.
- Ability to work independently or as part of an agile development team.
#LI-SP1
Posted 3 days ago
8.0 - 13.0 years
15 - 19 Lacs
Bengaluru
Work from Office
Responsibilities:
* Collaborate with cross-functional teams on data initiatives.
* Deliver corporate training.
* Develop big data solutions using Hadoop, Python & SQL.
* Be flexible to deliver both offline and online sessions.
Posted 3 days ago
5.0 - 8.0 years
5 - 9 Lacs
Bengaluru
Work from Office
This role involves the development and application of engineering practice and knowledge in the following technologies: standards and protocols, application software and embedded software for wireless and satellite networks, fixed networks and enterprise networks; connected devices (IoT and device engineering); connected applications (5G/edge, B2X apps); and Telco Cloud, Automation, and Edge Compute platforms. The role also involves the integration of network systems and their operations in relation to the above technologies.

Grade Specific: Focus on Connectivity and Network Engineering. Develops competency in own area of expertise. Shares expertise and provides guidance and support to others. Interprets clients' needs. Completes own role independently or with minimum supervision. Identifies problems and relevant issues in straightforward situations and generates solutions. Contributes to teamwork and interacts with customers.
Posted 3 days ago
5.0 - 8.0 years
5 - 9 Lacs
Chennai
Work from Office
Design, develop, and maintain ETL processes using Pentaho Data Integration (Kettle). Extract data from various sources including databases, flat files, APIs, and cloud platforms. Transform and cleanse data to meet business and technical requirements. Load data into data warehouses, data lakes, or other target systems. Monitor and optimize ETL performance and troubleshoot issues. Collaborate with data architects, analysts, and business stakeholders to understand data requirements. Ensure data quality, integrity, and security throughout the ETL lifecycle. Document ETL processes, data flows, and technical specifications.

Grade Specific: Focus on Industrial Operations Engineering. Develops competency in own area of expertise. Shares expertise and provides guidance and support to others. Interprets clients' needs. Completes own role independently or with minimum supervision. Identifies problems and relevant issues in straightforward situations and generates solutions. Contributes to teamwork and interacts with customers.
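The extract-cleanse-load lifecycle this posting describes can be sketched in a few lines. Pentaho Data Integration itself is a visual tool, so this Python sketch only mirrors the shape of such a job; the file layout and field names are invented, and sqlite3 stands in for the warehouse target:

```python
import csv
import io
import sqlite3

# Extract: read rows from a flat-file source (an in-memory CSV here).
raw_csv = "id,name,amount\n1, Alice ,100\n2,Bob,\n3,Carol,250\n"
rows = list(csv.DictReader(io.StringIO(raw_csv)))

# Transform: trim whitespace and drop rows with a missing amount,
# the kind of cleansing step a Kettle transformation would perform.
clean = [
    {"id": int(r["id"]), "name": r["name"].strip(), "amount": int(r["amount"])}
    for r in rows
    if r["amount"].strip()
]

# Load: write into a target table (a data warehouse in a real job).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE target (id INTEGER, name TEXT, amount INTEGER)")
conn.executemany("INSERT INTO target VALUES (:id, :name, :amount)", clean)
loaded = conn.execute("SELECT name, amount FROM target ORDER BY id").fetchall()
```

Whatever the tool, the same three stages recur, which is why documenting each stage's contract (as the posting asks) matters more than the specific engine.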
Posted 3 days ago
5.0 - 9.0 years
8 - 13 Lacs
Bengaluru
Work from Office
Your role
- Conduct and successfully complete the training delivery of assigned program(s), applying various training methods, innovation, and gamification to accommodate diverse learning styles and preferences.
- Create or curate new programs, courses, assessments, and artifacts, and deliver training on a need basis.
- Collaborate closely with stakeholders from different teams (L&D, CFMG, BU, vendors) to ensure seamless execution of training programs with quality outcomes.
- Assess training effectiveness, analyse feedback, and refine/revamp training materials to align with the curriculum signed off by BU SMEs, to maximize learning outcomes and improve the learner experience.
- Upskill and upgrade on new and relevant skills aligned to organizational needs, and obtain certifications to further enhance expertise, stay competitive, and grow professionally, fostering a culture of continuous learning and development.

Skills:
- Oracle SQL / MS SQL Server
- DWH & ETL concepts
- UNIX
- Python/Java
- Big Data technologies
- ETL tool: Informatica PowerCenter
- Reporting tool: Tableau / Power BI

Your profile
- Strategize, implement, and maintain program initiatives that adhere to organizational objectives.
- Training delivery for I&D skills.
- Develop program assessment protocols for evaluation and improvement.
- Ensure overall program goals and objectives are met effectively.
- Work closely with cross-functional teams.
- Apply change, risk, and resource management.
- Analyse, evaluate, and overcome program risks, and produce program reports for management and stakeholders.
- Ensure effective quality outcomes and the overall integrity of the program.
- Proactively monitor progress, resolve issues, and initiate appropriate corrective action.
- Project management.

What you'll love about working here
We're committed to ensuring that people of all backgrounds feel encouraged and have a sense of belonging at Capgemini. You are valued for who you are, and you can bring your original self to work. Every Monday, kick off the week with a musical performance by our in-house band, The Rubber Band. You can also participate in internal sports events, yoga challenges, or marathons. At Capgemini, you can work on cutting-edge projects in tech and engineering with industry leaders, or create solutions to overcome societal and environmental challenges.
Posted 3 days ago
2.0 - 5.0 years
4 - 8 Lacs
Hyderabad
Work from Office
Data engineers are responsible for building reliable and scalable data infrastructure that enables organizations to derive meaningful insights, make data-driven decisions, and unlock the value of their data assets.

Grade Specific: The role involves leading and managing a team of data engineers, defining and executing the data engineering strategy, and ensuring the effective delivery of data solutions. They provide technical expertise, drive innovation, and collaborate with stakeholders to deliver high-quality, scalable, and reliable data infrastructure and solutions.
Posted 3 days ago
3.0 - 7.0 years
7 - 11 Lacs
Bengaluru
Work from Office
As a key member of our Data Science team, you will be responsible for developing innovative AI ML solutions across diverse business domains This includes designing, implementing, and optimizing advanced analytics models to address complex business challenges and drive data-driven decision making Your core responsibility will be to extract actionable insights from large datasets, develop predictive algorithms, and create robust machine learning pipelines You will collaborate closely with cross-functional teams including business analysts, software engineers, and product managers to understand business requirements, define problem statements, and deliver scalable solutions Additionally, you'll be expected to stay current with emerging technologies and methodologies in the AI/ML landscape to ensure our technical approaches remain cutting-edge and effective Desired Skills and experience Demonstrated expertise in applying advanced statistical modeling, machine learning algorithms, and deep learning techniques. Proficiency in programming languages such as Python data analysis and model development. Proficiency in cloud platforms, such as Azure, Azure Data Factory, Snowflake, Databricks. Experience with data manipulation, cleaning, and preprocessing using pandas, NumPy, or equivalent libraries. Strong knowledge of SQL and experience working with various database systems and big data technologies. Proven track record of developing and deploying machine learning models in production environments. Experience with version control systems (e.g., Git) and collaborative development practices. Proficiency with visualization tools and libraries such as Matplotlib, Seaborn, Tableau, or PowerBI. Strong mathematics background including statistics, probability, linear algebra, and calculus. Excellent communication skills with ability to translate technical concepts to non-technical stakeholders. 
Experience working in cross-functional teams and managing projects through the full data science lifecycle. Knowledge of ethical considerations in AI development including bias detection and mitigation techniques. Key Responsibilities Analyze complex datasets to extract meaningful insights and patterns using statistical methods and machine learning techniques. Design, develop and implement advanced machine learning models and algorithms to solve business problems and drive data-driven decision making. Perform feature engineering, model selection, and hyperparameter tuning to optimize model performance and accuracy. Create and maintain data processing pipelines for efficient data collection, cleaning, transformation, and integration. Collaborate with cross-functional teams to understand business requirements and translate them into analytical solutions. Evaluate model performance using appropriate metrics and validation techniques to ensure reliability and robustness. Present findings, visualizations, and recommendations to stakeholders in clear, accessible formats tailored to technical and non-technical audiences. Stay current with the latest advancements in machine learning, deep learning, and statistical methods through continuous learning and research. Develop proof-of-concept applications to demonstrate the value and feasibility of data science solutions. Implement A/B testing and experimental design methodologies to validate hypotheses and measure the impact of implemented solutions. Document methodologies, procedures, and results thoroughly to ensure reproducibility and knowledge transfer within the organization.
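The model-selection and hyperparameter-tuning responsibilities above can be illustrated with a minimal grid search. The toy data and the one-parameter threshold "model" below are hypothetical stand-ins for a real pipeline, not part of the role:

```python
# Toy model-selection sketch: grid-search a single hyperparameter
# (a decision threshold) and validate on held-out data.

def accuracy(threshold, rows):
    # Predict positive when the feature value exceeds the threshold.
    correct = sum((x > threshold) == label for x, label in rows)
    return correct / len(rows)

def grid_search(train, grid):
    # Hyperparameter tuning: keep the candidate with best training accuracy.
    return max(grid, key=lambda t: accuracy(t, train))

train = [(0.2, False), (0.4, False), (0.6, True), (0.9, True)]
valid = [(0.3, False), (0.7, True)]

best = grid_search(train, grid=[0.1, 0.3, 0.5, 0.8])
print(best, accuracy(best, valid))
```

In a real engagement the grid would cover model families and many hyperparameters, and the validation step would use cross-validation rather than a single holdout split.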
Posted 3 days ago
3.0 - 5.0 years
6 - 8 Lacs
Gurugram
Work from Office
Develop and execute high-impact analytics solutions for large, complex, structured, and unstructured data sets (including big data) to drive impact on client business (topline). This person will lead the engagement for AI-based SaaS product deployment to clients across industries, leveraging strong data science, analytics, and engineering skills to build advanced analytics processes, build scalable and operational process pipelines, and find data-driven insights that help our clients solve their most important business problems and bring optimizations. Associate Consultants also engage with the project leadership team and clients to help them understand the insights, summaries, and implications, and make plans to act on them. What You'll Do: Deep analytics-tech expertise: Develop and implement advanced algorithms that solve complex business problems in a computationally efficient and statistically effective manner, leveraging tools like PySpark, Python, and SQL on the client/ZS cloud environment. Execute statistical and data modelling techniques (e.g., hypothesis testing, A/B testing setup, marketing impact analytics, statistical validity) on large data sets to identify trends, figures, and other relevant information, with scalable and operational process implementations. Evaluate emerging datasets and technologies that may contribute to our analytical platform, including a good understanding of Generative AI capabilities and SaaS products. Communication, collaboration, unstructured problem solving, and client engagement (in a high-performing and high-intensity team environment): Problem solving and client engagement: Understand client business priorities, develop product use cases, do proforma analysis for estimating business opportunity, and deploy the use case for the clients.
Collaboration: Work in a cross-functional team environment to lead the client engagement and collaborate on holistic solutions comprising best practices from front-end and back-end engineering, data science, and ML engineering. Storyboarding and impact communication: Build effective storyboards to communicate solution impact to clients and ZS leadership. Scaling mindset: Provide a structure to client engagement, build and maintain standardized and operationalized quality checks on the team's work, and ensure high-quality client deliverables. Team management: Export best practices and learnings to the broader team and mentor Associates on teams. What You'll Bring: Bachelor's degree in Computer Science (or Statistics) from a premier institute, and strong academic performance with analytics and quantitative coursework is required. Knowledge of programming: Python (deep expertise), PySpark, SQL. Expertise in machine learning, regression, clustering, and classification models (preferably in a product environment). Knowledge of big data/advanced analytics concepts and algorithms (e.g., social listening, recommender systems, predictive modeling, etc.). Excellent oral and written communication skills. Strong attention to detail, with a value-addition mindset. Excellent critical thinking and problem-solving skills. High motivation, good work ethic, and maturity. 3-5 years of relevant post-collegiate work experience, preferably in industries like B2C and product companies, in execution roles focused on data and decision sciences, data engineering, stakeholder management, and building scalable processes. Should have hands-on analytics experience where the candidate has worked on the algorithms/methodology from scratch and not merely executed existing codes and processes. Ability to coach and mentor juniors on the team to drive on-the-job learning and expertise building.
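The A/B-testing and statistical-validity work mentioned in this listing often reduces to a test like the two-proportion z-test sketched below; the conversion counts are hypothetical:

```python
import math

def two_proportion_ztest(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for a difference in conversion rates (A/B test)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)          # pooled conversion rate
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))        # two-sided normal p-value
    return z, p_value

# Hypothetical experiment: variant A converts 100/1000, variant B 150/1000.
z, p = two_proportion_ztest(100, 1000, 150, 1000)
print(round(z, 2), p < 0.05)
```

A production setup would add power analysis and multiple-testing corrections, but the core statistic is this simple.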
Posted 3 days ago
2.0 - 4.0 years
4 - 6 Lacs
Noida, Gurugram
Work from Office
What You'll Do: Develop advanced, efficient, and statistically effective algorithms that solve problems of high dimensionality. Utilize technical skills such as hypothesis testing, machine learning, and retrieval processes to apply statistical and data mining techniques to identify trends, create figures, and analyze other relevant information. Collaborate with clients and other stakeholders at ZS to integrate and effectively communicate analysis findings. Contribute to the assessment of emerging datasets and technologies that impact our analytical platform. What You'll Bring: A master's degree in Computer Science, Statistics, or a relevant field; a robust academic performance history with coursework emphasizing analysis and quantitative skills. Knowledge of big data, advanced analytical concepts, and algorithms (e.g., text mining, social listening, recommender systems, predictive modeling, etc.). Proficiency in at least one programming language (e.g., Java/Python/R). Experience with tools/platforms such as the Hadoop ecosystem, Amazon Web Services, or database systems. Fluency in English.
Posted 3 days ago
3.0 - 6.0 years
13 - 18 Lacs
Pune
Work from Office
ZS's Platform Development team designs, implements, tests, and supports ZS's ZAIDYN Platform, which helps drive superior customer experiences and revenue outcomes through integrated products & analytics. Whether writing distributed optimization algorithms or advanced mapping and visualization interfaces, you will have an opportunity to solve challenging problems, make an immediate impact, and contribute to better health outcomes. What you'll do: Pair program, write unit tests, lead code reviews, and collaborate with QA analysts to ensure you develop the highest quality multi-tenant software that can be productized. As part of our full-stack product engineering team, you will build multi-tenant cloud-based software products/platforms and internal assets that leverage cutting-edge technologies based on the Amazon AWS cloud platform. Work with junior developers to implement large features that are on the cutting edge of big data. Be a technical leader to your team, and help them improve their technical skills. Stand up for engineering practices that ensure quality products: automated testing, unit testing, agile development, continuous integration, code reviews, and technical design. Work with product managers and architects to design product architecture and to work on POCs. Take immediate responsibility for project deliverables. Understand client business issues and design features that meet client needs. Undergo on-the-job and formal trainings and certifications, and constantly advance your knowledge and problem-solving skills. What you'll bring: Bachelor's degree in CS, IT, or a related discipline. Strong analytic, problem-solving, and programming ability. Experience in coding in an object-oriented language such as Python, Java, or C#.
Hands-on experience with Apache Spark, EMR, Hadoop, HDFS, or other big data technologies. Experience with development on the AWS (Amazon Web Services) platform is preferable. Experience in Linux shell or PowerShell scripting is preferable. Experience in HTML5, JavaScript, and JavaScript libraries is preferable. Understanding of data science algorithms. Good to have: Pharma domain understanding. Initiative and drive to contribute. Excellent organizational and task management skills. Strong communication skills. Ability to work in global cross-office teams. ZS is a global firm; fluency in English is required.
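The big-data technologies this listing asks for (Spark, Hadoop, HDFS) all build on the same map -> shuffle -> reduce model. The sketch below is a single-process Python illustration of that model, not actual Spark code:

```python
from collections import defaultdict

# Single-process sketch of the map/shuffle/reduce model that engines
# like Spark and Hadoop distribute across a cluster of machines.

def map_phase(partition):
    # Map: emit (word, 1) pairs from each line in a partition.
    return [(word, 1) for line in partition for word in line.split()]

def shuffle(pairs):
    # Shuffle: group values by key (the expensive network step at scale).
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(grouped):
    # Reduce: aggregate each key's values.
    return {key: sum(values) for key, values in grouped.items()}

partitions = [["big data big"], ["data pipelines"]]
mapped = [pair for part in partitions for pair in map_phase(part)]
counts = reduce_phase(shuffle(mapped))
print(counts)
```

In Spark the same computation is a few DataFrame or RDD operations, with the shuffle handled by the engine across partitions.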
Posted 3 days ago
2.0 - 4.0 years
4 - 6 Lacs
Bengaluru
Work from Office
ZS's Insights & Analytics group partners with clients to design and deliver solutions to help them tackle a broad range of business challenges. Our teams work on multiple projects simultaneously, leveraging advanced data analytics and problem-solving techniques. Our recommendations and solutions are based on rigorous research and analysis underpinned by deep expertise and thought leadership. What You'll Do: Develop advanced, efficient, and statistically effective algorithms that solve problems of high dimensionality. Utilize technical skills such as hypothesis testing, machine learning, and retrieval processes to apply statistical and data mining techniques to identify trends, create figures, and analyze other relevant information. Collaborate with clients and other stakeholders at ZS to integrate and effectively communicate analysis findings. Contribute to the assessment of emerging datasets and technologies that impact our analytical platform. What You'll Bring: A master's degree in Computer Science, Statistics, or a relevant field; a robust academic performance history with coursework emphasizing analysis and quantitative skills. Knowledge of big data, advanced analytical concepts, and algorithms (e.g., text mining, social listening, recommender systems, predictive modeling, etc.). Proficiency in at least one programming language (e.g., Java/Python/R). Experience with tools/platforms such as the Hadoop ecosystem, Amazon Web Services, or database systems. Fluency in English.
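As one concrete instance of the algorithms named above, a recommender system typically starts from a similarity measure such as cosine similarity between rating vectors; the users and ratings below are made up for illustration:

```python
import math

def cosine(u, v):
    """Cosine similarity between two equal-length rating vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm

# Hypothetical item-rating vectors for three users (0 = not rated).
alice = [5, 3, 0]
bob   = [5, 3, 0]
carol = [0, 1, 5]

# Identical tastes score 1.0; dissimilar tastes score near 0.
print(round(cosine(alice, bob), 2), round(cosine(alice, carol), 2))
```

A real recommender would compute these similarities over sparse matrices and combine them with neighborhood or matrix-factorization methods.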
Posted 3 days ago
5.0 - 8.0 years
4 - 7 Lacs
Noida, Gurugram, Bengaluru
Work from Office
What you'll do: Strong understanding of data management, data cataloguing, and data governance best practices. Enterprise data integration and management experience working with data management, EDW technologies, and data governance solutions. The ideal data governance and data catalog lead will call on their expertise in master data management (MDM), data governance, and data quality control to effectively oversee the data elements of a complex product catalog. Thorough understanding of designing and developing data catalogs and data assets on an industry-leading tool (open-source catalog tools, Informatica Cloud Data Catalog, Alation, Collibra, or Atlan) that serves as the inventory of collective data assets, helping data owners, stewards, and business users discover relevant data for analytics and reporting. Must have Collibra and data quality experience, including executing at least 2 large data governance and quality projects from inception to production, working as the technology expert. Must have 5+ years of practical experience configuring data governance resources including business glossaries, resources, dashboards, policies, and search. Management of the enterprise glossary through the review of common business terms and definitions and continuous assessments to ensure data adheres to data governance standards. Development and configuration of Collibra/Alation data catalog resources, data lineage, custom resources, custom data lineage, relationships, data domains, data domain groups, and composite data domains. Implement critical data elements to govern, with corresponding data quality rules, policies, regulations, roles, users, data source systems, and dashboards/visualizations for multiple data domains.
Administration and management of the Collibra/Alation data catalog tool, user groups, and permissions. Configuration of data profiling and data lineage. Work with data owners, stewards, and various stakeholders to understand Collibra/Alation catalog requirements and configure them in the tool. What you'll bring: Bachelor's or Master's degree in Business Analytics, Computer Science, MIS, or a related field with academic excellence. 3+ years of relevant professional experience in delivering small/medium-scale technology solutions. Ability to lead project teams, drive end-to-end activities, meet milestones, and provide mentorship/guidance for team growth. Strong understanding of RDBMS concepts, SQL, data warehousing, and reporting. Experience with big data concepts, data management, data analytics, and cloud platforms. Proficiency in programming languages like Python. Strong analytical and problem-solving skills, including expertise in algorithms and data structures. Additional Skills: Strong communication skills, both verbal and written, with the ability to structure thoughts logically during discussions and presentations. Capability to simplify complex concepts into easily understandable frameworks and presentations. Proficiency in working within a virtual global team environment, contributing to the timely delivery of multiple projects. Travel to other offices as required to collaborate with clients and internal project teams. Location: Bengaluru, Gurugram, Noida, Pune
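The data quality rules this role configures in a catalog tool (Collibra/Alation) amount to named checks applied per record; the sketch below is a generic in-memory stand-in, with hypothetical field names and rules, not any vendor's API:

```python
# Data-quality rule sketch: each rule flags the indices of records
# that violate a check (completeness, validity, range).

RULES = {
    "id_present": lambda r: r.get("id") is not None,
    "email_has_at": lambda r: "@" in r.get("email", ""),
    "amount_non_negative": lambda r: r.get("amount", 0) >= 0,
}

def run_rules(records):
    failures = {name: [] for name in RULES}
    for i, record in enumerate(records):
        for name, check in RULES.items():
            if not check(record):
                failures[name].append(i)
    return failures

records = [
    {"id": 1, "email": "a@x.com", "amount": 10},
    {"id": None, "email": "bad", "amount": -5},
]
result = run_rules(records)
print(result)
```

Catalog tools layer scheduling, lineage, and dashboards on top, but the rule definitions themselves follow this shape.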
Posted 3 days ago
2.0 - 5.0 years
4 - 7 Lacs
Pune
Work from Office
Learn more about our diversity, equity, and inclusion efforts and the networks ZS supports to assist our ZSers in cultivating community spaces, obtaining the resources they need to thrive, and sharing the messages they are passionate about. ZS's Platform Development team designs, implements, tests, and supports ZS's ZAIDYN Platform, which helps drive superior customer experiences and revenue outcomes through integrated products & analytics. Whether writing distributed optimization algorithms or advanced mapping and visualization interfaces, you will have an opportunity to solve challenging problems, make an immediate impact, and contribute to better health outcomes. What you'll do: As part of our full-stack product engineering team, you will build multi-tenant cloud-based software products/platforms and internal assets that leverage cutting-edge technologies based on the Amazon AWS cloud platform. Pair program, write unit tests, lead code reviews, and collaborate with QA analysts to ensure you develop the highest quality multi-tenant software that can be productized.
Work with junior developers to implement large features that are on the cutting edge of big data. Be a technical leader to your team, and help them improve their technical skills. Stand up for engineering practices that ensure quality products: automated testing, unit testing, agile development, continuous integration, code reviews, and technical design. Work with product managers and architects to design product architecture and to work on POCs. Take immediate responsibility for project deliverables. Understand client business issues and design features that meet client needs. Undergo on-the-job and formal trainings and certifications, and constantly advance your knowledge and problem-solving skills. What you'll bring: 1-3 years of experience in developing software, ideally building SaaS products and services. Bachelor's degree in CS, IT, or a related discipline. Strong analytic, problem-solving, and programming ability. Good hands-on experience with AWS services (EC2, EMR, S3, serverless stack, RDS, SageMaker, IAM, EKS, etc.). Experience in coding in an object-oriented language such as Python, Java, or C#. Hands-on experience with Apache Spark, EMR, Hadoop, HDFS, or other big data technologies. Experience with development on the AWS (Amazon Web Services) platform is preferable. Experience in Linux shell or PowerShell scripting is preferable. Experience in HTML5, JavaScript, and JavaScript libraries is preferable. Good to have: Pharma domain understanding. Initiative and drive to contribute. Excellent organizational and task management skills. Strong communication skills. Ability to work in global cross-office teams. ZS is a global firm; fluency in English is required.
Posted 3 days ago
1.0 - 6.0 years
8 - 13 Lacs
Pune
Work from Office
Azure Data Engineer (Pune, India - Enterprise IT - 22756). What you'll do: Create and maintain optimal data pipeline architecture. Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, and re-designing infrastructure for scalability. Design, develop, and deploy high-volume ETL pipelines to manage complex and near-real-time data collection. Develop and optimize SQL queries and stored procedures to meet business requirements. Design, implement, and maintain REST APIs for data interaction between systems. Ensure performance, security, and availability of databases. Handle common database procedures such as upgrade, backup, recovery, migration, etc. Collaborate with other team members and stakeholders. Prepare documentation and specifications. What you'll bring: Bachelor's degree in Computer Science, Information Technology, or a related field. 1+ years of experience with SQL, T-SQL, Azure Data Factory, Synapse, or relevant ETL technology. Strong analytical skills (impact/risk analysis, root cause analysis, etc.). Proven ability to work in a team environment, creating partnerships across multiple levels. Demonstrated drive for results, with appropriate attention to detail and commitment. Hands-on experience with Azure SQL Database. Perks & Benefits: ZS offers a comprehensive total rewards package including health and well-being, financial planning, annual leave, personal growth, and professional development.
Our robust skills development programs, multiple career progression options, internal mobility paths, and collaborative culture empower you to thrive as an individual and global team member. We are committed to giving our employees a flexible and connected way of working. A flexible and connected ZS allows us to combine work from home and on-site presence at clients/ZS offices for the majority of our week. The magic of ZS culture and innovation thrives in both planned and spontaneous face-to-face connections. Travel: Travel is a requirement at ZS for client-facing ZSers; business needs of your project and client are the priority. While some projects may be local, all client-facing ZSers should be prepared to travel as needed. Travel provides opportunities to strengthen client relationships, gain diverse experiences, and enhance professional growth by working in different environments and cultures. Considering applying? At ZS, we're building a diverse and inclusive company where people bring their passions to inspire life-changing impact and deliver better outcomes for all. We are most interested in finding the best candidate for the job and recognize the value that candidates with all backgrounds, including non-traditional ones, bring. If you are interested in joining us, we encourage you to apply even if you don't meet 100% of the requirements listed above. ZS is an equal opportunity employer and is committed to providing equal employment and advancement opportunities without regard to any class protected by applicable law. To Complete Your Application: Candidates must possess or be able to obtain work authorization for their intended country of employment. An online application, including a full set of transcripts (official or unofficial), is required to be considered. NO AGENCY CALLS, PLEASE.
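The ETL pipeline and SQL work described in this listing can be sketched end-to-end in miniature. The source rows, schema, and table name below are hypothetical, and the stdlib sqlite3 module stands in for Azure SQL:

```python
import sqlite3

# Minimal ETL sketch: extract rows, transform (cleanse + derive a
# field), and load into a SQL table.

def extract():
    # Stand-in for pulling from a source system or landing zone.
    return [{"name": " Alice ", "units": 3, "price": 2.5},
            {"name": "Bob",     "units": 2, "price": 4.0}]

def transform(rows):
    # Cleanse the name and derive an order total.
    return [(r["name"].strip(), r["units"] * r["price"]) for r in rows]

def load(rows, conn):
    conn.execute("CREATE TABLE sales (name TEXT, total REAL)")
    conn.executemany("INSERT INTO sales VALUES (?, ?)", rows)

conn = sqlite3.connect(":memory:")
load(transform(extract()), conn)
rows = conn.execute("SELECT name, total FROM sales ORDER BY name").fetchall()
print(rows)
```

A production pipeline in Azure Data Factory or Synapse adds orchestration, incremental loads, and monitoring around these same three stages.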
Posted 3 days ago
0.0 - 3.0 years
3 - 5 Lacs
Pune, Gurugram
Work from Office
What you'll do: Collaborate with ZS internal teams and client teams to shape and implement high-quality technology solutions that address critical business problems. Understand and analyze business problems thoroughly, and translate them into technical designs effectively. Design and implement technical features using best practices for the specific technology stack being used. Assist in the development phase of implementing technology solutions for client engagements, ensuring effective problem-solving. Apply appropriate development methodologies (e.g., agile, waterfall, system integrated testing, mid-development client reviews, embedded QA procedures, unit testing) to ensure successful and timely completion of projects. Provide guidance and support to team members in creating comprehensive project implementation plans. Work closely with a development team to accurately interpret and implement business requirements. What you'll bring: Bachelor's or Master's degree in Business Analytics, Computer Science, MIS, or a related field with academic excellence. Proficiency in RDBMS concepts, SQL, and programming languages such as Python. Strong analytical and problem-solving skills to convert intricate business requirements into technology solutions. Knowledge of algorithms and data structures. Additional Skills: 0-3+ years of relevant professional experience in delivering small/medium-scale technology solutions. Strong verbal and written communication skills to effectively convey results and issues to internal and client teams. Familiarity with big data concepts and cloud platforms like AWS, Azure, and Google Cloud Platform.
Posted 3 days ago
2.0 - 7.0 years
7 - 11 Lacs
Bengaluru
Work from Office
DevOps Engineer - Bangalore. Job Description: DevOps Engineer, Qilin Lab, Bangalore, India. Role: We are seeking an experienced DevOps Engineer to deliver insights from massive-scale data in real time. Specifically, we're searching for someone who has fresh ideas and a unique viewpoint, and who enjoys collaborating with a cross-functional team to develop real-world solutions and positive user experiences for everyone. Responsibilities of this role: Work with DevOps to run the production environment by monitoring availability and taking a holistic view of system health. Build software and systems to manage our Data Platform infrastructure. Improve reliability, quality, and time-to-market of our Global Data Platform. Measure and optimize system performance and innovate for continual improvement. Provide operational support and engineering for a distributed platform at scale: Define, publish, and defend service-level objectives (SLOs). Partner with data engineers to improve services through rigorous testing and release procedures. Participate in system design, platform management, and capacity planning. Create sustainable systems and services through automation and automated run-books. Take a proactive approach to identifying problems and seeking areas for improvement. Mentor the team in infrastructure best practices. Qualifications: Bachelor's degree in Computer Science or an IT-related field, or equivalent practical experience with a proven track record. The following hands-on working knowledge and experience is required: Kubernetes, EC2, RDS, ELK Stack; cloud platforms (AWS, Azure, GCP), preferably AWS; building and operating clusters; related technologies such as containers, Helm, Kustomize, Argo CD; ability to program (structured and OOP) using at least one high-level language such as Python, Java, Go, etc.; agile methodologies (Scrum, TDD, BDD, etc.); continuous integration and continuous delivery tools (GitOps); Terraform; Unix/Linux environments. Experience with several of the following tools/technologies is desirable: big data platforms
(e.g., Apache Hadoop and Apache Spark); streaming technologies (Kafka, Kinesis, etc.); Elasticsearch Service; mesh orchestration technologies, e.g., Argo. Knowledge of the following is a plus: security (OWASP, SIEM, etc.), infrastructure testing (chaos, load, security), GitHub, microservices architectures. Notice period: Immediate to 15 days. Experience: 3 to 5 years. Job Type: Full-time. Schedule: Day shift, Monday to Friday. Work Location: On site. Job Type: Payroll. Must-Have Skills: Python - 3 years, intermediate; DevOps - 3 years, intermediate; AWS - 2 years, intermediate; Agile methodology - 3 years, intermediate; Kubernetes - 3 years, intermediate; Elasticsearch - 3 years, intermediate.
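The SLO responsibility named above rests on simple error-budget arithmetic; a sketch (the 99.9% target and 30-day window are illustrative, not from the posting):

```python
# Error-budget sketch: an availability SLO over a rolling window
# implies a fixed budget of allowable downtime.

def error_budget_minutes(slo, days=30):
    # e.g. a 99.9% SLO over 30 days allows (1 - 0.999) * 43200 minutes.
    return (1 - slo) * days * 24 * 60

def budget_remaining(slo, downtime_minutes, days=30):
    return error_budget_minutes(slo, days) - downtime_minutes

budget = error_budget_minutes(0.999)   # 43.2 minutes per 30 days
print(round(budget, 1), budget_remaining(0.999, 30.0) > 0)
```

Teams typically gate risky releases on whether the remaining budget is positive, which is what makes the SLO actionable rather than aspirational.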
Posted 3 days ago
10.0 - 14.0 years
20 - 30 Lacs
Noida, Pune, Bengaluru
Hybrid
Greetings from Infogain! We have an immediate requirement for a Big Data Engineer (Lead) position at Infogain India Pvt. Ltd. As a Big Data Engineer (Lead), you will be responsible for leading a team of big data engineers. You will work closely with clients and team members to understand their requirements and develop architectures that meet their needs. You will also be responsible for providing technical leadership and guidance to your team. Mode of hiring: Permanent. Skills: (Azure OR AWS) AND (Apache Spark OR Hive OR Hadoop) AND (Spark Streaming OR Apache Flink OR Kafka) AND NoSQL AND (Shell OR Python). Experience: 10 to 14 years. Location: Bangalore/Noida/Gurgaon/Pune/Mumbai/Kochi. Notice period: Early joiner. Educational qualification: BE/BTech/MCA/M.Tech. Working Experience: 12-15 years of broad experience working with enterprise IT applications in cloud platform and big data environments. Competencies & Personal Traits: Work as a team player. Excellent problem analysis skills. Experience with at least one cloud infra provider (Azure/AWS). Experience in building data pipelines using batch processing with Apache Spark (Spark SQL, DataFrame API) or Hive query language (HQL). Experience in building streaming data pipelines using Apache Spark Structured Streaming or Apache Flink on Kafka and Delta Lake. Knowledge of NoSQL databases. Good to have experience in Cosmos DB, RESTful APIs, and GraphQL. Knowledge of big data ETL processing tools, data modelling, and data mapping. Experience with Hive and Hadoop file formats (Avro/Parquet/ORC). Basic knowledge of scripting (shell/bash). Experience working with multiple data sources including relational databases (SQL Server/Oracle/DB2/Netezza), NoSQL/document databases, and flat files. Basic understanding of CI/CD tools such as Jenkins, JIRA, Bitbucket, Artifactory, Bamboo, and Azure DevOps.
Basic understanding of DevOps practices using Git version control. Ability to debug, fine-tune, and optimize large-scale data processing jobs. You can share your CV at arti.sharma@infogain.com with the following details: total experience, relevant experience in big data, relevant experience in AWS or Azure cloud, current CTC, expected CTC, current location, and whether you are OK with the Bangalore location.
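The streaming-pipeline skills this listing asks for (Spark Structured Streaming or Flink on Kafka) center on windowed aggregations over an unbounded event stream. The toy, single-process counter below only illustrates the sliding-window idea and is not engine code:

```python
from collections import deque

# Toy sliding-window aggregation: keep the last N events and emit a
# running count per key, the way a streaming engine would per window.

class WindowedCounter:
    def __init__(self, window_size):
        self.window = deque(maxlen=window_size)  # oldest events fall out

    def ingest(self, key):
        self.window.append(key)
        return sum(1 for k in self.window if k == key)

counter = WindowedCounter(window_size=3)
results = [counter.ingest(k) for k in ["a", "a", "b", "a", "a"]]
print(results)
```

Real engines add event-time semantics, watermarks for late data, and fault-tolerant state, but the per-window aggregation they compute has this shape.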
Posted 3 days ago
2.0 - 5.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Description: Retail Business Services (RBS) supports Amazon's Retail business growth WW through three core tasks: (a) Selection, where RBS sources, creates, and enriches ASINs to drive GMS growth; (b) Defect Elimination, where RBS resolves inbound supply chain defects and develops root cause fixes to improve free cash flow; and (c) supporting operational processes for WW Retail teams where there is an air gap in the tech stack. The tech team in RBS develops automation that leverages machine/deep learning to scale execution of these highly complex tasks that currently require human cognitive skills. Our solutions ensure that information in Amazon's catalog is complete, correct, and comprehensive enough to give Amazon customers a great shopping experience every time. That's where you can help. We are looking for a sharp, experienced Application Engineer (AE) with a diverse skillset and background. As an AE, you will work directly with our business teams to solve their support needs with the existing applications, and collect requirements and ways to build highly scalable solutions in collaboration with other technical teams. You will play an active role in translating business and functional requirements into concrete deliverables and building scalable systems. You will also contribute to keeping the services healthy and robust. You will be responsible for implementing and maintaining the solutions you provide. You will work closely with engineers on maintaining multiple products and services, creating process automation scripts, monitoring, and handling ad-hoc operational asks. Basic Qualifications: 2+ years of software development or 2+ years of technical support experience. Experience troubleshooting and debugging technical systems. Experience in Unix. Experience scripting in modern programming languages. Knowledge of Python, PySpark, big data, and SQL queries. Preferred Qualifications: Knowledge of web services, distributed systems, and web application development. Experience with REST web services, XML, and JSON. Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, see amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner. Company: ADCI BLR 14 SEZ. Job ID: A3032552
Posted 3 days ago
6.0 - 10.0 years
20 - 27 Lacs
Pune, Chennai
Work from Office
Mandatory: Experience and knowledge in designing, implementing, and managing non-relational data stores (e.g., MongoDB, Cassandra, DynamoDB), focusing on flexible schema design, scalability, and performance optimization for handling large volumes of unstructured or semi-structured data. Mainly, the client needs a NoSQL DB: either MongoDB or HBase. Data Pipeline Development: Design, develop, test, and deploy robust, high-performance, and scalable ETL/ELT data pipelines using Scala and Apache Spark to ingest, process, and transform large volumes of structured and unstructured data from diverse sources. Big Data Expertise: Leverage expertise in the Hadoop ecosystem (HDFS, Hive, etc.) and distributed computing principles to build efficient and fault-tolerant data solutions. Advanced SQL: Write complex, optimized SQL queries and stored procedures. Performance Optimization: Continuously monitor, analyze, and optimize the performance of data pipelines and data stores. Troubleshoot complex data-related issues, identify bottlenecks, and implement solutions for improved efficiency and reliability. Data Quality & Governance: Implement data quality checks, validation rules, and reconciliation processes to ensure the accuracy, completeness, and consistency of data. Contribute to data governance and security best practices. Automation & CI/CD: Implement automation for data pipeline deployment, monitoring, and alerting using tools like Apache Airflow, Jenkins, or similar CI/CD platforms. Documentation: Create and maintain comprehensive technical documentation for data architectures, pipelines, and processes. Required Skills & Qualifications: Bachelor's or Master's degree in Computer Science, Engineering, or a related quantitative field. Minimum 5 years of professional experience in Data Engineering, with a strong focus on big data technologies. Proficiency in Scala for developing big data applications and transformations, especially with Apache Spark.
Expert-level proficiency in SQL; ability to write complex queries, optimize performance, and understand database internals. Extensive hands-on experience with Apache Spark (Spark SQL, DataFrames, RDDs) for large-scale data processing and analytics. Solid understanding of distributed computing concepts and experience with the Hadoop ecosystem (HDFS, Hive). Experience with building and optimizing ETL/ELT processes and data warehousing concepts. Strong understanding of data modeling techniques (e.g., Star Schema, Snowflake Schema). Familiarity with version control systems (e.g., Git). Excellent problem-solving, analytical, and communication skills. Ability to work independently and collaboratively in an Agile team environment.
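The flexible-schema point made about NoSQL stores (MongoDB, Cassandra, DynamoDB) can be shown with a tiny in-memory stand-in: documents in one collection need not share fields, and queries filter on whichever fields are present. The `Collection` class and product fields are hypothetical, not a MongoDB API:

```python
# In-memory sketch of a schemaless document collection.

class Collection:
    def __init__(self):
        self.docs = []

    def insert(self, doc):
        # No schema enforcement: any dict of fields is accepted.
        self.docs.append(doc)

    def find(self, **criteria):
        # Match only documents where every given field equals the value;
        # documents missing a field simply don't match.
        return [d for d in self.docs
                if all(d.get(k) == v for k, v in criteria.items())]

products = Collection()
products.insert({"sku": "A1", "name": "cable", "length_m": 2})
products.insert({"sku": "B2", "name": "charger", "watts": 30})  # different fields

print([d["sku"] for d in products.find(watts=30)])
```

Real document stores add indexes, sharding, and richer query operators, but this is the schema-flexibility trade-off in miniature: easy heterogeneous inserts, with validation pushed to the application.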
Posted 3 days ago
6.0 - 11.0 years
12 - 17 Lacs
Bengaluru
Work from Office
Role Overview: The Enterprise Architect has strong expertise in AWS AI technologies and Anthropic systems. The jobholder designs and implements cutting-edge AI solutions that align with organizational goals, ensuring scalability, security, and innovation.
Responsibilities: Architect and implement AI solutions using AWS AI services and Anthropic systems. Collaborate with stakeholders to understand business requirements and translate them into technical solutions. Design scalable and secure architectures for AI-driven applications. Optimize AI workflows for performance and reliability. Provide technical leadership and mentorship to development teams. Stay updated on emerging trends in AI and cloud technologies. Troubleshoot and resolve complex technical issues related to AI systems. Document architectural designs and decisions for future reference.
Eligibility Criteria: Bachelor's degree in Computer Science, Information Technology, or a related field. Extensive experience with AWS AI services and Anthropic systems. Strong understanding of AI architecture, design, and optimization. Proficiency in programming languages such as Python and Java. Experience with cloud-based AI solutions is a plus. Familiarity with Agile development methodologies. Knowledge of data governance and compliance standards. Excellent problem-solving and analytical skills. Proven leadership and team management abilities. Ability to work in a fast-paced environment and manage multiple priorities.
Posted 3 days ago
4.0 - 7.0 years
4 - 7 Lacs
Bengaluru
Work from Office
Role Overview: At Skyhigh Security, we are building ground-breaking technology to help enterprises enable and accelerate the safe adoption of cloud services. SSE products help the world's largest organizations unleash the power of the cloud by providing real-time protection for enterprise data and users across all cloud services.
The Data Analytics team of our cloud service BU is looking for a capable, enthusiastic Big Data Test Engineer: a creative, innovative, and results-oriented person willing to go the extra mile in a fast-paced environment. You will take ownership of major big data components/services and all backend aspects of the software life cycle in a SaaS environment. The Data Analytics team manages the big data pipelines and machine learning systems behind our Skyhigh Security Cloud. We analyze more than 40 terabytes of data per day and inspect close to a billion user activities in real time for threat protection and monitoring.
As a member of our engineering team, you'll provide technical expertise (architecture, design, development, code reviews, use of modern static analysis tools, unit testing and system integration, automated testing, etc.). The role requires frequent use of ingenuity, creativity, and outside-the-box thinking to contribute effectively to our analytics solution and capabilities. We firmly believe in our values; they are what make us tick as one of the most successful teams within Skyhigh Security. The more these values resonate with you, the better your chance of thriving in our environment.
You find clarity and make the right decisions despite ambiguity. You are curious in general and fascinated by how things work. You listen well before you respond to others. You want to make an impact on the team and the company. You are not afraid to speak your mind and are willing to put the team ahead of yourself. You are humble and genuinely want to help your team members. You can remain calm even in the most stressful situations. You aim for simplicity in whatever you do.
The successful candidate possesses the excellent interpersonal and communication skills required to partner with other teams across the business to identify opportunities and risks, and to develop and deliver solutions that support business strategies. This individual will report to the Senior Engineering Manager within the Cloud Business Unit and will be based in Bangalore, India.
About the role: End-to-end software development and testing; automate, build, maintain, and provide production support for big data pipelines and the Hadoop ecosystem. Recognize the big picture and take initiative to solve problems and automate. Stay aware of current big data technology trends and factor them into design and implementation. Document the test plan and automation strategy and present them to stakeholders. Identify, recommend, coordinate, and deliver timely knowledge to globally distributed teams regarding technologies, processes, and tools. Proactively identify and communicate roadblocks.
About You - Minimum Requirements: Bachelor's degree in Computer Science or an equivalent degree; a Master's degree is a plus. Overall 4 to 7 years of experience. Able to contribute individually as needed and coordinate with other teams. Good exposure to test frameworks like JUnit, TestNG, and Cucumber, and to mocking frameworks. Experience developing applications with Java and Spring. Test, develop, and implement automated tests using Python. Experience with any automation framework; hands-on experience with Robot Framework is a plus.
Big data experience is a plus. Exposure to Agile development, TDD, and Lean development. Experience with AWS CloudFormation, CloudWatch, SQS, and Lambda is a plus.
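The automated-testing and mocking-framework requirements above can be sketched with Python's standard library alone. The `fetch_user_count` function and its injected HTTP client are hypothetical stand-ins for a real service under test; the point is the pattern, not the specific API.

```python
from unittest import mock

# Hypothetical code under test: counts users returned by an API client.
def fetch_user_count(client):
    response = client.get("/users")  # client is injected, so it can be mocked
    return len(response["users"])

# A mock replaces the real HTTP client, so the test needs no network access.
fake_client = mock.Mock()
fake_client.get.return_value = {"users": ["alice", "bob", "carol"]}

count = fetch_user_count(fake_client)
assert count == 3
fake_client.get.assert_called_once_with("/users")  # verify the interaction
print("test passed:", count)  # prints "test passed: 3"
```

In a real suite this would live in a `unittest.TestCase` or pytest function; the same dependency-injection idea carries over to Robot Framework keywords and JUnit/TestNG tests with a mocking library.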
Posted 3 days ago