8.0 - 10.0 years
13 - 17 Lacs
Pune
Remote
We are seeking a highly experienced and skilled Data Architect to join our dynamic team. In this pivotal role, you will work directly with our enterprise customers, leveraging your expertise in the Adobe Experience Platform (AEP) to design and implement robust data solutions. The ideal candidate will have a deep understanding of data modeling, SQL, ETL processes, and customer data architecture, with a proven track record of delivering results in large-scale environments.

About the Role:
As a Data Architect specializing in AEP, you will be instrumental in transforming complex customer data into actionable insights. You will serve as a technical expert, guiding clients through data integration, standardization, and activation within the Adobe ecosystem. This is an opportunity to lead projects, innovate solutions, and make a significant impact on our clients' data strategies.

Key Responsibilities:
- Interface directly with Adobe enterprise customers to gather requirements, design tailored data solutions, and provide architectural recommendations.
- Collaborate with onshore and offshore engineering teams to ensure efficient, high-quality solution delivery.
- Produce clear, comprehensive technical specification documents for the implementation of data solutions.
- Design and structure data models that enable advanced customer-level analytics and reporting.
- Develop and implement robust customer ID mapping processes to create a unified customer view across diverse data sources (a minimal sketch follows this posting).
- Automate data movement, cleansing, and transformation using scripting languages and tools.
- Lead client calls and manage project deliverables, ensuring timelines and expectations are met.
- Identify and propose solutions to complex customer data challenges and continuously optimize data workflows.

Requirements:
- 10+ years of experience in data transformation and ETL across large, complex datasets.
- 5+ years of hands-on experience in data modeling (relational, dimensional, and big data paradigms).
- Expertise with SQL/NoSQL databases and a strong understanding of data warehouse concepts.
- Proficiency with industry-leading ETL tools (e.g., Informatica).
- Proven experience with reporting and business intelligence tools such as Tableau and Power BI.
- In-depth knowledge of customer-centric datasets (e.g., CRM, call center, marketing automation, web analytics).
- Exceptional communication, time management, and multitasking skills, with the ability to explain complex technical concepts to diverse audiences.
- Bachelor's or Master's degree in Computer Science, Data Science, or a closely related quantitative field.
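For illustration, here is a minimal PySpark sketch of the kind of customer ID mapping described above: stitching CRM and web analytics records into one unified view. The file paths and column names (crm_id, cookie_id, email) are hypothetical, and this shows generic identity stitching rather than AEP's built-in identity service.

```python
# Hypothetical illustration of customer ID mapping: unify CRM and web
# analytics records into a single customer view keyed by a hashed email.
from pyspark.sql import SparkSession
import pyspark.sql.functions as F

spark = SparkSession.builder.appName("customer-id-mapping").getOrCreate()

# Assumed inputs: each source carries its own native ID plus a shared key.
crm = spark.read.parquet("/data/crm_contacts")   # columns: crm_id, email, ...
web = spark.read.parquet("/data/web_events")     # columns: cookie_id, email, ...

# Normalize the join key the same way on both sides before hashing.
norm = lambda col: F.sha2(F.lower(F.trim(col)), 256)
crm = crm.withColumn("email_key", norm("email"))
web = web.withColumn("email_key", norm("email"))

# A full outer join keeps customers known to only one system; the unified
# customer_id falls back to whichever native ID is present.
unified = (
    crm.select("email_key", "crm_id")
       .join(web.select("email_key", "cookie_id").distinct(),
             "email_key", "full_outer")
       .withColumn("customer_id", F.coalesce("crm_id", "cookie_id"))
)
unified.write.mode("overwrite").parquet("/data/unified_customer_view")
```

In practice the match key and survivorship rules (which ID wins) are business decisions; the hash-and-join pattern stays the same.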
Posted 1 month ago
8.0 - 10.0 years
13 - 17 Lacs
Thane
Remote
We are seeking a highly experienced and skilled Data Architect to join our dynamic team. In this pivotal role, you will work directly with our enterprise customers, leveraging your expertise in the Adobe Experience Platform (AEP) to design and implement robust data solutions. The ideal candidate will have a deep understanding of data modeling, SQL, ETL processes, and customer data architecture, with a proven track record of delivering results in large-scale environments.

About the Role:
As a Data Architect specializing in AEP, you will be instrumental in transforming complex customer data into actionable insights. You will serve as a technical expert, guiding clients through data integration, standardization, and activation within the Adobe ecosystem. This is an opportunity to lead projects, innovate solutions, and make a significant impact on our clients' data strategies.

Key Responsibilities:
- Interface directly with Adobe enterprise customers to gather requirements, design tailored data solutions, and provide architectural recommendations.
- Collaborate with onshore and offshore engineering teams to ensure efficient, high-quality solution delivery.
- Produce clear, comprehensive technical specification documents for the implementation of data solutions.
- Design and structure data models that enable advanced customer-level analytics and reporting.
- Develop and implement robust customer ID mapping processes to create a unified customer view across diverse data sources.
- Automate data movement, cleansing, and transformation using scripting languages and tools.
- Lead client calls and manage project deliverables, ensuring timelines and expectations are met.
- Identify and propose solutions to complex customer data challenges and continuously optimize data workflows.

Requirements:
- 10+ years of experience in data transformation and ETL across large, complex datasets.
- 5+ years of hands-on experience in data modeling (relational, dimensional, and big data paradigms).
- Expertise with SQL/NoSQL databases and a strong understanding of data warehouse concepts.
- Proficiency with industry-leading ETL tools (e.g., Informatica).
- Proven experience with reporting and business intelligence tools such as Tableau and Power BI.
- In-depth knowledge of customer-centric datasets (e.g., CRM, call center, marketing automation, web analytics).
- Exceptional communication, time management, and multitasking skills, with the ability to explain complex technical concepts to diverse audiences.
- Bachelor's or Master's degree in Computer Science, Data Science, or a closely related quantitative field.
Posted 1 month ago
8.0 - 10.0 years
13 - 17 Lacs
Mumbai
Remote
We are seeking a highly experienced and skilled Data Architect to join our dynamic team. In this pivotal role, you will work directly with our enterprise customers, leveraging your expertise in the Adobe Experience Platform (AEP) to design and implement robust data solutions. The ideal candidate will have a deep understanding of data modeling, SQL, ETL processes, and customer data architecture, with a proven track record of delivering results in large-scale environments.

About the Role:
As a Data Architect specializing in AEP, you will be instrumental in transforming complex customer data into actionable insights. You will serve as a technical expert, guiding clients through data integration, standardization, and activation within the Adobe ecosystem. This is an opportunity to lead projects, innovate solutions, and make a significant impact on our clients' data strategies.

Key Responsibilities:
- Interface directly with Adobe enterprise customers to gather requirements, design tailored data solutions, and provide architectural recommendations.
- Collaborate with onshore and offshore engineering teams to ensure efficient, high-quality solution delivery.
- Produce clear, comprehensive technical specification documents for the implementation of data solutions.
- Design and structure data models that enable advanced customer-level analytics and reporting.
- Develop and implement robust customer ID mapping processes to create a unified customer view across diverse data sources.
- Automate data movement, cleansing, and transformation using scripting languages and tools.
- Lead client calls and manage project deliverables, ensuring timelines and expectations are met.
- Identify and propose solutions to complex customer data challenges and continuously optimize data workflows.

Requirements:
- 10+ years of experience in data transformation and ETL across large, complex datasets.
- 5+ years of hands-on experience in data modeling (relational, dimensional, and big data paradigms).
- Expertise with SQL/NoSQL databases and a strong understanding of data warehouse concepts.
- Proficiency with industry-leading ETL tools (e.g., Informatica).
- Proven experience with reporting and business intelligence tools such as Tableau and Power BI.
- In-depth knowledge of customer-centric datasets (e.g., CRM, call center, marketing automation, web analytics).
- Exceptional communication, time management, and multitasking skills, with the ability to explain complex technical concepts to diverse audiences.
- Bachelor's or Master's degree in Computer Science, Data Science, or a closely related quantitative field.
Posted 1 month ago
8.0 - 10.0 years
13 - 17 Lacs
Ahmedabad
Remote
We are seeking a highly experienced and skilled Data Architect to join our dynamic team. In this pivotal role, you will work directly with our enterprise customers, leveraging your expertise in the Adobe Experience Platform (AEP) to design and implement robust data solutions. The ideal candidate will have a deep understanding of data modeling, SQL, ETL processes, and customer data architecture, with a proven track record of delivering results in large-scale environments.

About the Role:
As a Data Architect specializing in AEP, you will be instrumental in transforming complex customer data into actionable insights. You will serve as a technical expert, guiding clients through data integration, standardization, and activation within the Adobe ecosystem. This is an opportunity to lead projects, innovate solutions, and make a significant impact on our clients' data strategies.

Key Responsibilities:
- Interface directly with Adobe enterprise customers to gather requirements, design tailored data solutions, and provide architectural recommendations.
- Collaborate with onshore and offshore engineering teams to ensure efficient, high-quality solution delivery.
- Produce clear, comprehensive technical specification documents for the implementation of data solutions.
- Design and structure data models that enable advanced customer-level analytics and reporting.
- Develop and implement robust customer ID mapping processes to create a unified customer view across diverse data sources.
- Automate data movement, cleansing, and transformation using scripting languages and tools.
- Lead client calls and manage project deliverables, ensuring timelines and expectations are met.
- Identify and propose solutions to complex customer data challenges and continuously optimize data workflows.

Requirements:
- 10+ years of experience in data transformation and ETL across large, complex datasets.
- 5+ years of hands-on experience in data modeling (relational, dimensional, and big data paradigms).
- Expertise with SQL/NoSQL databases and a strong understanding of data warehouse concepts.
- Proficiency with industry-leading ETL tools (e.g., Informatica).
- Proven experience with reporting and business intelligence tools such as Tableau and Power BI.
- In-depth knowledge of customer-centric datasets (e.g., CRM, call center, marketing automation, web analytics).
- Exceptional communication, time management, and multitasking skills, with the ability to explain complex technical concepts to diverse audiences.
- Bachelor's or Master's degree in Computer Science, Data Science, or a closely related quantitative field.
Posted 1 month ago
1.0 - 3.0 years
13 - 17 Lacs
Lucknow
Remote
We are seeking a highly experienced and skilled Data Architect to join our dynamic team. In this pivotal role, you will work directly with our enterprise customers, leveraging your expertise in the Adobe Experience Platform (AEP) to design and implement robust data solutions. The ideal candidate will have a deep understanding of data modeling, SQL, ETL processes, and customer data architecture, with a proven track record of delivering results in large-scale environments.

About the Role:
As a Data Architect specializing in AEP, you will be instrumental in transforming complex customer data into actionable insights. You will serve as a technical expert, guiding clients through data integration, standardization, and activation within the Adobe ecosystem. This is an opportunity to lead projects, innovate solutions, and make a significant impact on our clients' data strategies.

Key Responsibilities:
- Interface directly with Adobe enterprise customers to gather requirements, design tailored data solutions, and provide architectural recommendations.
- Collaborate with onshore and offshore engineering teams to ensure efficient, high-quality solution delivery.
- Produce clear, comprehensive technical specification documents for the implementation of data solutions.
- Design and structure data models that enable advanced customer-level analytics and reporting.
- Develop and implement robust customer ID mapping processes to create a unified customer view across diverse data sources.
- Automate data movement, cleansing, and transformation using scripting languages and tools.
- Lead client calls and manage project deliverables, ensuring timelines and expectations are met.
- Identify and propose solutions to complex customer data challenges and continuously optimize data workflows.

Requirements:
- 10+ years of experience in data transformation and ETL across large, complex datasets.
- 5+ years of hands-on experience in data modeling (relational, dimensional, and big data paradigms).
- Expertise with SQL/NoSQL databases and a strong understanding of data warehouse concepts.
- Proficiency with industry-leading ETL tools (e.g., Informatica).
- Proven experience with reporting and business intelligence tools such as Tableau and Power BI.
- In-depth knowledge of customer-centric datasets (e.g., CRM, call center, marketing automation, web analytics).
- Exceptional communication, time management, and multitasking skills, with the ability to explain complex technical concepts to diverse audiences.
- Bachelor's or Master's degree in Computer Science, Data Science, or a closely related quantitative field.
Posted 1 month ago
1.0 - 3.0 years
13 - 17 Lacs
Visakhapatnam
Remote
We are seeking a highly experienced and skilled Data Architect to join our dynamic team. In this pivotal role, you will work directly with our enterprise customers, leveraging your expertise in the Adobe Experience Platform (AEP) to design and implement robust data solutions. The ideal candidate will have a deep understanding of data modeling, SQL, ETL processes, and customer data architecture, with a proven track record of delivering results in large-scale environments.

About the Role:
As a Data Architect specializing in AEP, you will be instrumental in transforming complex customer data into actionable insights. You will serve as a technical expert, guiding clients through data integration, standardization, and activation within the Adobe ecosystem. This is an opportunity to lead projects, innovate solutions, and make a significant impact on our clients' data strategies.

Key Responsibilities:
- Interface directly with Adobe enterprise customers to gather requirements, design tailored data solutions, and provide architectural recommendations.
- Collaborate with onshore and offshore engineering teams to ensure efficient, high-quality solution delivery.
- Produce clear, comprehensive technical specification documents for the implementation of data solutions.
- Design and structure data models that enable advanced customer-level analytics and reporting.
- Develop and implement robust customer ID mapping processes to create a unified customer view across diverse data sources.
- Automate data movement, cleansing, and transformation using scripting languages and tools.
- Lead client calls and manage project deliverables, ensuring timelines and expectations are met.
- Identify and propose solutions to complex customer data challenges and continuously optimize data workflows.

Requirements:
- 10+ years of experience in data transformation and ETL across large, complex datasets.
- 5+ years of hands-on experience in data modeling (relational, dimensional, and big data paradigms).
- Expertise with SQL/NoSQL databases and a strong understanding of data warehouse concepts.
- Proficiency with industry-leading ETL tools (e.g., Informatica).
- Proven experience with reporting and business intelligence tools such as Tableau and Power BI.
- In-depth knowledge of customer-centric datasets (e.g., CRM, call center, marketing automation, web analytics).
- Exceptional communication, time management, and multitasking skills, with the ability to explain complex technical concepts to diverse audiences.
- Bachelor's or Master's degree in Computer Science, Data Science, or a closely related quantitative field.
Posted 1 month ago
2.0 - 5.0 years
13 - 17 Lacs
Hyderabad
Remote
We are seeking a highly experienced and skilled Data Architect to join our dynamic team. In this pivotal role, you will work directly with our enterprise customers, leveraging your expertise in the Adobe Experience Platform (AEP) to design and implement robust data solutions. The ideal candidate will have a deep understanding of data modeling, SQL, ETL processes, and customer data architecture, with a proven track record of delivering results in large-scale environments.

About the Role:
As a Data Architect specializing in AEP, you will be instrumental in transforming complex customer data into actionable insights. You will serve as a technical expert, guiding clients through data integration, standardization, and activation within the Adobe ecosystem. This is an opportunity to lead projects, innovate solutions, and make a significant impact on our clients' data strategies.

Key Responsibilities:
- Interface directly with Adobe enterprise customers to gather requirements, design tailored data solutions, and provide architectural recommendations.
- Collaborate with onshore and offshore engineering teams to ensure efficient, high-quality solution delivery.
- Produce clear, comprehensive technical specification documents for the implementation of data solutions.
- Design and structure data models that enable advanced customer-level analytics and reporting.
- Develop and implement robust customer ID mapping processes to create a unified customer view across diverse data sources.
- Automate data movement, cleansing, and transformation using scripting languages and tools.
- Lead client calls and manage project deliverables, ensuring timelines and expectations are met.
- Identify and propose solutions to complex customer data challenges and continuously optimize data workflows.

Requirements:
- 10+ years of experience in data transformation and ETL across large, complex datasets.
- 5+ years of hands-on experience in data modeling (relational, dimensional, and big data paradigms).
- Expertise with SQL/NoSQL databases and a strong understanding of data warehouse concepts.
- Proficiency with industry-leading ETL tools (e.g., Informatica).
- Proven experience with reporting and business intelligence tools such as Tableau and Power BI.
- In-depth knowledge of customer-centric datasets (e.g., CRM, call center, marketing automation, web analytics).
- Exceptional communication, time management, and multitasking skills, with the ability to articulate complex technical concepts to diverse audiences.
- Bachelor's or Master's degree in Computer Science, Data Science, or a closely related quantitative field.
Posted 1 month ago
3.0 - 8.0 years
0 - 3 Lacs
Hyderabad, Pune, Bengaluru
Work from Office
Need Immediate Joiners

Responsibilities:
- Define, design, develop, and test software components/applications using AWS (Databricks on AWS, AWS Glue, Amazon S3, AWS Lambda, Amazon Redshift, AWS Secrets Manager).
- Strong SQL skills with hands-on experience.
- Experience handling structured and unstructured datasets.
- Experience in data modeling and advanced SQL techniques.
- Experience implementing AWS Glue, Airflow, or another data orchestration tool using current technologies and techniques (a minimal Glue sketch follows this posting).
- Good exposure to application development; the candidate should work independently with minimal supervision.

Must Have:
- Hands-on experience with a distributed computing framework such as Databricks or the Spark ecosystem (Spark Core, PySpark, Spark Streaming, Spark SQL).
- Willingness to work with product teams to best optimize product features/functions.
- Experience with batch workloads and real-time streaming at high data volumes and frequencies.
- Performance optimization of Spark workloads.
- Environment setup, user management, authentication, and cluster management on Databricks.
- Professional curiosity and the ability to get up to speed on new technologies and tasks.
- Good understanding of SQL and a good grasp of relational and analytical database management theory and practice.

Key Skills:
- Python, SQL, and PySpark
- Big data ecosystem (Hadoop, Hive, Sqoop, HDFS, HBase)
- Spark ecosystem (Spark Core, Spark Streaming, Spark SQL) / Databricks
- AWS (AWS Glue, Databricks on AWS, Lambda, Amazon Redshift, Amazon S3, AWS Secrets Manager)
- Data modeling, ETL methodology
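As a concrete sketch of the Glue work this posting describes, the script below reads raw JSON from S3 and writes partitioned Parquet back out. The bucket paths and the event_id/event_ts/event_date columns are hypothetical; this is the shape of a Glue job, not a production pipeline.

```python
# Minimal AWS Glue job sketch (hypothetical buckets and columns): read raw
# JSON from S3, deduplicate, and write partitioned Parquet back to S3.
import sys
from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# DynamicFrames tolerate ragged, semi-structured records better than DataFrames.
raw = glue_context.create_dynamic_frame.from_options(
    connection_type="s3",
    connection_options={"paths": ["s3://example-raw-bucket/events/"]},
    format="json",
)

# Convert to a Spark DataFrame for SQL-style transforms, then write back.
df = raw.toDF().dropDuplicates(["event_id"]).filter("event_ts IS NOT NULL")
df.write.mode("append").partitionBy("event_date").parquet(
    "s3://example-curated-bucket/events/"
)
job.commit()
```

The same skeleton extends naturally to Redshift targets by swapping the final write for a Glue connection, which is omitted here to keep the sketch self-contained.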
Posted 1 month ago
10.0 - 14.0 years
0 Lacs
Karnataka
On-site
Position Summary:
Drives the execution of multiple business plans and projects by identifying customer and operational needs; developing and communicating business plans and priorities; removing barriers and obstacles that impact performance; providing resources; identifying performance standards; measuring progress and adjusting performance accordingly; developing contingency plans; and demonstrating adaptability and supporting continuous learning. Provides supervision and development opportunities for associates by selecting and training; mentoring; assigning duties; building a team-based work environment; establishing performance expectations and conducting regular performance evaluations; providing recognition and rewards; coaching for success and improvement; and ensuring diversity awareness. Promotes and supports company policies, procedures, mission, values, and standards of ethics and integrity by training and providing direction to others in their use and application; ensuring compliance with them; and utilizing and supporting the Open Door Policy. Ensures business needs are being met by evaluating the ongoing effectiveness of current plans, programs, and initiatives; consulting with business partners, managers, co-workers, or other key stakeholders; soliciting, evaluating, and applying suggestions for improving efficiency and cost-effectiveness; and participating in and supporting community outreach events.

What you'll do:

About the Team
Ever wondered what a convergence of online and offline advertising systems looks like? Ever wondered how we can bridge the gap between sponsored search, display, and video ad formats? Ever thought about how we can write our own ad servers that serve billions of requests in near real time? Our Advertising Technology team is building an end-to-end advertising platform that is key to Walmart's overall growth strategy. We use cutting-edge machine learning, data mining, and optimization algorithms to ingest, model, and analyze Walmart's proprietary online and in-store data, encompassing 95% of American households. Importantly, we build smart data systems that deliver relevant retail ads and experiences that connect our customers with the brands and products they love.

Your Opportunity
We are looking for a versatile principal data scientist with strong expertise in machine learning and deep learning, good software engineering skills, and significant exposure to building ML solutions (including Gen AI solutions) from scratch and leading data science engagements. The opportunities that come with this role:
- As a seasoned SME in ML engineering, you will take a lead in scaling and deploying the most challenging of our data science solutions (including, but definitely not limited to, Gen AI solutions) across a broad spectrum of the advertising domain.
- Influence the best practices that we should follow as we scale and deploy our solutions across a diverse set of products.
- Train and mentor our pool of data scientists in data science and MLE skills.
- Contribute to the Tech org via patents, publications, and open-source contributions.

What You Will Do
- Design large-scale AI/ML products/systems impacting millions of customers.
- Develop highly scalable, timely, performant, instrumented, and accurate data pipelines.
- Drive and ensure that MLOps practices are followed in solutions.
- Enable data governance practices and processes by being a passionate adopter and ambassador.
- Drive data pipeline efficiency, data quality, efficient feature engineering, and maintenance of different DBs (vector DBs, graph DBs, feature stores, caching mechanisms).
- Lead and inspire a team of scientists and engineers solving AI/ML problems through R&D while pushing the state of the art.
- Lead the team to develop production-level code for AI/ML solutions, using best practices to handle high-scale, low-latency requirements.
- Deploy batch and real-time ML solutions, model results consumption, and integration pipelines.
- Work with a customer-centric mindset to deliver high-quality, business-driven analytic solutions.
- Drive proactive optimisation of code and deployments, improving efficiency, cost, and resource utilisation.
- Design model architecture, choose the optimal tech stack and models, integrate with the larger engineering ecosystem, and drive best practices for model integration, working closely with Software Engineering leaders.
- Consult with business stakeholders regarding algorithm-based recommendations, and be a thought leader in deploying these and driving business actions.
- Partner closely with the Senior Managers and Director of Data Science, Engineering, and product counterparts to drive data science adoption in the domain.
- Collaborate with multiple stakeholders to drive innovation at scale.
- Build a strong external presence by publishing your team's work in top-tier AI/ML conferences and developing partnerships with academic institutions.
- Adhere to Walmart's policies, procedures, mission, values, and standards of ethics and integrity.
- Adopt Walmart's quality standards; develop and recommend process standards and best practices across the retail industry.

What You Will Bring
- Bachelor's with more than 13 years, Master's with more than 12 years, or Ph.D. with more than 10 years of relevant experience. Educational qualifications should be in engineering / data sciences.
- Strong experience applying state-of-the-art supervised and unsupervised machine learning algorithms to real-world problems.
- Experienced in architecting solutions with continuous integration and continuous delivery in mind.
- Strong experience in real-time ML solution deployment, and experience with deployment patterns for distributed systems.
- Strong Python coding and package development skills.
- Experience with big data and analytics in general, leveraging technologies like Hadoop, Spark, and MapReduce.
- Ability to work in a big data ecosystem; expert in SQL/Hive/Spark.

About Walmart Global Tech:
Imagine working in an environment where one line of code can make life easier for hundreds of millions of people and put a smile on their face. That's what we do at Walmart Global Tech. We're a team of 15,000+ software engineers, data scientists, and service professionals within Walmart, the world's largest retailer, delivering innovations that improve how our customers shop and empower our 2.3 million associates. To others, innovation looks like an app, service, or some code, but Walmart has always been about people. People are why we innovate, and people power our innovations. Being human led is our true disruption.

Flexible, hybrid work:
We use a hybrid way of working that is primarily in office, coupled with virtual when not onsite. Our campuses serve as a hub to enhance collaboration, bring us together for purpose, and deliver on business needs. This approach helps us make quicker decisions, remove location barriers across our global team, and be more flexible in our personal lives.

Benefits:
Beyond our great compensation package, you can receive incentive awards for your performance. Other great perks include a host of best-in-class benefits: maternity and parental leave, PTO, health benefits, and much more.

Equal Opportunity Employer:
Walmart, Inc. is an Equal Opportunity Employer By Choice. We believe we are best equipped to help our associates, customers, and the communities we serve live better when we really know them. That means understanding, respecting, and valuing diversity (unique styles, experiences, identities, ideas, and opinions) while being inclusive of all people.

Minimum Qualifications:
Outlined below are the required minimum qualifications for this position. If none are listed, there are no minimum qualifications.
- Option 1: Bachelor's degree in Statistics, Economics, Analytics, Mathematics, Computer Science, Information Technology, or a related field and 5 years' experience in an analytics-related field.
- Option 2: Master's degree in Statistics, Economics, Analytics, Mathematics, Computer Science, Information Technology, or a related field and 3 years' experience in an analytics-related field.
- Option 3: 7 years' experience in an analytics or related field.

Preferred Qualifications:
Outlined below are the optional preferred qualifications for this position. If none are listed, there are no preferred qualifications.

Primary Location:
G, 1, 3, 4, 5 Floor, Building 11, SEZ, Cessna Business Park, Kadubeesanahalli Village, Varthur Hobli, India R-1925040
Posted 1 month ago
8.0 - 12.0 years
0 Lacs
Karnataka
On-site
Department: IT
Project Location(s): Bangalore, Karnataka
Job Type: Full Time
Education: Bachelor in Engineering / Technology

Senior DevOps Engineer
Experience: 8+ years

Responsibilities:
- Manage the successful outcomes of all deliverables within the resource and time constraints defined by the DevOps Manager and Delivery Lead.
- Participate in project stand-ups by effectively communicating status and impediments to progress.
- Attend iteration planning sessions and participate in sizing the testing effort required to complete stories, ensuring external dependencies are considered.
- Work closely and communicate effectively with the development team and product stakeholders to ensure project outcomes adhere to agreed quality standards.

Required skills and experience:
- Extensive cloud experience with AWS (VPC, EC2, Route53, IAM, STS, RDS, API Gateway) with a strong emphasis on security and well-architected solutions.
- Sound grasp of security best practices in cloud architectures.
- Familiarity with configuration management (experience with Ansible, Puppet, or Chef is a bonus).
- Competence in configuring and administering a Linux system, including knowledge of and experience with Bash.
- Solid coding/scripting skills in at least one major language, with a strong preference for Python, Golang, or JavaScript.
- Strong experience with infrastructure-as-code tools (AWS CloudFormation preferred); a minimal sketch follows this posting.
- Solid experience building and maintaining Docker images.
- Understanding of and hands-on experience with CI/CD tools such as AWS CodePipeline (preferred), GitHub Actions, TeamCity, Bamboo, GoCD, or Concourse CI.
- Solid understanding of TCP/IP, networking, and routing protocols.
- Experience implementing CDN/DDoS/WAF technologies.
- Strong understanding of and experience with identity and access management solutions.
- Experience in web and API development and architectures (event-driven, microservices, SOA).

Bonus points for:
- AWS Associate and/or Professional level certifications.
- Experience working in Agile software and product development teams.
- Experience with AWS's machine learning/AI suite, including SageMaker, Rekognition, Transcribe, etc.
- Strong skill sets in web and API development and architectures (event-driven, SOA, microservices).
- Experience with AWS CloudFront & WAF, or Akamai / Cloudflare.
- Experience with Docker clustering and management.
- Experience with AWS's big data & analytics services, such as Glue, Redshift, etc.
- Cloud migration and delivery exposure.
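A minimal sketch of the infrastructure-as-code step mentioned above: deploying a CloudFormation stack from Python with boto3, the kind of step a CI/CD pipeline stage would run. The stack name, bucket name, and template here are hypothetical placeholders.

```python
# Deploy a (hypothetical) CloudFormation stack from Python with boto3.
import boto3

TEMPLATE = """
AWSTemplateFormatVersion: '2010-09-09'
Resources:
  AppBucket:
    Type: AWS::S3::Bucket
    Properties:
      BucketName: example-app-artifacts-bucket  # must be globally unique
"""

cf = boto3.client("cloudformation", region_name="ap-south-1")
cf.create_stack(
    StackName="example-app-stack",
    TemplateBody=TEMPLATE,
    Tags=[{"Key": "team", "Value": "devops"}],
)

# Block until the stack finishes creating; the waiter raises on rollback,
# which makes a CI stage fail loudly instead of silently shipping nothing.
cf.get_waiter("stack_create_complete").wait(StackName="example-app-stack")
print("stack created")
```

In a real pipeline the template would live in version control and the stack update path (update_stack or change sets) matters as much as creation; this sketch covers only the create case.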
Posted 1 month ago
4.0 - 8.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
You should be strong in data structures and algorithms, and have experience working on a large-scale consumer product. It is essential that you have worked on distributed and microservice architectures, and possess a solid understanding of scale, performance, and memory optimization fundamentals.

Requirements and Skills:
- You should hold a BS/MS/BTech/MTech degree in Computer Science, Engineering, or a related field.
- You must have a minimum of 4-8 years of experience in Java/J2EE technologies.
- Experience in designing open APIs and implementing OAuth2 is required.
- Proficiency in Kafka, JMS, RabbitMQ, and AWS Elastic Queue is a must.
- You should have hands-on experience with Spring, Hibernate, Tomcat, Jetty, and Undertow in a production environment.
- Familiarity with JUnit and Mockito for unit test cases, and with MySQL or another RDBMS, is necessary.
- Proven experience in software development and Java development is essential.
- Hands-on experience designing and developing applications using Java EE platforms is required.
- Knowledge of object-oriented analysis and design using common design patterns is expected.

Preferred:
- Experience handling high-traffic applications is a plus.
- Familiarity with MongoDB, Redis, CouchDB, DynamoDB, and Riak is preferred.
- Experience in asynchronous programming (actor-model concurrency, RxJava, the Executor framework) is a bonus.
- Knowledge of Lucene, ElasticSearch, Solr, Jenkins, and Docker is advantageous.
- Experience in other languages/technologies such as Scala, NodeJs, or PHP is a plus.
- Experience with AWS, Google, or Azure cloud for managing, monitoring, and hosting servers is a bonus.
- Experience handling big data, and knowledge of WebSocket and back-end servers for WebSocket, is preferred.
Posted 1 month ago
5.0 - 9.0 years
0 Lacs
Bhopal, Madhya Pradesh
On-site
You are invited to apply for the position of Assistant / Associate Professor in the Computer Science & Engineering Department at Sagar Institute of Science & Technology. The ideal candidate should hold a B.Tech, M.Tech, and Ph.D. in disciplines such as CSE, IT, AIML, Image Processing, Machine Learning, Neural Networks, Deep Learning, or Data Science from a reputable institute and a recognized university.

To excel in this role, you should possess a strong command of programming languages including C, C++, Java, and SQL. Additionally, you should have expertise in subjects such as Basic Computer Engineering, Object-Oriented Programming, Data Structures, DBMS, Artificial Intelligence, Machine Learning, Neural Networks, Deep Learning, Software Engineering, Security & Privacy, Python, Big Data, Cloud Computing, and Natural Language Processing.

This position is based at Sagar Institute of Science & Technology, SISTec Gandhi Nagar Campus, located opposite the International Airport in Bhopal, Madhya Pradesh - 462036. If you meet the above qualifications and are passionate about teaching in the field of Computer Science & Engineering, please send your updated CV to hrd@sistec.ac.in. For more information about career opportunities at SISTec, please visit our careers page at https://www.sistec.ac.in/careers.
Posted 1 month ago
1.0 - 5.0 years
0 Lacs
Noida, Uttar Pradesh
On-site
You are an experienced Python developer who will be responsible for writing and testing scalable code, developing back-end components, and integrating user-facing elements in collaboration with front-end developers. You should have worked with Python frameworks like Django, and possess good knowledge of databases and tools such as MySQL, MongoDB, Hadoop, big data stacks, and Docker. Additionally, you should have a good understanding of API development in XML and JSON, experience with at least one cloud provider such as Google Cloud or AWS, and experience with Google APIs integration, RESTful APIs, and Git (a minimal API sketch follows this posting). Familiarity with SOA (microservices, message buses, etc.) is important, as is the ability to come up with innovative ideas and transform requirements into scalable, smart code.

You will be responsible for unit design and coding; implementing and following standards and guidelines with coding best practices in mind; ensuring bug-free and timely delivery of allocated development tasks by working with developers and architects; conducting proper unit testing; and taking ownership of a product end to end.

Requirements for this role include a degree in Computer Science or a related field, critical thinking and problem-solving skills, the ability to contribute individually, good time-management skills, great interpersonal skills, and 1-3 years of sound experience in the skill sets mentioned above. This is a full-time job with a day shift schedule and requires in-person work at the designated location.
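A minimal sketch of the back-end API work described above, assuming a hypothetical Django app with a Profile model; it exposes a single read-only endpoint that returns JSON.

```python
# views.py -- minimal JSON endpoint sketch (hypothetical Profile model).
from django.http import JsonResponse
from django.views.decorators.http import require_GET

from .models import Profile  # assumed app model with id/name/email fields


@require_GET
def list_profiles(request):
    # .values() yields plain dicts, which JsonResponse can serialize directly.
    profiles = Profile.objects.values("id", "name", "email")[:100]
    return JsonResponse({"results": list(profiles)})


# urls.py -- wiring the view into the app's URL config:
# from django.urls import path
# from . import views
# urlpatterns = [path("api/profiles/", views.list_profiles)]
```

A production endpoint would add pagination, authentication, and error handling; the point here is only the view-to-JSON shape of a Django back end.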
Posted 1 month ago
6.0 - 14.0 years
0 Lacs
Coimbatore, Tamil Nadu
On-site
You should have solid experience of 6 to 14 years in data modeling / ER modeling. As a candidate for this position, you should possess knowledge of relational databases and data architecture computer systems, including SQL. It is preferred that you have familiarity with, or a bachelor's degree in, computer science, data science, information technology, or data modeling.

In this role, you will be expected to have a good understanding of ER modeling, big data, enterprise data, and physical data models. Additionally, experience with data modeling software such as SAP PowerDesigner, Microsoft Visio, or Erwin Data Modeler would be beneficial.

The job location is Coimbatore, and a walk-in interview is scheduled for 12th April. If you meet the requirements and are passionate about data modeling and ER modeling, we encourage you to apply for this exciting opportunity.
Posted 1 month ago
6.0 - 11.0 years
15 - 30 Lacs
Pune
Hybrid
Software Engineer - Specialist

What you'll do:
- Demonstrate a deep understanding of cloud-native, distributed microservice-based architectures.
- Deliver solutions for complex business problems through standard Software Development Life Cycle (SDLC) practices.
- Build strong relationships with both internal and external stakeholders, including product, business, and sales partners.
- Demonstrate excellent communication skills, with the ability to simplify complex problems and also dive deeper when needed.
- Lead strong technical teams that deliver complex software solutions that scale.
- Work across teams to integrate our systems with existing internal systems.
- Participate in a tight-knit, globally distributed engineering team.
- Provide deep troubleshooting skills, with the ability to lead and solve production and customer issues under pressure.
- Leverage strong experience in full-stack software development and public cloud platforms like GCP and AWS.
- Mentor, coach, and develop junior and senior software, quality, and reliability engineers.
- Ensure compliance with secure software development guidelines and best practices.
- Define, maintain, and report SLAs, SLOs, and SLIs meeting EFX engineering standards, in partnership with the product, engineering, and architecture teams.
- Collaborate with architects, SRE leads, and other technical leadership on strategic technical direction, guidelines, and best practices.
- Drive up-to-date technical documentation, including support and end-user documentation and runbooks.
- Be responsible for implementation architecture decision-making associated with product features/stories and refactoring work decisions.
- Create and deliver technical presentations to internal and external technical and non-technical stakeholders, communicating with clarity and precision and presenting complex information in a concise, audience-appropriate format.

What experience you need:
- Bachelor's degree or equivalent experience.
- 5+ years of software engineering experience.
- 5+ years of experience writing, debugging, and troubleshooting code in mainstream Java and Spring Boot.
- 5+ years of experience with cloud technology: GCP, AWS, or Azure.
- 5+ years of experience designing and developing cloud-native solutions.
- 5+ years of experience designing and developing microservices using Java, Spring Boot, GCP SDKs, and GKE/Kubernetes.
- 5+ years of experience designing and developing big data processing solutions using Dataflow/Apache Beam, Bigtable, BigQuery, Pub/Sub, GCS, Composer/Airflow, and others (a minimal Beam sketch follows this posting).
- 5+ years of experience deploying and releasing software using Jenkins CI/CD pipelines, with an understanding of infrastructure-as-code concepts, Helm charts, and Terraform constructs.

What could set you apart:
- Self-starter who identifies and responds to priority shifts with minimal supervision.
- Strong communication and presentation skills.
- Strong leadership qualities.
- Demonstrated problem-solving skills and the ability to resolve conflicts.
- Experience creating and maintaining product and software roadmaps.
- Working in a highly regulated environment.
- Experience on GCP with big data and distributed systems: Dataflow, Apache Beam, Pub/Sub, Bigtable, BigQuery, GCS.
- Experience with back-end technologies such as Java/J2EE, Spring Boot, Golang, gRPC, SOA, and microservices.
- Source code control management systems (e.g., SVN/Git, GitHub, GitLab), build tools like Maven and Gradle, and CI/CD tools like Jenkins or GitLab.
- Agile environments (e.g., Scrum, XP).
- Relational databases (e.g., SQL Server, MySQL).
- Atlassian tooling (e.g., JIRA, Confluence) and GitHub.
- Developing with a modern JDK (v1.7+).
- Automated testing: JUnit, Selenium, LoadRunner, SoapUI.
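A minimal Apache Beam sketch of the kind of pipeline named in the qualifications. The role's stack is Java-centric; the Python SDK is used here only for brevity, and the bucket paths and user_id field are hypothetical.

```python
# Count events per user from newline-delimited JSON; the same pipeline runs
# on Dataflow by switching the runner in the pipeline options.
import json
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

opts = PipelineOptions()  # add --runner=DataflowRunner --project=... for GCP

with beam.Pipeline(options=opts) as p:
    (
        p
        | "Read" >> beam.io.ReadFromText("gs://example-bucket/events/*.json")
        | "Parse" >> beam.Map(json.loads)
        | "KeyByUser" >> beam.Map(lambda e: (e["user_id"], 1))
        | "Count" >> beam.CombinePerKey(sum)
        | "Format" >> beam.MapTuple(lambda user, n: f"{user},{n}")
        | "Write" >> beam.io.WriteToText("gs://example-bucket/out/user_counts")
    )
```

The value of Beam in this context is that the same graph executes locally for tests and on Dataflow at scale, which is what makes it a common choice alongside Pub/Sub and BigQuery.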
Posted 1 month ago
5.0 - 10.0 years
20 - 25 Lacs
Bengaluru
Work from Office
Minimum qualifications:
- Bachelor's degree or equivalent practical experience.
- 5 years of experience in product management or a related technical role.
- 2 years of experience taking technical products from conception to launch.
- Experience with the domain area of customer service or business applications, or with building support systems.

Preferred qualifications:
- Master's degree in a technology- or business-related field.
- Experience in one or more of the following: generative AI co-pilots, big data, security and privacy, development and operations, or machine learning.
- Ability to influence multiple stakeholders without direct authority.

About the job:
At Google, we put our users first. The world is always changing, so we need Product Managers who are continuously adapting and excited to work on products that affect millions of people every day. In this role, you will work cross-functionally to guide products from conception to launch by connecting the technical and business worlds. You can break down complex problems into steps that drive product development. One of the many reasons Google consistently brings innovative, world-changing products to market is the collaborative work we do in Product Management. Our team works closely with creative engineers, designers, marketers, etc. to help design and develop technologies that improve access to the world's information. We're responsible for guiding products throughout the execution cycle, focusing specifically on analyzing, positioning, packaging, promoting, and tailoring our solutions to our users.

Google Cloud provides the best possible combination of support quality and efficiency. We do this by driving customer retention and consumption through consistently positive support experiences. We build the Google Cloud Support Platform, a standardized system that optimizes and automates Cloud get-help interactions. The goal of the platform is to bring efficiency, scale, and consistently positive customer experiences to Cloud. Google Cloud accelerates every organization's ability to digitally transform its business and industry. We deliver enterprise-grade solutions that leverage Google's cutting-edge technology, and tools that help developers build more sustainably. Customers in more than 200 countries and territories turn to Google Cloud as their trusted partner to enable growth and solve their most critical business problems.

Responsibilities:
- Understand the cloud ecosystem: markets, competition, and user requirements.
- Ideate and launch innovative products and features, test their performance, and iterate quickly.
- Develop and secure buy-in for product goals that identify, define, and support the overall product narrative and direction, achieving an outcome that is greater than the sum of its parts.
- Work collaboratively with engineering, marketing, legal, UX, and other teams on technologies.
- Develop solutions to problems by collaborating as needed across regions, product areas, and functions.
Posted 1 month ago
8.0 - 13.0 years
50 - 55 Lacs
Bengaluru
Work from Office
Minimum qualifications:
- Bachelor's degree in Computer Science, a related technical field, or equivalent practical experience.
- 8 years of experience with software development in one or more programming languages (e.g., Python, C, C++, Java, JavaScript).
- 5 years of leadership experience and managing people.

Preferred qualifications:
- Experience in data analytics, warehousing, ETL development, data science, or other big data applications.

About the job:
A line of code can be many things: an amazing feature, a beautiful UI, a transformative algorithm. The faster this line of code reaches millions of users, the sooner it impacts their lives. As a Software Engineer, Tools and Infrastructure, you will be at the heart of Google's engineering process, building software that empowers engineering teams to develop and deliver high-quality products quickly. We are focused on solving the hardest, most interesting challenges of developing software at scale without sacrificing stability, quality, velocity, or code health. We ensure Google's success by partnering with engineering teams and developing scalable tools and infrastructure that help engineers develop, test, debug, and release software quickly. We impact thousands of Googlers and billions of users by increasing the pace of product development and ensuring our products are thoroughly tested. We are advocates for code health, testability, maintainability, and best practices for development and testing. Having access to all of Google's platforms and vast compute resources provides a unique opportunity to grow as an engineer. We typically work in small, nimble teams that collaborate on common problems across products and focus areas. As a result, the exposure to this broad set of problems provides technical challenges as well as accelerated career growth.

Google Photos is a photo sharing and storage service developed by Google. Photos is one of the most sought-after products at Google and is looking for client-side (web and mobile), server-side (search, storage, serving), and machine intelligence (learning, computer vision) Software Engineers. We are dedicated to making Google experiences centered around the user.

Responsibilities:
- Manage a team of Software Engineers solving some of Photos and Google One's most critical analytics and experimentation challenges.
- Foster collaborative partnerships with stakeholders, including Data Scientists, Analysts, and Strategy, with core partnerships with CoreData, to carve out and execute a long-term strategy.
- Build and enhance self-serve tools to help other teams create and manage data pipelines that generate metrics.
- Create critical dashboards for visualization, and help others create dashboards.
- Work on analytics infrastructure, which is also core infrastructure in the ML life cycle.
Posted 1 month ago
5.0 - 8.0 years
10 - 15 Lacs
Bengaluru
Work from Office
You will be a key member of our Data Engineering team, focused on designing, developing, and maintaining robust data solutions in on-prem environments. You will work closely with internal teams and client stakeholders to build and optimize data pipelines and analytical tools using Python, Scala, SQL, Spark, and Hadoop ecosystem technologies. This role requires deep hands-on experience with big data technologies in traditional data center environments (non-cloud).

What you'll be doing:
- Design, build, and maintain on-prem data pipelines to ingest, process, and transform large volumes of data from multiple sources into data warehouses and data lakes (a minimal sketch follows this posting).
- Develop and optimize Scala/Spark and SQL jobs for high-performance batch and real-time data processing.
- Ensure the scalability, reliability, and performance of data infrastructure in an on-prem setup.
- Collaborate with data scientists, analysts, and business teams to translate their data requirements into technical solutions.
- Troubleshoot and resolve issues in data pipelines and data processing workflows.
- Monitor, tune, and improve Hadoop clusters and data jobs for cost and resource efficiency.
- Stay current with on-prem big data technology trends and suggest enhancements to improve data engineering capabilities.

Requirements:
- Bachelor's degree in software engineering or a related field.
- 5+ years of experience in data engineering or a related domain.
- Strong programming skills in Python and Scala.
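A minimal sketch of an on-prem batch pipeline, under the assumptions that a Hive metastore is available on the cluster and that the raw.orders and curated.daily_orders tables (hypothetical names and columns) already exist with matching schemas.

```python
# On-prem batch job sketch: read from Hive, aggregate, write back, relying
# on the cluster's existing metastore rather than any cloud service.
from pyspark.sql import SparkSession
import pyspark.sql.functions as F

spark = (
    SparkSession.builder.appName("daily-orders-rollup")
    .enableHiveSupport()  # use the on-prem Hive metastore
    .getOrCreate()
)

orders = spark.table("raw.orders")  # assumed source table

daily = (
    orders.where(F.col("order_date") >= "2024-01-01")
          .groupBy("order_date", "region")
          .agg(F.sum("amount").alias("revenue"),
               F.countDistinct("customer_id").alias("customers"))
)

# Overwrite only the partitions touched by this run, not the whole table.
spark.conf.set("spark.sql.sources.partitionOverwriteMode", "dynamic")
daily.write.mode("overwrite").insertInto("curated.daily_orders")
```

Note that insertInto matches columns by position, so the select order must agree with the target table definition; that is a common source of silent bugs in this pattern.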
Posted 1 month ago
1.0 - 2.0 years
4 - 6 Lacs
Bengaluru
Work from Office
SLSQ326R415

Databricks is at the forefront of the Unified Data Analytics field, where innovation is key to providing our clients with a competitive edge in today's fast-paced business landscape. We are looking for a Business Development Representative to help drive revenue growth within the India market. If you're a results-oriented sales professional with a track record in similar roles, aiming to contribute to the expansion of a transformative enterprise software company and propel your career, this role is for you. Reporting to the Manager of the India Sales Development team, you'll play a pivotal role in this journey.

The impact you will have:
- Cultivate expertise in value-based selling, big data, and AI.
- Evaluate and prioritize inbound leads from Marketing initiatives.
- Craft outbound strategies encompassing personalized emails, cold calls, and social selling to qualify opportunities.
- Devise compelling outreach campaigns targeting diverse buyer levels, including senior executives, to unlock opportunities in critical target accounts.
- Identify and uncover client requirements, progressing discussions into sales prospects by demonstrating how Databricks can address their data-related challenges.

What we look for:
- Preferably a minimum of 1-2 years of prior experience in inbound and outbound sales and inquiries.
- Proficiency in comprehending technical concepts, coupled with genuine enthusiasm for technology.
- Determination and courage to excel and contribute to the growth of the next top-tier enterprise software company.
- A demonstrated history of consistent, quantifiable achievements in previous roles.
- Curiosity and eagerness to continually learn and stay abreast of developments in the big data/AI sector.
- A strong sense of ownership and accountability.

About Databricks:
Databricks is the data and AI company. More than 10,000 organizations worldwide, including Comcast, Condé Nast, Grammarly, and over 50% of the Fortune 500, rely on the Databricks Data Intelligence Platform to unify and democratize data, analytics, and AI. Databricks is headquartered in San Francisco, with offices around the globe, and was founded by the original creators of the lakehouse architecture, Apache Spark, Delta Lake, and MLflow. To learn more, follow Databricks on Twitter, LinkedIn, and Facebook.

Benefits:
At Databricks, we strive to provide comprehensive benefits and perks that meet the needs of all of our employees. For specific details on the benefits offered in your region, please visit https://www.mybenefitsnow.com/databricks.

Our Commitment to Diversity and Inclusion.

Compliance:
If access to export-controlled technology or source code is required for performance of job duties, it is within the Employer's discretion whether to apply for a U.S. government license for such positions, and the Employer may decline to proceed with an applicant on this basis alone.
Posted 1 month ago
4.0 - 6.0 years
4 - 8 Lacs
Chennai
Work from Office
Job_Description":" AI & Data Warehouse (DWH) Pando is a global leader in supply chain technology, building the worlds quickest time-to-value Fulfillment Cloud platform. Pando\u2019s Fulfillment Cloud provides manufacturers, retailers, and 3PLs with a single pane of glass to streamline end-to-end purchase order fulfillment and customer order fulfillment to improve service levels, reduce carbon footprint, and bring down costs. As a partner of choice for Fortune 500 enterprises globally, with a presence across APAC, the Middle East, and the US, Pando is recognized as a Technology Pioneer by the World Economic Forum (WEF), and as one of the fastest growing technology companies by Deloitte. Role As a Senior AI and Data Warehouse Engineer at Pando, you will be responsible for building and scaling the data and AI services team. You will drive the design and implementation of highly scalable, modular, and reusable data pipelines, leveraging big data technologies and low-code implementations. This is a senior leadership position where you will work closely with cross-functional teams to deliver solutions that power advanced analytics, dashboards, and AI-based insights. Key Responsibilities - Lead the development of scalable, high-performance data pipelines using PySpark or Big Data ETL pipeline technologies. - Drive data modeling efforts for analytics, dashboards, and knowledge graphs. - Oversee the implementation of parquet-based data lakes. - Work on OLAP databases, ensuring optimal data structure for reporting and querying. - Architect and optimize large-scale enterprise big data implementations with a focus on modular and reusable low-code libraries. - Collaborate with stakeholders to design and deliver AI and DWH solutions that align with business needs. - Mentor and lead a team of engineers, building out the data and AI services organization. Requirements - 4 to 6 years of experience in big data and AI technologies, with expertise in PySpark or similar Big Data ETL pipeline technologies. - Strong proficiency in SQL and OLAP database technologies. - Firsthand experience with data modeling for analytics, dashboards, and knowledge graphs. - Proven experience with parquet-based data lake implementations. - Expertise in building highly scalable, high-volume data pipelines. - Experience with modular, reusable, low-code-based implementations. - Involvement in large-scale enterprise big data implementations. - Initiative-taker with strong motivation and the ability to lead a growing team. Preferred - Experience leading a team or building out a new department. - Experience with cloud-based data platforms and AI services. - Familiarity with supply chain technology or fulfilment platforms is a plus.
Posted 1 month ago
2.0 - 9.0 years
30 - 35 Lacs
Bengaluru
Work from Office
This role is for a Senior Data Engineer - AI/ML, with a strong development background, whose primary objective will be to contribute to developing and operationalizing platform services and large-scale machine learning pipelines at global scale. We are seeking a talented professional with a solid mix of experience in big data and AI/ML systems. The ideal candidate for this role will have the ability to learn quickly and deliver solutions within strict deadlines in a fast-paced environment. They should have a passion for optimizing existing solutions and making incremental improvements. Strong interpersonal and effective communication skills, both written and verbal, are essential. Additionally, they should have knowledge of Agile methodologies and common Scrum practices and tools.

This is a hybrid position. Expectation of days in office will be confirmed by your Hiring Manager.

Basic Qualifications:
- 3+ years of relevant work experience and a Bachelor's degree, OR 5+ years of relevant work experience.
- Strong development experience in Python and one or more of the following programming languages: Go, Rust, Java/Scala,
Posted 1 month ago
0.0 - 4.0 years
18 - 19 Lacs
Gurugram
Work from Office
Here, your voice and ideas matter, your work makes an impact, and together, you will help us define the future of American Express.

How will you make an impact in this role?
The Analytics, Investment Optimization and Marketing Enablement (AIM) team, part of the Global Commercial Services Marketing group within American Express, is the analytical engine that enables the Global Commercial Card & Non-Card business. This role, based out of Gurugram, will be part of the AIM team and will be responsible for Global Commercial Card product analytics. The team focuses on portfolio analytics, campaign measurement, and commercial product analytics, and partners closely with GCS product teams. The incumbent will be responsible for analyzing customer behavior and product enhancements, running portfolio analytics, measuring the impact of various marketing treatments, reporting results to leadership, and providing strategic analytical support. The role requires a strong background in data analytics and solid commercial card product knowledge. The position is part of a highly collaborative environment, interacting with and influencing partners across the Global Commercial Services business at American Express.

Key Responsibilities include:
- Support and enable business partners with campaign measurements, ROI analysis, and actionable data-driven insights (a toy example follows below).
- Run portfolio analytics to identify trends, composition, leading indicators, and outlook.
- Support and enable GCS product partners with actionable, insightful analytical solutions to help the leadership team evaluate and drive business performance.
- Deliver accurate, timely, and efficient monthly results reporting for marketing leadership.
- Demonstrate excellent communication skills, with the ability to engage, influence, and inspire partners and stakeholders to drive collaboration and alignment.
- Demonstrate exceptional execution skills: resolve issues, identify opportunities, define success metrics, and make things happen.
- Drive automation and ongoing refinement of analytical frameworks.
- Challenge the status quo; apply breakthrough thinking to generate insights, alternatives, and opportunities for business success.

Minimum Qualifications
- Degree in a quantitative field (e.g., Finance, Engineering, Mathematics, Computer Science, or Economics).
- Strong technical and analytical skills with the ability to apply both quantitative methods and business skills to create insights and drive results.
- Ability to work independently and across a matrix organization, partnering with business partners, functional owners, capabilities, technology teams, and external vendors.
- Ability to prioritize and manage several concurrent projects through collaboration across teams/geographies.

Preferred Qualifications
- Strong programming skills are preferred. SQL is a must, and experience with big data programming is a plus.

Rotational shift: 1:00 PM - 9:30 PM / 11:00 AM - 7:30 PM

We back you with benefits that support your holistic well-being so you can be and deliver your best. This means caring for you and your loved ones' physical, financial, and mental health, as well as providing the flexibility you need to thrive personally and professionally.
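As a toy illustration of the campaign measurement work described above, here is a minimal Python sketch comparing a treatment group against a holdout; the input file, column names, and cost figure are hypothetical, and a real measurement would add significance testing and bias controls.

```python
import pandas as pd

# Minimal sketch of a test-vs-control campaign read: incremental spend lift
# and a simple ROI ratio. File, columns, and cost are hypothetical.
df = pd.read_csv("campaign_results.csv")  # assumed columns: group, spend

avg = df.groupby("group")["spend"].mean()
n_treated = (df["group"] == "treatment").sum()

incremental_per_customer = avg["treatment"] - avg["control"]
incremental_total = incremental_per_customer * n_treated

campaign_cost = 50_000  # assumed, for illustration only
roi = incremental_total / campaign_cost

print(f"Incremental spend per customer: {incremental_per_customer:.2f}")
print(f"Estimated ROI multiple: {roi:.2f}x")
```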
Posted 1 month ago
10.0 - 15.0 years
50 - 55 Lacs
Gurugram
Work from Office
It's no secret that Amazon relies on its technology to deliver millions of packages every day to its customers, on time and at low cost. Our Transportation Technology division builds the complex software solutions that work across our vendors, warehouses, and carriers to optimize both the time and cost of getting packages delivered. Our services already handle thousands of requests per second, make business decisions impacting billions of dollars a year, integrate with a network of small and large carriers worldwide, manage business rules for millions of unique products, and improve the ordering and delivery experience for millions of online shoppers. This remains a fast-growing business, and our technical journey has only started. With rapid expansion into new geographies, innovations in supply chain, unique delivery models for products ranging from fresh groceries to big-screen TVs, an increasingly complex transportation network, and a growing number of shipments worldwide, we see a brand-new opportunity to fundamentally change the way people get the stuff they need, and to make a big impact by cutting billions of dollars of transportation costs from the ecosystem. Our mission is to build the most efficient and optimal transportation system on the planet, using our engineering muscle as our biggest advantage. We aim to leverage the latest technologies in big data, machine learning, and optimization, and to operate high-volume, low-latency, and high-availability services.

- 10+ years of engineering experience
- 5+ years of engineering team management experience
- 10+ years of planning, designing, developing, and delivering consumer software experience
- Experience partnering with product and program management teams
- Experience managing multiple concurrent programs, projects, and development teams in an Agile environment
- Experience designing and developing large-scale, high-traffic applications
Posted 1 month ago
16.0 - 21.0 years
9 - 14 Lacs
Hyderabad
Work from Office
The Optum Technology Digital team is on a mission to disrupt the healthcare industry, transforming UHG into an industry-leading consumer brand. We deliver hyper-personalized digital solutions that empower direct-to-consumer, digital-first experiences, educating, guiding, and empowering consumers to access the right care at the right time. Our mission is to revolutionize healthcare for patients and providers by delivering cutting-edge, personalized, and conversational digital solutions. We're Consumer Obsessed, ensuring consumers receive exceptional support throughout their healthcare journeys. As we drive this transformation, we're revolutionizing customer interactions with the healthcare system, leveraging AI, cloud computing, and other disruptive technologies to tackle complex challenges. Serving UnitedHealth Group's digital technology needs, the Consumer Engineering team impacts millions of lives through UnitedHealthcare & Optum. We are seeking a dynamic individual who embodies modern engineering culture – someone with deep engineering expertise within a digital product model, a passion for innovation, and a relentless drive to enhance the consumer experience. Our ideal candidate thrives in an agile, fast-paced, rapid-prototyping environment, embraces DevOps and continuous integration/continuous deployment (CI/CD) practices, and champions the Voice of the Customer. If you are driven by the pursuit of excellence, eager to innovate, and excited to make a tangible impact within a team that embraces modern technologies and consumer-centric strategies, while prioritizing robust cyber-security protocols, we invite you to explore this exciting opportunity with us. Join our team and be at the forefront of shaping the future of healthcare, where your unique skills will not only be recognized but celebrated.
Primary Responsibilities:
- Run the production environment by monitoring availability and taking a holistic view of system health (a probe sketch follows at the end of this posting).
- Build software and systems to manage platform infrastructure and applications.
- Improve the reliability, quality, and time-to-market of our suite of software solutions.
- Measure and optimize system performance, with an eye toward pushing our capabilities forward, getting ahead of customer needs, and innovating for continual improvement.
- Provide primary operational support and engineering for multiple large distributed software applications.
- Estimate: Create, understand, and validate designs and estimated effort for a given module/task, and be able to justify them.
- Mentor: Coach a high-performing engineering team to build and deliver healthcare products to market.
- Operations: Possess or acquire solid troubleshooting skills and an interest in diagnosing issues across disparate technologies and environments.
- Thought Leadership: Propose and implement best-in-class architectural solutions for big, complex systems.
- Engineering: Implement and adhere to best engineering practices such as design, unit testing, functional test automation, and continuous integration and delivery.
- Stakeholder Management: Demonstrate excellent communication skills, clarity of thought, and the ability to make decisions based on available information.
- Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies regarding flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary, or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so.

Required Qualifications:
- B.Tech/MCA/MSc/MTech (16+ years of formal education; correspondence courses are not relevant)
- 12+ years of work experience in product companies
- Experience with distributed storage technologies such as NFS, HDFS, Ceph, and S3, as well as dynamic resource management frameworks (Mesos, Kubernetes, YARN)
- Solid experience in Core Java, Spring, Struts, and Hibernate/Spring Data JPA
- Experience in GraphQL using Java Spring
- Experience in reactive programming
- Experience with RDBMS or NoSQL databases
- Experience with GitHub Actions
- Experience with Docker and Kubernetes
- Expertise in microservice architecture
- Experience in React JS
- Relevant experience using state management libraries such as Redux
- Expertise in using Saga, Thunk, or similar patterns
- Exposure to the agile development process
- Proven ability to program (structured and OO) with one or more high-level languages, such as Python, Java, C/C++, or Ruby

Preferred Qualifications:
- Work experience in Agile/Scrum methodology
- Work experience in product engineering
- Knowledge of Snowflake and big data
- Knowledge of SAFe
- Knowledge of the US healthcare domain in general, and Payment Integrity in particular

At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone – of every race, gender, sexuality, age, location, and income – deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health, which are disproportionately experienced by people of color, historically marginalized groups, and those with lower incomes. We are committed to mitigating our impact on the environment and to enabling and delivering equitable care that addresses health disparities and improves health outcomes – an enterprise priority reflected in our mission.
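As a small, hypothetical illustration of the availability-monitoring responsibility above, here is a minimal Python sketch of an HTTP health-check probe; the service names and endpoints are invented for the example.

```python
import urllib.error
import urllib.request

# Minimal sketch of an availability probe: hit each service's health endpoint
# and report failures. Service names and URLs are hypothetical.
ENDPOINTS = {
    "member-api": "https://member-api.example.internal/health",
    "claims-api": "https://claims-api.example.internal/health",
}


def check_all(timeout_s: float = 2.0) -> dict[str, bool]:
    status = {}
    for name, url in ENDPOINTS.items():
        try:
            with urllib.request.urlopen(url, timeout=timeout_s) as resp:
                status[name] = resp.status == 200
        except (urllib.error.URLError, TimeoutError):
            status[name] = False
    return status


if __name__ == "__main__":
    for name, healthy in check_all().items():
        print(f"{name}: {'UP' if healthy else 'DOWN'}")
```

In practice a probe like this would feed an alerting or dashboarding system rather than print to stdout, but the shape of the check is the same.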
Posted 1 month ago
6.0 - 11.0 years
12 - 16 Lacs
Bengaluru
Work from Office
About the Role:
This role is responsible for managing and maintaining complex, distributed big data ecosystems. It ensures the reliability, scalability, and security of large-scale production infrastructure. Key responsibilities include automating processes, optimizing workflows, troubleshooting production issues, and driving system improvements across multiple business verticals.

Roles and Responsibilities:
- Manage, maintain, and support incremental changes to Linux/Unix environments.
- Lead on-call rotations and incident responses, conducting root cause analysis and driving postmortem processes.
- Design and implement automation systems for managing big data infrastructure, including provisioning, scaling, upgrading, and patching clusters.
- Troubleshoot and resolve complex production issues while identifying root causes and implementing mitigating strategies.
- Design and review scalable and reliable system architectures.
- Collaborate with teams to optimize overall system performance.
- Enforce security standards across systems and infrastructure.
- Set technical direction, drive standardization, and operate independently.
- Ensure availability, performance, and scalability of systems and services through proactive monitoring, maintenance, and capacity planning.
- Analyze and respond to system outages and disruptions, and implement measures to prevent similar incidents from recurring.
- Develop tools and scripts to automate operational processes, reducing manual workload, increasing efficiency, and improving system resilience (a small sketch follows below).
- Monitor and optimize system performance and resource usage, identify and address bottlenecks, and implement best practices for performance tuning.
- Collaborate with development teams to integrate best practices for reliability, scalability, and performance into the software development lifecycle.
- Stay informed of industry technology trends and innovations, and actively contribute to the organization's technology communities.
- Develop and enforce SRE best practices and principles.
- Align across functional teams on priorities and deliverables.
- Drive automation to enhance operational efficiency.

Skills Required:
- Over 6 years of experience managing and maintaining distributed big data ecosystems.
- Strong expertise in Linux, including IP networking, iptables, and IPsec.
- Proficiency in scripting/programming with languages such as Perl, Golang, or Python.
- Hands-on experience with the Hadoop stack (HDFS, HBase, Airflow, YARN, Ranger, Kafka, Pinot).
- Familiarity with open-source configuration management and deployment tools such as Puppet, Salt, Chef, or Ansible.
- Solid understanding of networking, open-source technologies, and related tools.
- Excellent communication and collaboration skills.
- DevOps tools: Saltstack, Ansible, Docker, Git.
- SRE logging and monitoring tools: ELK stack, Grafana, Prometheus, OpenTSDB, OpenTelemetry.

Good to Have:
- Experience managing infrastructure on public cloud platforms (AWS, Azure, GCP).
- Experience designing and reviewing system architectures for scalability and reliability.
- Experience with observability tools to visualize and alert on system performance.
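As a toy illustration of the operational automation described above, here is a minimal Python sketch that flags mount points crossing a disk usage threshold; the watched paths and the threshold are assumptions for the example.

```python
import shutil

# Minimal sketch of an operational automation script: warn when any watched
# mount point exceeds a usage threshold. Paths and threshold are hypothetical;
# "/data/hdfs" stands in for a datanode volume and may not exist locally.
WATCHED_MOUNTS = ["/", "/data/hdfs"]
THRESHOLD = 0.85  # alert above 85% usage


def check_disks() -> list[str]:
    alerts = []
    for mount in WATCHED_MOUNTS:
        try:
            usage = shutil.disk_usage(mount)
        except FileNotFoundError:
            alerts.append(f"{mount} not found")
            continue
        used_fraction = usage.used / usage.total
        if used_fraction > THRESHOLD:
            alerts.append(f"{mount} at {used_fraction:.0%} used")
    return alerts


if __name__ == "__main__":
    for alert in check_disks():
        print("ALERT:", alert)  # in practice this would page or post to chat
```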
PhonePe Full-Time Employee Benefits (not applicable for intern or contract roles):
- Insurance Benefits - Medical Insurance, Critical Illness Insurance, Accidental Insurance, Life Insurance
- Wellness Program - Employee Assistance Program, Onsite Medical Center, Emergency Support System
- Parental Support - Maternity Benefit, Paternity Benefit Program, Adoption Assistance Program, Day-care Support Program
- Mobility Benefits - Relocation Benefits, Transfer Support Policy, Travel Policy
- Retirement Benefits - Employee PF Contribution, Flexible PF Contribution, Gratuity, NPS, Leave Encashment
- Other Benefits - Higher Education Assistance, Car Lease, Salary Advance Policy

Working at PhonePe is a rewarding experience! Great people, a work environment that thrives on creativity, and the opportunity to take on roles beyond a defined job description are just some of the reasons you should work with us. Read more about PhonePe on our blog.
Posted 1 month ago