
24,278 ETL Jobs - Page 27

Set up a Job Alert
JobPe aggregates job listings for easy access, but you apply directly on the original job portal.

5.0 years

0 Lacs

India

On-site

What You'll Do
Avalara is an AI-first company. We expect every employee to actively leverage AI to enhance productivity, quality, innovation, and customer value. AI is embedded in our workflows and products — and success at Avalara requires embracing AI as an essential capability, not an optional tool.
We are looking for an experienced Oracle Cloud ERP Techno-Functional Consultant to join our team. You have experience with Oracle Cloud ERP and Oracle EBS, specifically the Order to Cash, Procure to Pay, and Tax modules. You have an understanding of core Oracle technology, Oracle business processes, and multiple integration tools, and the ability to collaborate with partners. You will be reporting to the Senior Technical Lead.

What Your Responsibilities Will Be
Technical Expertise: Programming skills in relevant technologies such as Java, SQL, PL/SQL, XML, RESTful APIs, JavaScript, ADF, and web services. Develop custom solutions, extensions, and integrations to meet our specific requirements.
Reporting and Analytics: Proficiency in creating custom reports, dashboards, and analytics using Oracle BI Publisher, Oracle OTBI (Oracle Transactional Business Intelligence), and other reporting tools. Experience reviewing code to find and address potential issues and defects; hands-on experience with BI Publisher, OTBI, and Data Models.
Data Integration and Migration: Experience in data integration between Oracle Fusion applications and other systems, and in data migration from legacy systems to Oracle Fusion. Knowledge of ETL (Extract, Transform, Load) tools.
Customization and Extensions: Customize and extend Oracle Fusion applications using tools like Oracle JDeveloper, Oracle ADF, and Oracle Application Composer to tailor the software to business needs.
Oracle Fusion Product Knowledge: Expertise in Oracle Fusion Financials, Oracle Fusion SCM (Supply Chain Management), Oracle Fusion Procurement, and Oracle Fusion Tax.
Security and Access Control: Knowledge of security models, user roles, and access controls within Oracle Fusion applications to ensure data integrity and compliance.
Performance Tuning and Optimization: Skills in diagnosing and resolving performance issues, optimizing database queries, and ensuring smooth operation of Oracle Fusion applications.
Problem Troubleshooting: Experience approaching a problem from different angles and analyzing the pros and cons of different solutions to identify and address technical issues, system errors, and integration challenges. Experience communicating updates and resolutions to customers and other partners, working with clients to gather requirements, explaining technical solutions to non-technical stakeholders, and collaborating with teams.

What You’ll Need To Be Successful
5+ years of experience with Oracle Cloud ERP Financials.
5+ years of experience with Oracle EBS Financials.
Bachelor's degree in Computer Science, Information Technology, or a related field.
Previous experience implementing Tax modules in Oracle Cloud ERP and Oracle EBS.
Experience and desire to work in a global delivery environment.
Experience with the latest integration methodologies.
Proficiency in CI/CD tools (Jenkins, GitLab, etc.).

How We’ll Take Care Of You
Total Rewards: In addition to a great compensation package, paid time off, and paid parental leave, many Avalara employees are eligible for bonuses.
Health & Wellness: Benefits vary by location but generally include private medical, life, and disability insurance.
Inclusive Culture and Diversity
Avalara strongly supports diversity, equity, and inclusion, and is committed to integrating them into our business practices and our organizational culture. We also have a total of 8 employee-run resource groups, each with senior leadership and exec sponsorship.

What You Need To Know About Avalara
We’re defining the relationship between tax and tech. We’ve already built an industry-leading cloud compliance platform, processing over 54 billion customer API calls and over 6.6 million tax returns a year. Our growth is real - we're a billion-dollar business - and we’re not slowing down until we’ve achieved our mission - to be part of every transaction in the world. We’re bright, innovative, and disruptive, like the orange we love to wear. It captures our quirky spirit and optimistic mindset. It shows off the culture we’ve designed, one that empowers our people to win. We’ve been different from day one. Join us, and your career will be too.

We’re An Equal Opportunity Employer
Supporting diversity and inclusion is a cornerstone of our company — we don’t want people to fit into our culture, but to enrich it. All qualified candidates will receive consideration for employment without regard to race, color, creed, religion, age, gender, national origin, disability, sexual orientation, US Veteran status, or any other factor protected by law. If you require any reasonable adjustments during the recruitment process, please let us know.

Posted 5 days ago

Apply

0 years

0 Lacs

India

On-site

The ideal candidate will be responsible for developing high-quality applications and for designing and implementing testable, scalable code.

Responsibilities:
Lead backend Python development for innovative healthcare technology solutions.
Oversee a backend team to achieve product and platform goals in the B2B HealthTech domain.
Design and implement scalable backend infrastructures with seamless API integration.
Ensure availability on immediate or short notice for efficient onboarding and project ramp-up.
Optimize existing backend systems based on real-time healthcare data requirements.
Collaborate with cross-functional teams to ensure alignment between tech and business goals.
Review and refine code for quality, scalability, and performance improvements.

Ideal Candidate:
Experienced in building B2B software products using agile methodologies.
Strong proficiency in Python, with a deep understanding of backend system architecture.
Comfortable with fast-paced environments and quick onboarding cycles.
Strong communicator who fosters a culture of innovation, ownership, and collaboration.
Passionate about driving real-world healthcare impact through technology.

Skills Required:
Primary: TypeScript, AWS, Python, RESTful APIs, Backend Architecture.
Additional: SQL/NoSQL databases, Docker/Kubernetes (preferred).
Good to Have (strongly preferred): Prior experience in Data Engineering, especially in healthcare or real-time analytics. Familiarity with ETL pipelines, data lake/warehouse solutions, and stream processing frameworks (e.g., Apache Kafka, Spark, Airflow). Understanding of data privacy, compliance (e.g., HIPAA), and secure data handling practices.

Hiring Process: Profile Shortlisting, Tech Interview, Tech Interview, Culture Fit.
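As a rough, non-authoritative illustration of the backend work this posting describes, here is a minimal Python sketch of a REST endpoint for ingesting health readings. FastAPI is used only as a convenient example framework; the route, field names, and in-memory store are hypothetical and not part of the role or the client's stack.

```python
# Minimal sketch of a backend REST endpoint; framework choice, route names,
# and fields are illustrative assumptions only.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class VitalsReading(BaseModel):
    patient_id: str
    heart_rate: int
    recorded_at: str  # ISO-8601 timestamp string

_readings = []  # naive in-memory store, stand-in for a real database

@app.post("/vitals")
def ingest_reading(reading: VitalsReading) -> dict:
    """Accept a vitals reading and keep it for downstream analytics."""
    _readings.append(reading)
    return {"status": "accepted", "count": len(_readings)}

@app.get("/vitals/{patient_id}")
def latest_reading(patient_id: str) -> dict:
    """Return the most recent reading for a patient, if any."""
    matches = [r for r in _readings if r.patient_id == patient_id]
    return {"latest": matches[-1].dict() if matches else None}
```

Run with any ASGI server (for example `uvicorn app:app`, assuming the file is named app.py) to try the endpoints locally.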

Posted 5 days ago

Apply

9.0 years

0 Lacs

India

On-site

What You'll Do
Avalara is an AI-first company. We expect every engineer, manager, and leader to actively leverage AI to enhance productivity, quality, innovation, and customer value. AI is embedded in our workflows, decision-making, and products — and success at Avalara requires embracing AI as an essential capability, not an optional tool.
We are seeking an experienced AI & Machine Learning Technical Manager to lead our dynamic team in developing cutting-edge AI & ML solutions. This role is perfect for someone passionate about applying AI and ML, promoting innovation, and creating impactful products. You will be responsible for our AI systems (conversational agents, tax code classification for products and services, document intelligence, etc.) and how we apply them at Avalara to simplify and scale tax compliance across our entire portfolio of products. As an important part of our leadership team, you will shape the future of our AI & ML projects, manage a talented team of AI professionals, and collaborate with other teams to implement projects. We offer the chance to work on pioneering AI technologies, mentor a team of experts, and contribute to the strategic direction of our AI & ML endeavors. This role reports to the Sr. Director of AI & ML.

What Your Responsibilities Will Be
Lead and manage a team of AI & ML engineers and data scientists, overseeing project lifecycles from conception to deployment and ensuring timely delivery.
Develop team members, providing guidance on technical challenges, career development, and professional growth opportunities.
Stay up to date with the latest AI & ML technologies and methodologies, incorporating new approaches into our projects to maintain competitive advantage.
Develop our AI & ML strategy, aligning it with our objectives and ensuring the team's projects support this vision.
Collaborate with other teams, including product management, engineering, and design, to define project requirements, set priorities, and allocate resources effectively.
Foster a culture of innovation, encouraging experimentation and learning, and leading by example in adopting a hands-on approach to problem-solving.
Ensure best practices in project management, software development, and quality assurance are implemented to optimize team performance and productivity.
Manage stakeholder communications, providing regular updates on project status, important milestones, and any challenges or risks, ensuring alignment and support across the organization.

What You’ll Need To Be Successful
Specific Qualifications:
Expertise in AI technologies and methodologies, with a portfolio of projects demonstrating your ability to apply these in solving complex problems.
Experience building and deploying to production APIs powered by AI & Machine Learning systems.
Proficiency in programming languages relevant to AI & ML, such as Python, R, Java, Scala, and C++, and familiarity with AI & ML frameworks and libraries (e.g., PyTorch, TensorFlow, and Scikit-learn).
Experience with cloud computing platforms (AWS, Azure, Google Cloud) and an understanding of how to use these for scalable, secure, reliable distributed systems with complex workflows relying on AI & ML solutions.
Background in data engineering and familiarity with database technologies, as well as data processing/ETL pipelines and visualization tools.
General Qualifications:
Bachelor's degree in Computer Science, Artificial Intelligence, or Machine Learning.
9 years of experience in AI & ML, with at least 6 years in a management position overseeing technical teams.
Ability to translate complex technical concepts and challenges into clear strategic plans and applicable solutions.
People management skills, with experience mentoring and developing teams.
Excellent project management skills, with experience in agile methodologies.

How We’ll Take Care Of You
Total Rewards: In addition to a great compensation package, paid time off, and paid parental leave, many Avalara employees are eligible for bonuses.
Health & Wellness: Benefits vary by location but generally include private medical, life, and disability insurance.
Inclusive Culture and Diversity: Avalara strongly supports diversity, equity, and inclusion, and is committed to integrating them into our business practices and our organizational culture. We also have a total of 8 employee-run resource groups, each with senior leadership and exec sponsorship.

What You Need To Know About Avalara
We’re defining the relationship between tax and tech. We’ve already built an industry-leading cloud compliance platform, processing over 54 billion customer API calls and over 6.6 million tax returns a year. Our growth is real - we're a billion-dollar business - and we’re not slowing down until we’ve achieved our mission - to be part of every transaction in the world. We’re bright, innovative, and disruptive, like the orange we love to wear. It captures our quirky spirit and optimistic mindset. It shows off the culture we’ve designed, one that empowers our people to win. We’ve been different from day one. Join us, and your career will be too.

We’re An Equal Opportunity Employer
Supporting diversity and inclusion is a cornerstone of our company — we don’t want people to fit into our culture, but to enrich it. All qualified candidates will receive consideration for employment without regard to race, color, creed, religion, age, gender, national origin, disability, sexual orientation, US Veteran status, or any other factor protected by law. If you require any reasonable adjustments during the recruitment process, please let us know.
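Since the posting mentions tax code classification among the team's AI systems, here is a small, hedged scikit-learn sketch of the general idea of classifying product descriptions into categories. The toy data, labels, and model choice are illustrative assumptions and not Avalara's actual approach.

```python
# Toy text-classification sketch: product description -> (hypothetical) category.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

descriptions = [
    "mens leather running shoes",
    "cloud accounting software subscription",
    "organic whole grain bread",
    "annual SaaS license for CRM platform",
]
labels = ["apparel", "software", "food", "software"]

model = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2)),   # word and bigram features
    LogisticRegression(max_iter=1000),
)
model.fit(descriptions, labels)

# likely predicts 'software' for a description close to the training examples
print(model.predict(["subscription to cloud based tax software"]))
```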

Posted 5 days ago

Apply

5.0 years

0 Lacs

Kanayannur, Kerala, India

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

Job Description: Salesforce Senior Developer
Experience: Total: 5+ years; Relevant: 3+ years

Responsibilities:
Meet with clients to determine business, functional and technical requirements, and participate in application design, configuration, testing and deployment.
Perform configuration and customization of the Salesforce.com platform.
Participate in efforts to develop and execute testing, training and documentation.
Participate in the sales cycle as needed (solution definition, pre-sales, estimating and project planning).
Be hands-on in producing tangible deliverables (requirements specifications, design deliverables, status reports, project plans).
Proactively engage in continuous improvement efforts for application design, support, and practice development.
Provide technical assistance and end-user troubleshooting for bug fixes, enhancements, and “how-to” assistance.
Perform regular reviews of implementations done by less experienced developers and offer feedback and suggestions on their code.
Mentor the junior and mid-level developers on the team, and delegate tasks to team members in a balanced and effective manner.
Set up a development environment independently and mentor a team of junior developers.
Independently communicate with both client technical teams and business owners as needed during design and implementation.

Knowledge and Skills:
3+ years of experience working on Salesforce platforms.
At least the Salesforce Platform Developer I certification.
Direct experience working on CRM projects for middle-market and enterprise-size companies.
Working knowledge and experience with complex business systems integration as well as object-oriented design patterns and development.
Software engineering skills with the Force.com platform (Apex, LWC, SOQL, unit testing).
Experience in core web technologies including HTML5, JavaScript and jQuery.
Demonstrated experience and knowledge of relational databases, data modelling, and ETL tools.
Experience with web services (REST & SOAP, JSON & XML, etc.).
Experience with Agile development methodologies such as Scrum.
Excellent organizational, verbal and written communication skills.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 5 days ago

Apply

1.0 - 3.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

About the Team
The Analytics team at Navi plays a pivotal role in driving data-informed decision-making across the organization. We work closely with all business functions to generate insights, build visibility, and validate hypotheses that shape strategy and execution. Our work spans five key verticals: Business and Product Analytics, Credit & Credit Risk Analytics, Collections & Biz-fin Analytics, Self-Serve, and Automation.

About the Role
The Data Analyst is a key role within the Navi central analytics team, bridging the gap between business teams and data. Responsibilities centre on creating visibility into business KPIs and building essential reports and dashboards for key business metrics. You will also leverage automation for operational efficiency and use data modeling to build a strong data infrastructure for easy access across teams. This is an entry-level position within the team where you will get the opportunity to work with highly motivated colleagues to build data-based solutions that have direct business impact.

Must Haves
● Bachelor's degree in Technology
● 1-3 years of proven experience as an Analyst
● Proficiency in data analysis tools such as Excel and SQL, and in data visualization tools (e.g., Tableau, Power BI)
● Basic understanding of data platforms and ETL processes
● Ability to interpret complex data sets, with good attention to detail
● Problem-solving skills
● (Good to have) Prior data wrangling experience in R/Python

What We Expect From You
Data Management:
○ Collect, organize, and maintain large datasets from multiple sources.
○ Ensure data accuracy, completeness, and consistency.
○ Develop and implement data quality standards and procedures (see the short sketch after this listing).
Analysis and Reporting:
○ Use SQL queries, VLOOKUP, pivot tables, and other Excel functions to analyze financial and operational data.
○ Prepare regular and ad-hoc reports for management, highlighting key performance metrics, trends, and insights.
○ Identify areas for improvement and recommend actionable solutions based on data analysis.
Dashboard Development:
○ Design and develop interactive dashboards using Tableau, Excel, or specialized BI tools.
○ Customize dashboards to meet specific departmental or stakeholder requirements.
○ Continuously monitor and update dashboards to reflect the latest data and insights.
Process Optimization and Stakeholder Management:
○ Collaborate with cross-functional teams to streamline data collection and reporting processes.
○ Identify inefficiencies and bottlenecks in existing workflows and propose enhancements.
○ Implement automation solutions to improve efficiency and accuracy in data management and reporting.

Inside Navi
We are shaping the future of financial services for a billion Indians through products that are simple, accessible, and affordable. From Personal & Home Loans to UPI, Insurance, Mutual Funds, and Gold — we’re building tech-first solutions that work at scale, with a strong customer-first approach. Founded by Sachin Bansal & Ankit Agarwal in 2018, we are one of India’s fastest-growing financial services organisations. But we’re just getting started!

Our Culture
The Navi DNA: Ambition. Perseverance. Self-awareness. Ownership. Integrity.
We’re looking for people who dream big when it comes to innovation. At Navi, you’ll be empowered with the right mechanisms to work in a dynamic team that builds and improves innovative solutions. If you’re driven to deliver real value to customers, no matter the challenge, this is the place for you.
We chase excellence by uplifting each other—and that starts with every one of us.

Why You'll Thrive at Navi
At Navi, it’s about how you think, build, and grow. You’ll thrive here if:
You’re impact-driven: You take ownership, build boldly, and care about making a real difference.
You strive for excellence: Good isn’t good enough. You bring focus, precision, and a passion for quality.
You embrace change: You adapt quickly, move fast, and always put the customer first.
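The sketch referenced in the posting above is a small, self-contained pandas example of the data-quality checks and KPI aggregation a data analyst might script; the column names, values, and thresholds are hypothetical.

```python
# Illustrative data-quality check plus KPI aggregation with pandas.
import pandas as pd

loans = pd.DataFrame({
    "loan_id": [101, 102, 103, 104],
    "segment": ["personal", "home", "personal", "home"],
    "disbursed_amount": [250000, 4800000, None, 3100000],
    "status": ["active", "active", "active", "closed"],
})

# completeness and duplicate checks
quality_report = {
    "rows": len(loans),
    "null_disbursed_amount": int(loans["disbursed_amount"].isna().sum()),
    "duplicate_loan_ids": int(loans["loan_id"].duplicated().sum()),
}

# simple KPI: disbursed amount by segment for active loans
kpi = (
    loans[loans["status"] == "active"]
    .groupby("segment")["disbursed_amount"]
    .sum()
    .rename("active_disbursed_amount")
)

print(quality_report)
print(kpi)
```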

Posted 5 days ago

Apply

5.0 years

0 Lacs

Trivandrum, Kerala, India

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

Job Description: Salesforce Senior Developer
Experience: Total: 5+ years; Relevant: 3+ years

Responsibilities:
Meet with clients to determine business, functional and technical requirements, and participate in application design, configuration, testing and deployment.
Perform configuration and customization of the Salesforce.com platform.
Participate in efforts to develop and execute testing, training and documentation.
Participate in the sales cycle as needed (solution definition, pre-sales, estimating and project planning).
Be hands-on in producing tangible deliverables (requirements specifications, design deliverables, status reports, project plans).
Proactively engage in continuous improvement efforts for application design, support, and practice development.
Provide technical assistance and end-user troubleshooting for bug fixes, enhancements, and “how-to” assistance.
Perform regular reviews of implementations done by less experienced developers and offer feedback and suggestions on their code.
Mentor the junior and mid-level developers on the team, and delegate tasks to team members in a balanced and effective manner.
Set up a development environment independently and mentor a team of junior developers.
Independently communicate with both client technical teams and business owners as needed during design and implementation.

Knowledge and Skills:
3+ years of experience working on Salesforce platforms.
At least the Salesforce Platform Developer I certification.
Direct experience working on CRM projects for middle-market and enterprise-size companies.
Working knowledge and experience with complex business systems integration as well as object-oriented design patterns and development.
Software engineering skills with the Force.com platform (Apex, LWC, SOQL, unit testing).
Experience in core web technologies including HTML5, JavaScript and jQuery.
Demonstrated experience and knowledge of relational databases, data modelling, and ETL tools.
Experience with web services (REST & SOAP, JSON & XML, etc.).
Experience with Agile development methodologies such as Scrum.
Excellent organizational, verbal and written communication skills.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 5 days ago

Apply

8.0 - 12.0 years

0 Lacs

Pune, Maharashtra, India

On-site

About VOIS
VOIS (Vodafone Intelligent Solutions) is a strategic arm of Vodafone Group Plc, creating value and enhancing quality and efficiency across 28 countries, and operating from 7 locations: Albania, Egypt, Hungary, India, Romania, Spain and the UK. Over 29,000 highly skilled individuals are dedicated to being Vodafone Group’s partner of choice for talent, technology, and transformation. We deliver the best services across IT, Business Intelligence Services, Customer Operations, Business Operations, HR, Finance, Supply Chain, HR Operations, and many more. Established in 2006, VOIS has evolved into a global, multi-functional organisation, a Centre of Excellence for Intelligent Solutions focused on adding value and delivering business outcomes for Vodafone.

VOIS India
In 2009, VOIS started operating in India and now has established global delivery centres in Pune, Bangalore and Ahmedabad. With more than 14,500 employees, VOIS India supports global markets and group functions of Vodafone, and delivers best-in-class customer experience through multi-functional services in the areas of Information Technology, Networks, Business Intelligence and Analytics, Digital Business Solutions (Robotics & AI), Commercial Operations (Consumer & Business), Intelligent Operations, Finance Operations, Supply Chain Operations, HR Operations and more.

Experience And Skills
Location: Pune
Working Persona: Hybrid
Experience: 8 to 12 years

Primary Skills
Strong experience in data analysis, gap analysis, data profiling and requirement elicitation, with the ability to clearly document business requirements.

Alternate Skills
Good knowledge of the Business Analysis life cycle.

Must-have technical / professional qualifications:
Experienced in requirements engineering and analysis, specifically the challenge and validation of data capability and reporting requirements.
Knowledge of DWH basics and exposure to BI projects.
RDBMS basics with knowledge of SQL, especially Teradata, WhereScape and ETL tools.

Essential core competencies, knowledge, and experience:
Good in requirement elicitation.
Good telecom knowledge.
Good technical knowledge.
Good communication; able to manage priorities and stakeholders.

Experience
This position requires strong knowledge of BI and Data Warehousing and working knowledge of databases like Teradata/Oracle. Strong exposure to data analysis, gap analysis, data profiling, and clearly documenting requirements, with good knowledge of the Business Analysis life cycle and very strong and clear communication skills. Candidates should have experience in Agile projects, preferably the SAFe methodology.

VOIS Equal Opportunity Employer Commitment
VOIS is proud to be an Equal Employment Opportunity Employer. We celebrate differences and we welcome and value diverse people and insights. We believe that being authentically human and inclusive powers our employees’ growth and enables them to create a positive impact on themselves and society. We do not discriminate based on age, colour, gender (including pregnancy, childbirth, or related medical conditions), gender identity, gender expression, national origin, race, religion, sexual orientation, status as an individual with a disability, or other applicable legally protected characteristics.
As a result of living and breathing our commitment, our employees have helped us get certified as a Great Place to Work in India for four years running. We have also been highlighted among the Top 5 Best Workplaces for Diversity, Equity, and Inclusion, Top 10 Best Workplaces for Women, Top 25 Best Workplaces in IT & IT-BPM, and 14th Overall Best Workplace in India by the Great Place to Work Institute in 2023. These achievements position us among a select group of trustworthy and high-performing companies that put their employees at the heart of everything they do. By joining us, you become part of that commitment. We look forward to welcoming you into our family, which represents a variety of cultures, backgrounds, perspectives, and skills! Apply now, and we’ll be in touch!

Posted 5 days ago

Apply

5.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

Job Description: Salesforce Senior Developer
Experience: Total: 5+ years; Relevant: 3+ years

Responsibilities:
Meet with clients to determine business, functional and technical requirements, and participate in application design, configuration, testing and deployment.
Perform configuration and customization of the Salesforce.com platform.
Participate in efforts to develop and execute testing, training and documentation.
Participate in the sales cycle as needed (solution definition, pre-sales, estimating and project planning).
Be hands-on in producing tangible deliverables (requirements specifications, design deliverables, status reports, project plans).
Proactively engage in continuous improvement efforts for application design, support, and practice development.
Provide technical assistance and end-user troubleshooting for bug fixes, enhancements, and “how-to” assistance.
Perform regular reviews of implementations done by less experienced developers and offer feedback and suggestions on their code.
Mentor the junior and mid-level developers on the team, and delegate tasks to team members in a balanced and effective manner.
Set up a development environment independently and mentor a team of junior developers.
Independently communicate with both client technical teams and business owners as needed during design and implementation.

Knowledge and Skills:
3+ years of experience working on Salesforce platforms.
At least the Salesforce Platform Developer I certification.
Direct experience working on CRM projects for middle-market and enterprise-size companies.
Working knowledge and experience with complex business systems integration as well as object-oriented design patterns and development.
Software engineering skills with the Force.com platform (Apex, LWC, SOQL, unit testing).
Experience in core web technologies including HTML5, JavaScript and jQuery.
Demonstrated experience and knowledge of relational databases, data modelling, and ETL tools.
Experience with web services (REST & SOAP, JSON & XML, etc.).
Experience with Agile development methodologies such as Scrum.
Excellent organizational, verbal and written communication skills.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 5 days ago

Apply

7.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

7+ years of experience in data engineering or an equivalent technical role.
5+ years of hands-on experience with AWS cloud development and DevOps.
Strong expertise in SQL, data modeling, and ETL/ELT pipelines.
Deep experience with Oracle (PL/SQL, performance tuning, data extraction).
Proficiency in Python and/or Scala for data processing tasks.
Strong knowledge of cloud infrastructure (networking, security, cost optimization).
Experience with infrastructure as code (Terraform).
Familiarity with CI/CD pipelines and DevOps tooling (e.g., Jenkins, GitHub Actions).
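As a hedged illustration of the SQL-plus-Python ETL work listed above, the sketch below extracts rows past a watermark, applies a trivial transform, and loads them into a target table. SQLite stands in for Oracle purely so the snippet runs anywhere; the table names, columns, and watermark value are hypothetical.

```python
# Minimal watermark-based extract / transform / load sketch (SQLite as a stand-in).
import sqlite3

src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE orders (id INTEGER, amount REAL, updated_at TEXT)")
src.executemany("INSERT INTO orders VALUES (?, ?, ?)", [
    (1, 120.0, "2024-01-01"), (2, 75.5, "2024-01-03"), (3, 300.0, "2024-01-05"),
])

tgt = sqlite3.connect(":memory:")
tgt.execute("CREATE TABLE orders_clean (id INTEGER, amount_rounded INTEGER, updated_at TEXT)")

watermark = "2024-01-02"  # in a real pipeline this would come from a control table

# Extract only rows newer than the watermark
rows = src.execute(
    "SELECT id, amount, updated_at FROM orders WHERE updated_at > ?", (watermark,)
).fetchall()

# Transform: trivial example, round amounts to whole units
clean = [(row_id, round(amount), updated_at) for row_id, amount, updated_at in rows]

# Load into the target table
tgt.executemany("INSERT INTO orders_clean VALUES (?, ?, ?)", clean)
tgt.commit()
print(tgt.execute("SELECT * FROM orders_clean").fetchall())
```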

Posted 5 days ago

Apply

5.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

Job Description: Salesforce Senior Developer
Experience: Total: 5+ years; Relevant: 3+ years

Responsibilities:
Meet with clients to determine business, functional and technical requirements, and participate in application design, configuration, testing and deployment.
Perform configuration and customization of the Salesforce.com platform.
Participate in efforts to develop and execute testing, training and documentation.
Participate in the sales cycle as needed (solution definition, pre-sales, estimating and project planning).
Be hands-on in producing tangible deliverables (requirements specifications, design deliverables, status reports, project plans).
Proactively engage in continuous improvement efforts for application design, support, and practice development.
Provide technical assistance and end-user troubleshooting for bug fixes, enhancements, and “how-to” assistance.
Perform regular reviews of implementations done by less experienced developers and offer feedback and suggestions on their code.
Mentor the junior and mid-level developers on the team, and delegate tasks to team members in a balanced and effective manner.
Set up a development environment independently and mentor a team of junior developers.
Independently communicate with both client technical teams and business owners as needed during design and implementation.

Knowledge and Skills:
3+ years of experience working on Salesforce platforms.
At least the Salesforce Platform Developer I certification.
Direct experience working on CRM projects for middle-market and enterprise-size companies.
Working knowledge and experience with complex business systems integration as well as object-oriented design patterns and development.
Software engineering skills with the Force.com platform (Apex, LWC, SOQL, unit testing).
Experience in core web technologies including HTML5, JavaScript and jQuery.
Demonstrated experience and knowledge of relational databases, data modelling, and ETL tools.
Experience with web services (REST & SOAP, JSON & XML, etc.).
Experience with Agile development methodologies such as Scrum.
Excellent organizational, verbal and written communication skills.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 5 days ago

Apply

4.0 years

0 Lacs

Ahmedabad, Gujarat, India

Remote

Experience: 4+ years
Salary: Confidential (based on experience)
Shift: (GMT+05:30) Asia/Kolkata (IST)
Opportunity Type: Office (Ahmedabad)
Placement Type: Full-time Permanent Position
(Note: This is a requirement for one of Uplers' clients - Attri)

What do you need for this opportunity?
Must-have skills required: Azure, Docker, TensorFlow, Python, Shell Scripting

Attri is looking for:
About Attri
Attri is an AI organization that helps businesses initiate and accelerate their AI efforts. We offer the industry’s first end-to-end enterprise machine learning platform, empowering teams to focus on ML development rather than infrastructure. From ideation to execution, our global team of AI experts supports organizations in building scalable, state-of-the-art ML solutions. Our mission is to redefine businesses by harnessing cutting-edge technology and a unique, value-driven approach. With team members across continents, we celebrate diversity, curiosity, and innovation. We’re now looking for a Senior DevOps Engineer to join our fast-growing, remote-first team. If you're passionate about automation, scalable cloud systems, and supporting high-impact AI workloads, we’d love to connect.

What You'll Do (Responsibilities):
Design, implement, and manage scalable, secure, and high-performance cloud-native infrastructure on Azure.
Build and maintain Infrastructure as Code (IaC) using Terraform or CloudFormation.
Develop event-driven and serverless architectures using AWS Lambda, SQS, and SAM.
Architect and manage containerized applications using Docker, Kubernetes, ECR, ECS, or AKS.
Establish and optimize CI/CD pipelines using GitHub Actions, Jenkins, AWS CodeBuild & CodePipeline.
Set up and manage monitoring, logging, and alerting using Prometheus + Grafana, Datadog, and centralized logging systems.
Collaborate with ML Engineers and Data Engineers to support MLOps pipelines (Airflow, ML pipelines) and Bedrock with TensorFlow or PyTorch.
Implement and optimize ETL/data streaming pipelines using Kafka, EventBridge, and Event Hubs.
Automate operations and system tasks using Python and Bash, along with cloud CLIs and SDKs (see the short sketch after this listing).
Secure infrastructure using IAM/RBAC and follow best practices in secrets management and access control.
Manage DNS and networking configurations using Cloudflare, VPC, and PrivateLink.
Lead architecture implementation for scalable and secure systems, aligning with business and AI solution needs.
Conduct cost optimization through budgeting, alerts, tagging, right-sizing resources, and leveraging spot instances.
Contribute to backend development in Python (web frameworks), REST/Socket and gRPC design, and testing (unit/integration).
Participate in incident response, performance tuning, and continuous system improvement.

Good to Have:
Hands-on experience with ML lifecycle tools like MLflow and Kubeflow.
Previous involvement in production-grade AI/ML projects or data-intensive systems.
Startup or high-growth tech company experience.

Qualifications:
Bachelor’s degree in Computer Science, Information Technology, or a related field.
5+ years of hands-on experience in a DevOps, SRE, or Cloud Infrastructure role.
Proven expertise in multi-cloud environments (AWS, Azure, GCP) and modern DevOps tooling.
Strong communication and collaboration skills to work across engineering, data science, and product teams.
Benefits:
Competitive salary 💸
Support for continual learning (free books and online courses) 📚
Leveling-up opportunities 🌱
Diverse team environment 🌍

How to apply for this opportunity?
Step 1: Click on Apply and register or log in on our portal.
Step 2: Complete the screening form and upload your updated resume.
Step 3: Increase your chances of getting shortlisted and meet the client for the interview!

About Uplers:
Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their career. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal. Depending on the assessments you clear, you can apply for them as well.) So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
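The sketch referenced in the responsibilities above is a small boto3 example of the kind of Python automation a DevOps engineer might write, here summing object sizes under an S3 prefix as an input to a cost report. The bucket and prefix are placeholders, and valid AWS credentials are assumed.

```python
# Sum object sizes under an S3 prefix with boto3 (placeholder bucket/prefix).
import boto3

def prefix_size_bytes(bucket: str, prefix: str) -> int:
    """Return the total size in bytes of objects under s3://bucket/prefix."""
    s3 = boto3.client("s3")
    paginator = s3.get_paginator("list_objects_v2")
    total = 0
    for page in paginator.paginate(Bucket=bucket, Prefix=prefix):
        for obj in page.get("Contents", []):
            total += obj["Size"]
    return total

if __name__ == "__main__":
    print(prefix_size_bytes("example-bucket", "logs/2024/"))  # placeholder values
```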

Posted 5 days ago

Apply

1.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

About The Role
We are looking for a motivated and detail-oriented Business Intelligence (BI) Analyst to join our growing team in Bangalore. In this role, you will work closely with cross-functional teams to transform data into insightful dashboards, KPIs, and reports that drive strategic decisions.

Key Responsibilities
Design, build, and maintain interactive dashboards and reports using tools like Power BI or Tableau.
Work with stakeholders to identify, define, and track KPIs and business metrics.
Write SQL queries to extract and manipulate data from relational databases.
Use Python for data analysis, automation, and ad-hoc reporting scripts, as needed.
Collaborate with data engineering teams to ensure data availability, quality, and consistency.
Analyze large datasets to identify trends, patterns, and actionable insights.
Ensure timely delivery of BI outputs in line with business goals and project timelines.
Maintain documentation for processes, models, and dashboards.
Assist in improving existing dashboards and processes for better scalability and performance.

Required Skills And Qualifications
Bachelor’s degree in Computer Science, Information Systems, Statistics, Mathematics, or a related field.
1+ year of hands-on experience in Business Intelligence or Data Analysis.
Strong proficiency in SQL, with the ability to write advanced queries.
Experience in building dashboards using Power BI, Tableau, or similar BI tools.
Knowledge of Python for data manipulation and automation.
Ability to translate business requirements into effective data visualizations and reports.
Strong understanding of data modeling, ETL processes, and data quality concepts.
Excellent communication, problem-solving, and time management skills.
Ability to work independently and collaboratively in a fast-paced environment.

Nice To Have
Experience working in an Agile or fast-paced startup environment.
Knowledge of cloud platforms like AWS, GCP, or Azure.
Familiarity with version control tools (e.g., Git).

What We Offer
Competitive salary and benefits.
Opportunity to work on high-impact projects.
Supportive and results-driven work culture.
Learning and growth opportunities in a data-centric environment.
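As a rough illustration of the SQL-plus-Python workflow this role describes, the sketch below runs an aggregate query and writes the result to a CSV that a dashboard could ingest. SQLite and the table/column names are stand-ins chosen so the snippet is runnable anywhere.

```python
# Run an aggregate SQL query and export the result for a BI tool.
import sqlite3
import pandas as pd

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE sales (region TEXT, month TEXT, revenue REAL)")
con.executemany("INSERT INTO sales VALUES (?, ?, ?)", [
    ("North", "2024-01", 1200.0), ("North", "2024-02", 900.0),
    ("South", "2024-01", 1500.0), ("South", "2024-02", 1750.0),
])

kpi = pd.read_sql_query(
    "SELECT region, SUM(revenue) AS total_revenue FROM sales GROUP BY region",
    con,
)
kpi.to_csv("regional_revenue.csv", index=False)  # extract a dashboard could consume
print(kpi)
```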

Posted 5 days ago

Apply

5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Python + AWS/DataBricks Developer
📍 Hyderabad (Work from Office)
📅 5+ years of experience | Immediate joiners preferred

🔹 Must-have Skills:
Expert Python programming (3.7+)
Strong AWS (EC2, S3, Lambda, Glue, CloudFormation)
DataBricks platform experience
ETL pipeline development
SQL/NoSQL databases
PySpark/Pandas proficiency

🔹 Good-to-have:
AWS certifications
Terraform knowledge
Airflow experience

Interested candidates can share profiles to shruti.pandey@codeethics.in. Please mention the position you're applying for!
#Hiring #ReactJS #Python #AWS #DataBricks #HyderabadJobs #TechHiring #WFO
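As a hedged sketch of the PySpark ETL work this posting asks for, the snippet below reads a CSV, derives a column, and writes Parquet. The paths, column names, and conversion rate are hypothetical, and on Databricks a `spark` session is normally already provided.

```python
# Small PySpark ETL step: read, clean, derive a column, write Parquet.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

orders = spark.read.csv("s3://example-bucket/raw/orders.csv", header=True, inferSchema=True)

cleaned = (
    orders
    .dropna(subset=["order_id", "amount"])                     # drop incomplete rows
    .withColumn("amount_usd", F.col("amount") * F.lit(0.012))  # hypothetical FX rate
)

cleaned.write.mode("overwrite").parquet("s3://example-bucket/curated/orders/")
```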

Posted 5 days ago

Apply

0.0 - 2.0 years

0 Lacs

Pune, Maharashtra, India

On-site

We are seeking a highly skilled Technical Data Analyst to join our growing team. This role requires a strong technical foundation in Oracle PL/SQL and Python, combined with expertise in data analysis tools and techniques. The ideal candidate will be a strategic thinker with a proven ability to lead and mentor a team of data analysts, driving data-driven insights and contributing to key business decisions. They will also be responsible for researching and evaluating emerging AI tools and techniques for potential application in data analysis projects.

Responsibilities:
Design, develop, and maintain complex Oracle PL/SQL queries and procedures for data extraction, transformation, and loading (ETL) processes.
Utilize Python scripting for data analysis, automation, and reporting.
Perform in-depth data analysis to identify trends, patterns, and anomalies, providing actionable insights to improve business performance (see the short sketch after this listing).
Collaborate with cross-functional teams to understand business requirements and translate them into technical specifications.
Develop and maintain data quality standards and ensure data integrity across various systems.
Leverage data analysis and visualization tools (e.g., Tableau, Power BI, Qlik Sense) to create interactive dashboards and reports for business stakeholders.
Stay up to date with the latest data analysis tools, techniques, and industry best practices, including AI/ML advancements.
Research and evaluate emerging AI/ML tools and techniques for potential application in data analysis projects.

Preferred Qualifications:
Hands-on work experience as a Technical Data Analyst (not a business analyst) with Oracle PL/SQL and Python programming to interpret analytical tasks and analyze large datasets.
Proficiency in Python scripting for data analysis and automation.
Expertise in data visualization tools such as Tableau, Power BI, or Qlik Sense.
Awareness and understanding of AI/ML tools and techniques in data analytics (e.g., machine learning algorithms, natural language processing, predictive modeling).
Practical experience applying AI/ML techniques in data analysis projects is a plus.
Strong analytical, problem-solving, communication, and interpersonal skills.
Experience in the financial services industry.

Qualifications:
0-2 years of relevant experience.
Experience in programming/debugging used in business applications.
Working knowledge of industry practice and standards.
Comprehensive knowledge of the specific business area for application development.
Working knowledge of program languages.
Consistently demonstrates clear and concise written and verbal communication.

Education:
Bachelor’s degree/University degree or equivalent experience.

This job description provides a high-level review of the types of work performed. Other job-related duties may be assigned as required.

Mandatory Skills Required - Ab Initio, Oracle PL/SQL, Unix/Linux. Minimum 2 years of hands-on development experience.
------------------------------------------------------
Job Family Group: Technology
------------------------------------------------------
Job Family: Applications Development
------------------------------------------------------
Time Type: Full time
------------------------------------------------------
Most Relevant Skills: Please see the requirements listed above.
------------------------------------------------------
Other Relevant Skills: For complementary skills, please see above and/or contact the recruiter.
------------------------------------------------------ Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity review Accessibility at Citi. View Citi’s EEO Policy Statement and the Know Your Rights poster.
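The sketch referenced in the responsibilities above is a small pandas example of flagging anomalies in a daily series with a simple z-score; the numbers and the threshold are made up for illustration and are not Citi's methodology.

```python
# Flag outliers in a daily series using a global z-score.
import pandas as pd

daily_volume = pd.Series(
    [980, 1010, 995, 1020, 3050, 1005, 990, 1015],
    index=pd.date_range("2024-03-01", periods=8, freq="D"),
    name="transactions",
)

z_score = (daily_volume - daily_volume.mean()) / daily_volume.std()
anomalies = daily_volume[z_score.abs() > 2]  # flags the 3050 spike

print(anomalies)
```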

Posted 5 days ago

Apply

3.0 years

0 Lacs

Hyderabad, Telangana, India

Remote

At Seismic, we're proud of our engineering culture, where technical excellence and innovation drive everything we do. We're a remote-first data engineering team responsible for the critical data pipeline that powers insights for over 2,300 customers worldwide. Our team manages all data ingestion processes, leveraging technologies like Apache Kafka, Spark, various C# microservices, and a shift-left data mesh architecture to transform diverse data streams into the valuable reporting models that our customers rely on daily to make data-driven decisions. Additionally, we're evolving our analytics platform to include AI-powered agentic workflows.

Who You Are
Have working knowledge of one OO language, preferably C#, but we won’t hold your Java expertise against you (you’re the type of person who’s interested in learning and becoming an expert at new things). Additionally, we’ve been using Python more and more, and bonus points if you’re familiar with Scala.
Have experience with architecturally complex distributed systems.
Highly focused on operational excellence and quality – you have a passion for writing clean and well-tested code and believe in the testing pyramid.
Outstanding verbal and written communication skills, with the ability to work with others at all levels; effective at working with geographically remote and culturally diverse teams.
You enjoy solving challenging problems, all while having a blast with equally passionate team members.
Conversant in AI engineering. You’ve been experimenting with building AI solutions/integrations using LLMs, prompts, Copilots, Agentic ReAct workflows, etc.

At Seismic, we’re committed to providing benefits and perks for the whole self. To explore our benefits available in each country, please visit the Global Benefits page. Please be aware we have noticed an increase in hiring scams potentially targeting Seismic candidates. Read our full statement on our Careers page.

Seismic is the global leader in AI-powered enablement, empowering go-to-market leaders to drive strategic growth and deliver exceptional customer experiences at scale. The Seismic Enablement Cloud™ is the only unified AI-powered platform that prepares customer-facing teams with the skills, content, tools, and insights needed to maximize every buyer interaction and strengthen client relationships. Trusted by more than 2,000 organizations worldwide, Seismic helps businesses achieve measurable outcomes and accelerate revenue growth. Seismic is headquartered in San Diego with offices across North America, Europe, Asia and Australia. Learn more at seismic.com. Seismic is committed to building an inclusive workplace that ignites growth for our employees and creates a culture of belonging that allows all employees to be seen and valued for who they are. Learn more about DEI at Seismic here.

Collaborating with experienced software engineers, data scientists and product managers to rapidly build, test, and deploy code to create innovative solutions and add value to our customers' experience.
Building large-scale platform infrastructure and REST APIs serving machine learning driven content recommendations to Seismic products.
Leveraging the power of context in third-party applications such as CRMs to drive machine learning algorithms and models.
Helping build next-gen agentic tooling for reporting and insights.
Processing large amounts of internal and external system data for analytics, caching, modeling and more.
Identifying performance bottlenecks and implementing solutions for them.
Participating in code reviews, system design reviews, agile ceremonies, bug triage and on-call rotations.

BS or MS in Computer Science, a similar technical field of study, or equivalent practical experience.
3+ years of software development experience within a SaaS business.
Familiarity with .NET Core, C#, and related frameworks.
Experience in data engineering - building and managing data pipelines and ETL processes, and familiarity with various technologies that drive them: Kafka, FiveTran (optional), Spark/Scala (optional), etc.
Data warehouse experience with Snowflake or similar (AWS Redshift, Apache Iceberg, ClickHouse, etc.).
Familiarity with RESTful microservice-based APIs.
Experience with modern CI/CD pipelines and infrastructure (Jenkins, GitHub Actions, Terraform, Kubernetes) or equivalent is a big plus.
Experience with the Scrum and Agile development process.
Familiarity developing in cloud-based environments.
Optional: Experience with 3rd-party integrations.
Optional: Familiarity with meeting systems like Zoom, WebEx, MS Teams.
Optional: Familiarity with CRM systems like Salesforce, Microsoft Dynamics 365, HubSpot.

If you are an individual with a disability and would like to request a reasonable accommodation as part of the application or recruiting process, please click here.

Headquartered in San Diego and with employees across the globe, Seismic is the global leader in sales enablement, backed by firms such as Permira, Ameriprise Financial, EDBI, Lightspeed Venture Partners, and T. Rowe Price. Seismic also expanded its team and product portfolio with the strategic acquisitions of SAVO, Percolate, Grapevine6, and Lessonly. Our board of directors is composed of several industry luminaries, including John Thompson, former Chairman of the Board for Microsoft.

Seismic is an equal opportunity employer and all qualified applicants will receive consideration for employment without regard to gender, age, race, religion, or any other classification which is protected by applicable law.

Please note this job description is not designed to cover or contain a comprehensive listing of activities, duties or responsibilities that are required of the employee for this job. Duties, responsibilities and activities may change at any time with or without notice.
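Since the posting names Kafka-based ingestion and notes the team's growing use of Python, here is a hedged consumer sketch using the kafka-python client. The topic, broker address, event fields, and aggregation are illustrative assumptions, not Seismic's pipeline.

```python
# Minimal Kafka consumer sketch: deserialize JSON events and keep a running count.
import json
from collections import Counter

from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "engagement-events",                 # hypothetical topic name
    bootstrap_servers="localhost:9092",  # placeholder broker
    group_id="reporting-ingest",
    auto_offset_reset="earliest",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
)

events_per_account = Counter()

for message in consumer:
    event = message.value
    account_id = event.get("account_id", "unknown")
    events_per_account[account_id] += 1          # tiny stand-in for a real transform
    print(message.topic, message.offset, dict(events_per_account))
```

Printing every message is only for demonstration; a real ingestion service would batch, transform, and forward events to downstream reporting models.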

Posted 5 days ago

Apply

3.0 years

0 Lacs

Andhra Pradesh, India

On-site

At PwC, our people in business application consulting specialise in consulting services for a variety of business applications, helping clients optimise operational efficiency. These individuals analyse client needs, implement software solutions, and provide training and support for seamless integration and utilisation of business applications, enabling clients to achieve their strategic objectives. In SAP technology at PwC, you will specialise in utilising and managing SAP software and solutions within an organisation. You will be responsible for tasks such as installation, configuration, administration, development, and support of SAP products and technologies.

Driven by curiosity, you are a reliable, contributing member of a team. In our fast-paced environment, you are expected to adapt to working with a variety of clients and team members, each presenting varying challenges and scope. Every experience is an opportunity to learn and grow. You are expected to take ownership and consistently deliver quality work that drives value for our clients and success as a team. As you navigate through the Firm, you build a brand for yourself, opening doors to more opportunities.

Skills
Examples of the skills, knowledge, and experiences you need to lead and deliver value at this level include but are not limited to:
Apply a learning mindset and take ownership for your own development.
Appreciate diverse perspectives, needs, and feelings of others.
Adopt habits to sustain high performance and develop your potential.
Actively listen, ask questions to check understanding, and clearly express ideas.
Seek, reflect, act on, and give feedback.
Gather information from a range of sources to analyse facts and discern patterns.
Commit to understanding how the business works and building commercial awareness.
Learn and apply professional and technical standards (e.g. refer to specific PwC tax and audit guidance), uphold the Firm's code of conduct and independence requirements.

SAP Native HANA Developer - Technical Skills
Bachelor's or Master's degree in a relevant field (e.g., computer science, information systems, engineering).
Minimum of 3 years of experience in HANA native development and configuration, including at least 1 year with SAP BTP Cloud Foundry and HANA Cloud.
Demonstrated experience working with various data sources, both SAP (SAP ECC, SAP CRM, SAP S/4HANA) and non-SAP (Oracle, Salesforce, AWS).
Demonstrated expertise in designing and implementing solutions utilizing the SAP BTP platform.
Solid understanding of BTP HANA Cloud and its service offerings.
Strong focus on building expertise in constructing calculation views within the HANA Cloud environment (BAS) and other supporting data artifacts.
Experience with HANA XS Advanced and HANA 2.0 versions.
Ability to optimize queries and data models for performance in the SAP HANA development environment, with a sound understanding of indexing, partitioning and other performance optimization techniques.
Proven experience in applying SAP HANA Cloud development tools and technologies, including HDI containers, HANA OData services, HANA XSA, strong SQL scripting, SDI/SLT replication, Smart Data Access (SDA) and Cloud Foundry UPS services.
Experience with ETL processes and tools (SAP Data Services preferred).
Ability to debug and optimize existing queries and data models for performance.
Hands-on experience in utilizing Git within Business Application Studio and familiarity with GitHub features and repository management.
Familiarity with reporting tools and security-based concepts within the HANA development environment.
Understanding of the HANA Transport Management System, HANA Transport Container and CI/CD practices for object deployment.
Knowledge of monitoring and troubleshooting techniques for SAP HANA BW environments.
Familiarity with reporting tools like SAC/Power BI for building dashboards and consuming data models is a plus.
HANA CDS views (added advantage): Understanding of associations, aggregations, and annotations in CDS views; ability to design and implement data models using CDS.
Certification in SAP HANA or related areas is a plus.
Functional knowledge of SAP business processes (FI/CO, MM, SD, HR).
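For orientation only, here is a heavily hedged example of consuming a HANA calculation view from Python using SAP's hdbcli DB-API driver; the host, credentials, schema, and view name are placeholders, and the exact container/view naming depends on the HDI setup.

```python
# Query a (hypothetical) calculation view via SAP's hdbcli DB-API driver.
from hdbcli import dbapi

conn = dbapi.connect(
    address="hana-cloud.example.com",  # placeholder host
    port=443,
    user="REPORT_USER",
    password="***",
)
try:
    cur = conn.cursor()
    # calculation views are exposed to SQL consumers much like ordinary views
    cur.execute(
        'SELECT "CompanyCode", SUM("TaxAmount") '
        'FROM "MY_HDI_CONTAINER"."CV_TAX_SUMMARY" '
        'GROUP BY "CompanyCode"'
    )
    for company_code, tax_amount in cur.fetchall():
        print(company_code, tax_amount)
finally:
    conn.close()
```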

Posted 5 days ago

Apply

6.0 - 9.0 years

0 Lacs

Gurugram, Haryana, India

On-site

About Everest Group
Confident decisions driven by deep expertise and tenacious research. Everest Group is a research firm helping business leaders confidently navigate today’s market challenges, driving maximized operational and financial performance and transformative experiences. Our deep expertise and tenacious research focused on technology, business processes, and engineering through the lenses of talent, sustainability, and sourcing delivers precise and action-oriented guidance. For more information, visit www.everestgrp.com.

About the Role
Everest Group is seeking a highly driven and experienced Practice Director (PD) to join our team focused on Data, Automation & AI. This is a strategic and high-impact role, ideal for someone currently working in an analyst role at a leading peer research/advisory firm. The ideal candidate will bring deep expertise in evaluating and advising on technology products, with a specific focus on the Data, Analytics, and Artificial Intelligence domains.

Key Responsibilities
Research Leadership: Lead research efforts in the Data, Analytics, and AI technology space, producing high-quality, insight-driven reports, market assessments, and provider evaluations.
Thought Leadership: Create and publish forward-thinking insights on emerging trends, innovations, and market developments in data platforms, analytics solutions, machine learning operations (MLOps), GenAI infrastructure, and more.
Advisory Engagements: Support client engagements by delivering market insights, competitive benchmarking, and strategic guidance based on proprietary research and market intelligence.
Stakeholder Collaboration: Collaborate with internal teams across geographies, including analysts, marketing, and business development, to drive go-to-market strategies and project delivery.
Client Interactions: Present insights to enterprise clients, technology vendors, and service providers via briefings, webinars, and in-person sessions.
Team Development: Mentor and support junior analysts (SAs and As) and contribute to building knowledge capabilities across the practice.

Required Experience & Skills
Domain Expertise: Deep understanding of and hands-on experience in evaluating Data, Analytics, and AI tools/technologies (e.g., database platforms, data governance platforms, ETL/ELT tools, BI platforms, AI/ML platforms). Must have a strong consulting/advisory research background.
Industry Experience: 6 to 9 years of analyst experience with a demonstrated focus on Data and AI technologies. Candidates with fewer years of experience may be considered for a Senior Analyst (SA) position.
Analytical Skills: Strong ability to analyze market trends, vendor strategies, and enterprise needs to deliver actionable insights.
Communication: Excellent written and verbal communication skills, including experience publishing research and presenting to executive audiences.
Educational Background: A Master's degree from a top university is preferred; a Bachelor's degree is a must.

Preferred Qualifications
Prior experience in primary/secondary research methodologies, market modeling, and competitive landscaping.
Experience working with clients in a consulting, advisory, or research capacity.
Exposure to global markets and understanding of enterprise technology adoption trends.

Everest Group complies with the GDPR, CCPA/CPRA and other data protection regulations.
For more information on how Everest Group processes your personal information, please read our Privacy Notice (www.everestgrp.com/privacy-notice-for-applicants-employees-and-contractors/). By submitting this application, you indicate that you have read and understand our privacy terms and consent to the processing of your personal information by us. To exercise your data subject rights under GDPR, CCPA/CPRA you can fill in our form available at Data Rights – Everest Group (everestgrp.com). You can email your data protection request to privacy@everestgrp.com. Everest Group is an equal opportunity employer. We have a culture of inclusion, and we provide equal opportunities for all applicants and employees, including those with disabilities. We are committed to providing an environment that is free of all discrimination and harassment and to treating all individuals with respect.

Posted 6 days ago

Apply

3.0 years

0 Lacs

Pune, Maharashtra, India

Remote

Your Future Evolves Here Evolent Health has a bold mission to change the health of the nation by changing the way health care is delivered. Our pursuit of this mission is the driving force that brings us to work each day. We believe in embracing new ideas, challenging ourselves and failing forward. We respect and celebrate individual talents and team wins. We have fun while working hard and Evolenteers often make a difference working in everything from scrubs to jeans. Are we growing? Absolutely, and globally. In 2021 we grew our teams by almost 50% and continue to grow even more in 2022. Are we recognized as a company that supports your career and growth, and a great place to work? Definitely. Evolent Health International (Pune, India) was certified as a “Great Place to Work” in 2021. In 2020 and 2021, Evolent in the U.S. was both named to the Best Companies for Women to Advance list by Parity.org and earned a perfect score on the Human Rights Campaign (HRC) Foundation’s Corporate Equality Index (CEI). This index is the nation's foremost benchmarking survey and report measuring corporate policies and practices related to LGBTQ+ workplace equality. We recognize employees that live our values, give back to our communities each year, and are champions for bringing our whole selves to work each day. If you’re looking for a place where your work can be personally and professionally rewarding, don’t just join a company with a mission. Join a mission with a company behind it. What You’ll Be Doing: Job Summary Design and develop BI reporting and data platforms. Lead the development of user-facing data visualization and presentation tools, including Microsoft SQL Server Reporting Services (SSRS) reports, Power BI dashboards, MicroStrategy, and Excel PivotTables. Work on the development of data retrieval and data management for Evolent Health. Responsible for ensuring that the data assets of the organization are aligned with its strategic goals; the architecture should cover databases, data integration, and the means to get to the data. Help implement effective business analytics practices to enhance decision-making, efficiency, and performance. Assist with technology improvements to ensure continuous enhancements of the core BI platform. Data Analysis: Perform complex data analysis using advanced SQL skills and Excel to support internal/external clients’ data requests and ad-hoc queries for business continuity and analytics. Communicate with non-technical business users to gather specific requirements for reports and BI solutions. Provide maintenance support for existing BI applications and reports. Present work when requested and participate in knowledge-sharing sessions with team members. Required Qualifications 3-5 years of experience in the BI/Data Warehouse domain developing BI solutions and performing data analysis tasks using MSBI suites. Strong proficiency in Power BI: building reports, dashboards, DAX, and Power Query (M). Experience with Microsoft Fabric, including Lakehouse, Dataflows Gen2, Direct Lake capabilities, and Power Automate. Experience with Azure Data Services: Azure Data Factory, Azure Synapse, Azure Data Lake, or similar. Hands-on experience with SQL Server Reporting Services (SSRS) and SQL Server Integration Services (SSIS). Knowledge of advanced SQL for data manipulation and performance tuning. Experience implementing ETL/ELT pipelines. Ability to work with both relational and cloud-based data sources.
Preferred Qualifications Healthcare industry experience with exposure to authorizations/claims/eligibility and patient clinical data. Experience with Python, Spark, or Databricks for data engineering or transformation. Familiarity with DevOps/Git repositories for BI, including deployment automation and CI/CD in Azure DevOps. Understanding of data governance, security models, and compliance. Experience with semantic modeling in Power BI and/or tabular models using Analysis Services. Exposure to AI and machine learning integrations within Microsoft Fabric or Azure. Experience with Power Apps and Microsoft Purview. Mandatory Requirements: Employees must have a high-speed broadband internet connection with a minimum speed of 50 Mbps and the ability to set up a wired connection to their home network to ensure effective remote work. These requirements may be updated as needed by the business. Evolent Health is an equal opportunity employer and considers all qualified applicants equally without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran status, or disability status.
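For context on the kind of ad-hoc "advanced SQL plus PivotTable-style" analysis this role describes, here is a minimal, hedged Python sketch. The connection string, the table ("claims"), and the columns ("region", "service_month") are hypothetical placeholders, not an actual Evolent schema.

```python
# Minimal sketch: pull aggregated data with SQL and reshape it the way an
# Excel PivotTable would. All names below are illustrative assumptions.
import pandas as pd
from sqlalchemy import create_engine

engine = create_engine(
    "mssql+pyodbc://user:password@bi-server/analytics"
    "?driver=ODBC+Driver+17+for+SQL+Server"
)

query = """
    SELECT region, service_month, COUNT(*) AS claim_count
    FROM claims
    GROUP BY region, service_month
"""
df = pd.read_sql(query, engine)

# One row per region, one column per month -- a PivotTable-style view.
pivot = df.pivot_table(
    index="region", columns="service_month",
    values="claim_count", aggfunc="sum", fill_value=0,
)
print(pivot)
```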

Posted 6 days ago

Apply

5.0 - 12.0 years

0 Lacs

Coimbatore, Tamil Nadu, India

On-site

Data Software Engineer Location: Chennai and Coimbatore | Mode: Hybrid | Interview: Walk-in 5-12 years of experience in Big Data and related data technologies. Expert-level understanding of distributed computing principles. Expert-level knowledge of and experience with Apache Spark. Hands-on programming with Python. Proficiency with Hadoop v2, MapReduce, HDFS, and Sqoop. Experience building stream-processing systems using technologies such as Apache Storm or Spark Streaming. Good understanding of Big Data querying tools such as Hive and Impala. Experience integrating data from multiple sources such as RDBMS (SQL Server, Oracle), ERP systems, and files. Good understanding of SQL queries, joins, stored procedures, and relational schemas. Experience with NoSQL databases such as HBase, Cassandra, and MongoDB. Knowledge of ETL techniques and frameworks. Performance tuning of Spark jobs. Experience with Azure Databricks. Ability to lead a team efficiently. Experience designing and implementing Big Data solutions. Practitioner of Agile methodology.
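As an illustration of the batch ETL pattern this posting lists (RDBMS sources, Spark transformations, tuned jobs), here is a minimal PySpark sketch. The JDBC URL, table, columns, and output path are hypothetical, and the JDBC driver is assumed to be available on the cluster.

```python
# Minimal PySpark sketch: read from an RDBMS over JDBC, transform, and write
# partitioned Parquet. Names and paths are illustrative assumptions only.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

orders = (
    spark.read.format("jdbc")
    .option("url", "jdbc:sqlserver://erp-host:1433;databaseName=sales")
    .option("dbtable", "dbo.orders")
    .option("user", "etl_user")
    .option("password", "***")
    .load()
)

# Transform: derive the order month and aggregate revenue per customer/month.
monthly = (
    orders
    .withColumn("order_month", F.date_trunc("month", F.col("order_date")))
    .groupBy("customer_id", "order_month")
    .agg(F.sum("amount").alias("revenue"))
)

# Load: partitioned Parquet in a curated zone, ready for Hive/Impala queries.
(monthly.write.mode("overwrite")
        .partitionBy("order_month")
        .parquet("/data/curated/monthly_revenue"))
```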

Posted 6 days ago

Apply

2.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

About Marriott: Marriott Tech Accelerator is part of Marriott International, a global leader in hospitality. Marriott International, Inc. is a leading American multinational company that operates a vast array of lodging brands, including hotels and residential properties. It consists of over 30 well-known brands and nearly 8,900 properties situated in 141 countries and territories. Role Title: Security Data Engineer Position Summary: Marriott International’s Global Information Security is seeking a Data Engineer who can build and maintain the infrastructure and systems that collect, process, and store large amounts of security data for Marriott to use for security-related analysis and decision-making. Job Responsibilities: Implement and maintain scalable data pipelines using tools such as Cribl Stream and Splunk. Develop and maintain ETL (Extract, Transform, Load) processes. Ensure data quality and implement validation checks. Automate data workflows and processes. Work with distributed computing frameworks (e.g., Hadoop, Spark). Implement solutions for processing large-scale datasets. Utilize cloud platforms (AWS, Azure) for data management. Optimize data retrieval and query performance. Build integrations with various data sources. Ensure compatibility between different systems and platforms. Implement data security controls and access management. Maintain data integrity and reliability. Work closely with security data scientists, analysts, and business stakeholders. Translate business requirements into technical specifications. Monitor and troubleshoot data system performance. Implement optimizations for efficiency and scalability. Ensure high availability of data resources. Skill and Experience: 2-4 years of data engineering, data analytics, data management, and/or information security experience that includes: 2+ years of experience in data engineering and/or data analytics in an enterprise environment. 1+ years of experience in information protection/information security. Strong background in statistics, mathematics, and software engineering. Proficiency in Python, R, Java, or Scala. Strong knowledge of SQL. Expertise in relational databases (e.g., MySQL, PostgreSQL). Experience with NoSQL databases (e.g., MongoDB, Cassandra). Familiarity with cloud platforms (AWS, Azure, GCP) and big data frameworks such as Hadoop, Spark, and Kafka. Experience with ETL (Extract, Transform, Load) processes. Proficiency in data pipeline development and optimization. Knowledge of cybersecurity principles, tools, and best practices. Preferred: Programming languages: Python, R, SQL. Big data technologies: Hadoop, Spark, and Kafka. Cloud platforms: AWS, Azure, GCP. Relevant certifications such as AWS Certified Data Analytics – Specialty, Google Cloud Professional Data Engineer, or IBM Certified Data Engineer. Experience with Security Information and Event Management (SIEM) systems such as Splunk. Experience with data pipeline management and data transformation tools such as Cribl. Familiarity with MLOps practices. Understanding of machine learning algorithms and AI applications in data engineering. Verbal and written communication skills to articulate complex technical concepts to both technical and non-technical stakeholders. Experience working in Agile and Scrum methodologies. Education and Certifications: Bachelor’s degree in Computer/Data Science, Information Management, Cybersecurity, or a related field, or equivalent experience/certification. Work location: Hyderabad, India. Work mode: Hybrid
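To illustrate the "data quality and validation checks" responsibility in a security data pipeline, here is a small, hedged Python sketch. The field names and rules are assumptions for illustration; a real pipeline (for example, one fed by Cribl Stream or Splunk ingestion) would encode its own schema and rules.

```python
# Hedged sketch: flag event records with missing or implausible fields before
# loading them. Field names ("event_time", "src_ip", "user") are hypothetical.
from datetime import datetime, timezone
import ipaddress

REQUIRED_FIELDS = {"event_time", "src_ip", "user"}

def validate_event(event: dict) -> list:
    """Return a list of validation errors; an empty list means the record passes."""
    errors = [f"missing field: {f}" for f in sorted(REQUIRED_FIELDS - event.keys())]

    if "src_ip" in event:
        try:
            ipaddress.ip_address(event["src_ip"])
        except ValueError:
            errors.append(f"invalid src_ip: {event['src_ip']}")

    if "event_time" in event:
        try:
            ts = datetime.fromisoformat(event["event_time"])
            # Only compare timezone-aware timestamps against "now" in UTC.
            if ts.tzinfo is not None and ts > datetime.now(timezone.utc):
                errors.append("event_time is in the future")
        except ValueError:
            errors.append(f"unparseable event_time: {event['event_time']}")

    return errors

# Example record; failing records could be routed to a quarantine index instead.
record = {"event_time": "2024-05-01T12:00:00+00:00", "src_ip": "10.1.2.3", "user": "alice"}
print(validate_event(record))  # -> []
```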

Posted 6 days ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Company Description Experian is a global data and technology company, powering opportunities for people and businesses around the world. We help to redefine lending practices, uncover and prevent fraud, simplify healthcare, and create marketing solutions, all using our unique combination of data, analytics and software. We also assist millions of people to realise their financial goals and help them save time and money. We operate across a range of markets, from financial services to healthcare, automotive, agribusiness, insurance, and many more industry segments. We invest in people and new advanced technologies to unlock the power of data. As a FTSE 100 Index company listed on the London Stock Exchange (EXPN), we have a team of 22,500 people across 32 countries. Our corporate headquarters are in Dublin, Ireland. Learn more at experianplc.com. Job Description We are looking for a Senior Software Engineer to join our Ascend Cloud Foundation Platform team. Background We unlock the power of data to create opportunities for consumers, businesses and society. At life’s big moments – from buying a home or car, to sending a child to university, to growing a business exponentially by connecting it with new customers – we empower consumers and our clients to manage their data with confidence so they can maximize every opportunity. We require a senior software engineer in Hyderabad, India to work alongside our UK colleagues to deliver business outcomes for the UK&I region. You will join an established agile technical team, where you will work with the Lead Engineer and Product Owner to help develop consumer data attributes and work with data analytics to validate the accuracy of the calculations, whilst ensuring that you work to the highest technical standards. Key Responsibilities Design, develop, and maintain scalable and efficient data pipelines and ETL processes to extract, transform, and load data from various sources into our data lake or warehouse. Collaborate with cross-functional teams including data scientists, analysts, and software engineers to understand data requirements, define data models, and implement solutions that meet business needs. Ensure the security, integrity, and quality of data throughout the data lifecycle, implementing best practices for data governance, encryption, and access control. Develop and maintain data infrastructure components such as data warehouses, data lakes, and data processing frameworks, leveraging cloud services (e.g., AWS, Azure, GCP) and containerization technologies (e.g., Docker, Kubernetes). Implement monitoring, logging, and alerting mechanisms to ensure the reliability and availability of data pipelines and systems, and to proactively identify and address issues. Work closely with stakeholders to understand business requirements, prioritize tasks, and deliver solutions in a timely manner within an Agile working environment. Collaborate with the risk, security and compliance teams to ensure adherence to regulatory requirements (e.g., GDPR, PCI DSS) and industry standards related to data privacy and security. Stay updated on emerging technologies, tools, and best practices in the field of data engineering, and propose innovative solutions to improve efficiency, performance, and scalability. Mentor and coach junior engineers, fostering a culture of continuous learning and professional development within the team.
Participate in code reviews, design discussions, and other Agile ceremonies to promote collaboration, transparency, and continuous improvement. Qualifications Qualified to Degree, HND or HNC standard in a software engineering and/or data engineering discipline, or able to demonstrate commercial experience. Required Skills/Experience Experience of the full development lifecycle Strong communication skills with the ability to explain solutions to technical and non-technical audiences Write clean, scalable and re-usable code that implements SOLID principles, common design patterns where applicable and adheres to published coding standards Excellent attention to detail, ability to analyse, investigate and compare large data sets when required. 3 or more years of programming using Scala 2 or more years of programming using Python Some experience of using Terraform to provision and deploy cloud services and components Experience of developing on Apache Spark Experience of developing with AWS cloud services including (but not limited to) AWS Glue, S3, Step Functions, Lambdas, EventBridge and SQS BDD / TDD experience Jenkins CI/CD experience Application Lifecycle Management Tools - BitBucket & Jira Performing Pull Request reviews Understanding of Agile methodologies Automated Testing Tools Advantageous Experience Mentoring or coaching junior engineers Cloud Solution Architecture Document databases Relational Databases Experience with container technologies (e.g. Kubernetes) Would Consider Alternative Skills And Experience Java (rather than Scala) Google Cloud or Microsoft Azure (rather than AWS) Azure Pipelines or TeamCity (rather than Jenkins) GitHub (rather than BitBucket) Azure DevOps (rather than Jira) CloudFormation (rather than Terraform) Additional Information Our uniqueness is that we celebrate yours. Experian's culture and people are important differentiators. We take our people agenda very seriously and focus on what matters: DEI, work/life balance, development, authenticity, collaboration, wellness, reward & recognition, volunteering... the list goes on. Experian's people-first approach is award-winning: World's Best Workplaces™ 2024 (Fortune Global Top 25), Great Place To Work™ in 24 countries, and Glassdoor Best Places to Work 2024, to name a few. Check out Experian Life on social or our Careers Site and Glassdoor to understand why. Experian is proud to be an Equal Opportunity and Affirmative Action employer. Innovation is a critical part of Experian's DNA and practices, and our diverse workforce drives our success. Everyone can succeed at Experian and bring their whole self to work, irrespective of their gender, ethnicity, religion, color, sexuality, physical ability or age. If you have a disability or special need that requires accommodation, please let us know at the earliest opportunity. Benefits Experian cares for employees' work-life balance, health, safety and wellbeing. In support of this endeavor, we offer best-in-class family well-being benefits, enhanced medical benefits and paid time off. Experian Careers - Creating a better tomorrow together Find out what it's like to work for Experian by clicking here
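Relating to the Spark-on-AWS skills listed above and the consumer data attributes mentioned in the role, here is a hedged PySpark sketch of deriving one such attribute from Parquet data in S3. Bucket paths, column names, and the attribute definition are invented for illustration, and S3 access is assumed to be configured on the cluster (for example, when running as an AWS Glue or EMR job).

```python
# Hedged sketch: derive a hypothetical consumer attribute from S3 Parquet data.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("consumer-attributes").getOrCreate()

accounts = spark.read.parquet("s3://example-data-lake/curated/accounts/")

# Attribute: number of accounts each consumer opened in the last 12 months.
recent = accounts.filter(
    F.col("opened_date") >= F.add_months(F.current_date(), -12)
)
attribute = recent.groupBy("consumer_id").agg(
    F.count("*").alias("accounts_opened_last_12m")
)

attribute.write.mode("overwrite").parquet(
    "s3://example-data-lake/attributes/accounts_opened_last_12m/"
)
```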

Posted 6 days ago

Apply

3.0 years

0 Lacs

Gurgaon, Haryana, India

On-site

Company Description At Nielsen, we are passionate about our work to power a better media future for all people by providing powerful insights that drive client decisions and deliver extraordinary results. Our talented, global workforce is dedicated to capturing audience engagement with content - wherever and whenever it’s consumed. Together, we are proudly rooted in our deep legacy as we stand at the forefront of the media revolution. When you join Nielsen, you will join a dynamic team committed to excellence, perseverance, and the ambition to make an impact together. We champion you, because when you succeed, we do too. We enable your best to power our future. Job Description This role is part of a team that develops software to process data captured every day from over a quarter of a million computer and mobile devices worldwide, measuring panelists’ activities as they surf the internet via browsers or use mobile apps downloaded from Apple’s and Google’s stores. The Nielsen software meter used to capture this usage data has been optimized to be unobtrusive yet gather many biometric data points that the backend system can use to identify who is using the device and also detect fraudulent behavior. The Software Engineer is ultimately responsible for delivering technical solutions, from project onboarding through post-launch support, including design, development, and testing. You will be expected to coordinate with, support, and work alongside multiple distributed project teams in multiple regions. As a member of the technical staff on our Digital Meter Processing team, you will further develop the backend system that processes massive amounts of data every day across 3 different AWS regions. Your role will involve designing, implementing, and maintaining robust, scalable solutions that leverage a Java-based system running in an AWS environment. You will play a key role in shaping the technical direction of our projects and mentoring other team members. Responsibilities System Deployment: Conceive, design and build new features in the existing backend processing pipelines. CI/CD Implementation: Design and implement CI/CD pipelines for automated build, test, and deployment processes. Ensure continuous integration and delivery of features, improvements, and bug fixes. Code Quality and Best Practices: Enforce coding standards, best practices, and design principles. Conduct code reviews and provide constructive feedback to maintain high code quality. Performance Optimization: Identify and address performance bottlenecks in reading, processing, and writing data to the backend data stores. Mentorship and Collaboration: Mentor junior engineers, providing guidance on technical aspects and best practices. Collaborate with cross-functional teams to ensure a cohesive and unified approach to software development. Security and Compliance: Implement security best practices for all tiers of the system. Ensure compliance with industry standards and regulations related to AWS platform security. Key Skills Bachelor's or Master’s degree in Computer Science, Software Engineering, or a related field. Proven experience (minimum 3 years) in high-volume data processing development using ETL tools such as AWS Glue or PySpark, along with Java, SQL, and databases such as Postgres. Minimum 2 years of development on the AWS platform. Strong understanding of CI/CD principles and tools; GitLab is a plus. Excellent problem-solving and debugging skills.
Strong communication and collaboration skills, with the ability to communicate complex technical concepts and align the organization on decisions. Sound problem-solving skills with the ability to quickly process complex information and present it clearly and simply. Uses team collaboration to create innovative solutions efficiently. Other Desirable Skills Knowledge of networking principles and security best practices. AWS certifications. Experience with Data Warehouses, ETL, and/or Data Lakes is very desirable. Experience with Redshift, Airflow, Python, Lambda, Prometheus, Grafana, and OpsGenie is a bonus. Exposure to the Google Cloud Platform (GCP). Additional Information Please be aware that job-seekers may be at risk of targeting by scammers seeking personal data or money. Nielsen recruiters will only contact you through official job boards, LinkedIn, or email with a nielsen.com domain. Be cautious of any outreach claiming to be from Nielsen via other messaging platforms or personal email addresses. Always verify that email communications come from an @nielsen.com address. If you're unsure about the authenticity of a job offer or communication, please contact Nielsen directly through our official website or verified social media channels.
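Since Airflow is listed as a desirable skill for orchestrating this kind of daily processing, here is a minimal, hypothetical DAG sketch. The task names and callables are placeholders for illustration, not Nielsen's actual pipeline.

```python
# Hypothetical Airflow DAG: orchestrate a daily "ingest -> process -> load"
# pipeline. All task bodies are stubs standing in for real jobs.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def ingest():
    # e.g. pull the day's raw meter files from object storage
    pass

def process():
    # e.g. trigger the PySpark/Glue transformation job
    pass

def load_postgres():
    # e.g. upsert daily aggregates into the Postgres reporting store
    pass

with DAG(
    dag_id="daily_meter_processing",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    t_ingest = PythonOperator(task_id="ingest", python_callable=ingest)
    t_process = PythonOperator(task_id="process", python_callable=process)
    t_load = PythonOperator(task_id="load_postgres", python_callable=load_postgres)

    t_ingest >> t_process >> t_load
```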

Posted 6 days ago

Apply

0 years

0 Lacs

North Delhi, Delhi, India

On-site

Job Title: Business Analyst / Data Analyst Location: Lucknow / Delhi Department: Data Analytics About Innefu Labs Pvt Ltd: Innefu Labs is a leading cybersecurity and data analytics company, providing innovative solutions to government and corporate clients. Our mission is to leverage advanced technology to solve complex challenges and deliver value-driven results. Join our dynamic team and be a part of the future of cybersecurity and analytics. Job Summary: We are looking for a passionate and detail-oriented Data Analyst to join our data analytics team. The ideal candidate will have strong analytical skills and the ability to collect, organize, analyze, and disseminate significant amounts of information with attention to detail and accuracy. Key Responsibilities: Collect and analyze large datasets to derive actionable insights. Develop and implement data collection systems and other strategies that optimize statistical efficiency and data quality. Identify, analyze, and interpret trends or patterns in complex data sets. Filter and clean data by reviewing reports and performance indicators to locate and correct code problems. Work with management to prioritize business and information needs. Locate and define new process improvement opportunities. Create detailed reports and presentations to communicate findings to stakeholders. Collaborate with other teams to integrate and apply data analysis to ongoing projects. Qualifications: Bachelor’s degree in Mathematics, Economics, Computer Science, Information Management, or Statistics. Proven working experience as a Data Analyst or Business Data Analyst. Technical expertise regarding data models, database design and development, data mining, and segmentation techniques. Strong knowledge of and experience with reporting packages (Business Objects, etc.), databases (SQL, etc.), and programming (XML, JavaScript, or ETL frameworks). Knowledge of statistics and experience using statistical packages for analyzing datasets (Excel, SPSS, SAS, etc.). Strong analytical skills with the ability to collect, organize, analyze, and disseminate significant amounts of information with attention to detail and accuracy. Adept at queries, report writing, and presenting findings. Excellent communication and collaboration skills. Preferred Skills: Experience in the cybersecurity industry. Familiarity with machine learning techniques and algorithms. Proficiency in Python/R for data analysis.
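As a small illustration of the collect-clean-analyze workflow described above, here is a hedged pandas sketch. The file name and the columns ("incident_date", "category") are hypothetical.

```python
# Minimal pandas sketch: load a dataset, fix obvious data-quality problems,
# and surface a monthly trend. All names are illustrative assumptions.
import pandas as pd

df = pd.read_csv("incidents.csv", parse_dates=["incident_date"])

# Filter and clean: drop exact duplicates, rows missing key fields, and
# normalise inconsistent category labels.
df = df.drop_duplicates()
df = df.dropna(subset=["incident_date", "category"])
df["category"] = df["category"].str.strip().str.lower()

# Identify trends: incidents per month for each category.
monthly = (
    df.set_index("incident_date")
      .groupby("category")
      .resample("MS")
      .size()
      .rename("incident_count")
      .reset_index()
)
print(monthly.head())
```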

Posted 6 days ago

Apply

0 years

0 Lacs

Mumbai Metropolitan Region

On-site

Job Description Are You Ready to Make It Happen at Mondelēz International? Join our Mission to Lead the Future of Snacking. Make It With Pride. Together with analytics team leaders you will support our business with excellent data models to uncover trends that can drive long-term business results. How You Will Contribute You will: Work in close partnership with the business leadership team to execute the analytics agenda Identify and incubate best-in-class external partners to drive delivery on strategic projects Develop custom models/algorithms to uncover signals/patterns and trends to drive long-term business performance Execute the business analytics program agenda using a methodical approach that conveys to stakeholders what business analytics will deliver What You Will Bring A desire to drive your future and accelerate your career and the following experience and knowledge: Using data analysis to make recommendations to senior leaders Technical experience in roles in best-in-class analytics practices Experience deploying new analytical approaches in a complex and highly matrixed organization Savvy in usage of the analytics techniques to create business impacts Are You Ready to Make It Happen at Mondelēz International? Join our Mission to Lead the Future of Snacking. Make It With Pride. Together with various commercial teams and leaders you will support our business with Data & Analytics capabilities at scale to help the company make faster and smarter business decisions. How You Will Contribute You will be: Developing foundational analytics capabilities across pricing, promotions, trade investment, in store execution, call planning—enabling more informed, strategic choices by customer and channel. Maintaining and operationalizing data assets, ensuring data integrity, and supporting the connection of multiple commercial datasets (e.g., sales, pricing, promotional history) into usable insights. Enabling repeatable processes and standard reports, which are essential for scaling analytics across markets and functions. What You Will Bring A desire to drive your future and accelerate your career and the following experience and knowledge: Using data and analytics to bring forth capabilities to enable analysis and decision making in the broader organization Technical experience to develop and guide partners to deliver best-in-class solutions Experience deploying new analytical approaches in a complex and highly matrixed organization More About The Organization Data & Analytics (D&A) is a critical enabler for growth across the markets, by leading the Data & Analytics strategic vision & roadmap, building momentum by rallying the rest of the organization, implementing data & analytics identified priorities at scale across AMEA to deliver strong business value across all levels of organization at right cost structure, facilitating and conceptualizing adoption plans as well as continuous improvement plans. The D&A organization has a set of central teams; Data Management, Analytics Products, Center of Enablement and Data Science resources, which should be maximized at the service of the AMEA business priorities – this requires strong collaboration and influencing skills to drive adoption, relevancy, and business impact with speed. This role sits in the regional D&A structure that brings together these practices to deliver value based on an opportunity that is identified in the region. 
We partner with commercial functions (Consumer-Marketing, Customer-Sales, Supply Chain and Finance) on the D&A agenda for each function. This includes building capability within the teams: leading, coaching, guiding, and inspiring direct and indirect internal and external teams, as well as multifunctional teams in the region and business units, in order to build a robust ecosystem that can deliver and embed analytics in the identified business processes. What you need to know about this position: A key success driver in this role is to leverage foundational analytics capabilities across pricing, promotions, and trade investment to enable more informed, strategic decisions by customer and channel. The role will involve prototyping and scaling analytics solutions to meet unmet business needs within AMEA and potentially across global markets through collaboration with internal teams or external partners. The role holder is expected to evolve into a regional expert in commercial data and the market landscape, enabling consultation on the deployment of globally scalable solutions. This includes understanding business needs, translating stakeholder requirements, and ensuring alignment with regional priorities. The analyst will drive end-to-end analytics solutions—from data ingestion and processing to visualization and insights—by conducting workshops and business interviews. This includes defining and operationalizing advanced methodologies and translating data assets (e.g., sales, pricing, promotional history) into actionable insights. The role will also focus on designing and maintaining scalable data products—dashboards, automated reports, and self-serve analytics tools—that deliver business value and empower commercial users to make data-informed decisions with speed and accuracy. Through consultation and training, the analyst will help raise data coverage, usage, accessibility, and literacy across the organization, embedding a data-driven culture within commercial teams. The role will also involve coaching and nurturing an agile and collaborative D&A team, fostering innovation and continuous improvement in analytical approaches. What extra ingredients you will bring: Job specific requirements: Data & Analytics Skills Working Knowledge: Analytics (diagnostic, descriptive, predictive and prescriptive) techniques. Practitioner: Data management, for example data integration (ETL) or metadata. Awareness: Data architecture, for example the difference between a data warehouse, data lake or data hub. Working Knowledge: Data modelling, for the creation of the right reusable data assets. Awareness: Data governance, for example MDM, data quality and data stewardship practices. Working Knowledge: Statistical skills, for example understanding the difference between correlation and causation. Working Knowledge: Business data, for example Customer POS, Media GRP, Shopper Panels, Shipments, Geo-Location, Trade Spend, etc. Working Knowledge: UX/design, for example by creating products and visualizations that are easy to work with and support the activities required by the end users. Technology Skills Working Knowledge: Programming languages like SQL, Python or R. Practitioner: Analytics and Business Intelligence tools like Microsoft Power BI or Tableau.
Soft skills to be successful in the role (not a pre-requisite): Leadership with high level of self-initiative and drive, for example leading the discussions on D&A agenda in the region and building a combined vision across multiple stakeholders Communication, for example conveying information to diverse audiences in a way that is easily understood and actionable. Facilitation and conflict resolution, for example hosting sessions to elicit ideas from others, understand their issues and encourage group participation. Creative thinking and being comfortable with unknown or unchartered territories, for example framing new concepts for business teams and brainstorming with business users about future product and services. Teamwork & Collaboration, for example working with both business domain teams as well as D&A teams and stakeholders. Storytelling & influencing, for example by creating a consistent, clear storyline for better understanding and/or to assert ideas and persuading others to gain support across an organization or to adopt new behaviors. Domain skills would be an advantage (not a pre-requisite): Commercial/Sales Acumen, for example understanding business concepts, practices and business domain (RTM/RGM) language to engage in problem solving sessions and discuss business issues in stakeholder language. Project management capabilities to manage a workplan, for example understanding of project management concepts to organize their own work and the ability to collaborate with project managers to align business expectations with the D&A team delivery capabilities. Vendor negotiation and effort estimation skills, for example to manage the right partner skills at right cost based on the complexity and importance of the initiatives to be delivered or supported Education / Certifications: Computer Science graduate with preferred master’s in information management/data science/applied Analytics. Within Country Relocation support available and for candidates voluntarily moving internationally some minimal support is offered through our Volunteer International Transfer Policy Business Unit Summary Headquartered in Singapore, Mondelēz International’s Asia, Middle East and Africa (AMEA) region is comprised of six business units, has more than 21,000 employees and operates in more than 27 countries including Australia, China, Indonesia, Ghana, India, Japan, Malaysia, New Zealand, Nigeria, Philippines, Saudi Arabia, South Africa, Thailand, United Arab Emirates and Vietnam. Seventy-six nationalities work across a network of more than 35 manufacturing plants, three global research and development technical centers and in offices stretching from Auckland, New Zealand to Casablanca, Morocco. Mondelēz International in the AMEA region is the proud maker of global and local iconic brands such as Oreo and belVita biscuits, Kinh Do mooncakes, Cadbury, Cadbury Dairy Milk and Milka chocolate, Halls candy, Stride gum, Tang powdered beverage and Philadelphia cheese. We are also proud to be named a Top Employer in many of our markets. Mondelēz International is an equal opportunity employer and all qualified applicants will receive consideration for employment without regard to race, color, religion, gender, sexual orientation or preference, gender identity, national origin, disability status, protected veteran status, or any other characteristic protected by law. Job Type Regular Analytics & Modelling Analytics & Data Science

Posted 6 days ago

Apply

4.0 years

0 Lacs

Ahmedabad, Gujarat, India

On-site

Senior DevOps Engineer Experience: 4-7 Years | Salary: Competitive Preferred Notice Period: Within 30 Days Opportunity Type: Onsite (Ahmedabad) Placement Type: Permanent (Note: This is a requirement for one of Uplers' clients.) Must-have skills: Azure OR Docker, TensorFlow, Python OR Shell Scripting Attri (one of Uplers' clients) is looking for a Senior DevOps Engineer who is passionate about their work, eager to learn and grow, and committed to delivering exceptional results. If you are a team player with a positive attitude and a desire to make a difference, then we want to hear from you. What You'll Do (Responsibilities): Design, implement, and manage scalable, secure, and high-performance cloud-native infrastructure across Azure. Build and maintain Infrastructure as Code (IaC) using Terraform or CloudFormation. Develop event-driven and serverless architectures using AWS Lambda, SQS, and SAM. Architect and manage containerized applications using Docker, Kubernetes, ECR, ECS, or AKS. Establish and optimize CI/CD pipelines using GitHub Actions, Jenkins, AWS CodeBuild & CodePipeline. Set up and manage monitoring, logging, and alerting using Prometheus + Grafana, Datadog, and centralized logging systems. Collaborate with ML Engineers and Data Engineers to support MLOps pipelines (Airflow, ML Pipelines) and Bedrock with TensorFlow or PyTorch. Implement and optimize ETL/data streaming pipelines using Kafka, EventBridge, and Event Hubs. Automate operations and system tasks using Python and Bash, along with cloud CLIs and SDKs. Secure infrastructure using IAM/RBAC and follow best practices in secrets management and access control. Manage DNS and networking configurations using Cloudflare, VPC, and PrivateLink. Lead architecture implementation for scalable and secure systems, aligning with business and AI solution needs. Conduct cost optimization through budgeting, alerts, tagging, right-sizing resources, and leveraging spot instances. Contribute to backend development in Python (web frameworks), REST/Socket and gRPC design, and testing (unit/integration). Participate in incident response, performance tuning, and continuous system improvement. Good to Have: Hands-on experience with ML lifecycle tools like MLflow and Kubeflow. Previous involvement in production-grade AI/ML projects or data-intensive systems. Startup or high-growth tech company experience. Qualifications: Bachelor’s degree in Computer Science, Information Technology, or a related field. 5+ years of hands-on experience in a DevOps, SRE, or Cloud Infrastructure role. Proven expertise in multi-cloud environments (AWS, Azure, GCP) and modern DevOps tooling. Strong communication and collaboration skills to work across engineering, data science, and product teams. How to apply for this opportunity: Easy 3-step process: 1. Click on Apply and register or log in on our portal. 2. Upload your updated resume and complete the screening form. 3. Increase your chances of getting shortlisted and meeting the client for the interview! About Uplers: Our goal is to make hiring and getting hired reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant product and engineering job opportunities and progress in their careers. (Note: There are many more opportunities apart from this on the portal.) So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
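To illustrate the "automate with Python and cloud SDKs" and cost-tagging responsibilities above, here is a hedged boto3 sketch that lists running EC2 instances missing a cost-allocation tag. The tag key ("cost-center") and region are assumptions; the same idea applies to Azure resources via the Azure SDK.

```python
# Hedged sketch: audit running EC2 instances for a missing cost-allocation tag.
# Tag key and region are illustrative assumptions only.
import boto3

ec2 = boto3.client("ec2", region_name="ap-south-1")
paginator = ec2.get_paginator("describe_instances")

untagged = []
for page in paginator.paginate(
    Filters=[{"Name": "instance-state-name", "Values": ["running"]}]
):
    for reservation in page["Reservations"]:
        for instance in reservation["Instances"]:
            tags = {t["Key"]: t["Value"] for t in instance.get("Tags", [])}
            if "cost-center" not in tags:
                untagged.append(instance["InstanceId"])

print(f"Running instances without a cost-center tag: {untagged}")
```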

Posted 6 days ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies