
8820 Hadoop Jobs - Page 50

Set up a job alert
JobPe aggregates results for easy application access, but you actually apply on the job portal directly.

0 years

0 Lacs

Greater Chennai Area

On-site

We're growing our Ford GBS team in our Chennai office and are looking for a Senior Data Analyst.

SUMMARY: OneMagnify is a one-of-a-kind combination of technology, creative, strategy, data, analytics, and a wide range of marketing services that cut across multiple practice and solution areas to deliver a whole that is truly greater than the sum of its parts. Today's sophisticated business and marketing landscape requires a fresh, vigilant, and more connected approach to human communications – and we get it. We're always looking for diverse, passionate, and skilled candidates to join our award-winning work environments. Our offices have been rated Top, Best, and Coolest places to work by numerous sources for multiple years in a row dating back to 2010. In fact, we were recently rated one of the Best and Brightest companies to work for in the nation. Apply today. Show us what you can do. You might be exactly who we need.

ABOUT YOU: You have a passion for data, keen attention to detail, and an analytical mindset. You are a problem-solver who enjoys working collaboratively, using data insight to drive business solutions.

What You'll Do: Work under the guidance of a Team Lead and/or Ford supervisors to support client objectives and project goals by developing data-driven, insightful reports, visualizations, and dashboards, and by communicating results within project lifecycle guidelines, using appropriate programming languages and visualization tools.

What You'll Need: The successful candidate must be familiar with solving complex data and analytics issues through data manipulation, analysis, presentation, and reporting, and will be responsible for multitasking between ad hoc and project-based deliverables. A Bachelor's or Master's degree in a technical field (computer science, information systems, mathematics, statistics) is required. Knowledge of SQL, Alteryx, and QlikView is required. Familiarity with other related tools such as Qlik Sense, Hadoop, Teradata, Python, SAS, R, or other dashboarding software (Tableau, Spotfire, etc.) will be considered.

Benefits: We offer a comprehensive benefits package including medical insurance, PF, gratuity, paid holidays, and more.

About Us: Whether it's awareness, advocacy, engagement, or efficacy, we move brands forward with work that connects with audiences and delivers results. Through meaningful analytics, engaging communications, and innovative technology solutions, we help clients tackle their most ambitious projects and overcome their biggest challenges. We are an equal opportunity employer. We believe that innovative ideas and solutions start with unique perspectives. That's why we're committed to providing every employee a workplace that's free of discrimination and intolerance. We're proud to be an equal opportunity employer and actively search for like-minded people to join our team.
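As an illustrative aside (not part of the posting): the kind of ad hoc aggregation work this analyst role describes, before results land in a QlikView or Tableau dashboard, might be sketched in plain Python with hypothetical sample data:

```python
from collections import defaultdict

# Hypothetical sample of the kind of tabular data an analyst might pull
# from SQL/Teradata before building a dashboard view.
rows = [
    {"region": "North", "channel": "Online", "sales": 1200.0},
    {"region": "North", "channel": "Store", "sales": 800.0},
    {"region": "South", "channel": "Online", "sales": 950.0},
    {"region": "South", "channel": "Store", "sales": 1050.0},
]

def summarize_by(rows, key):
    """Aggregate total sales per distinct value of `key`."""
    totals = defaultdict(float)
    for row in rows:
        totals[row[key]] += row["sales"]
    return dict(totals)

by_region = summarize_by(rows, "region")
print(by_region)  # {'North': 2000.0, 'South': 2000.0}
```

The same pattern generalizes to any grouping column, which is essentially what a GROUP BY in SQL or an Alteryx summarize tool performs.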

Posted 2 weeks ago

Apply

7.0 years

0 Lacs

Ahmedabad, Gujarat, India

On-site

Job Description Building off our Cloud momentum, Oracle has formed a new organization - Health Data Intelligence. This team will focus on product development and product strategy for Oracle Health, while building out a complete platform supporting modernized, automated healthcare. This is a net new line of business, constructed with an entrepreneurial spirit that promotes an energetic and creative environment. We are unencumbered and will need your contribution to make it a world class engineering center with the focus on excellence. Oracle Health Data Analytics has a rare opportunity to play a critical role in how Oracle Health products impact and disrupt the healthcare industry by transforming how healthcare and technology intersect. Career Level - IC4 Responsibilities As a member of the software engineering division, you will take an active role in the definition and evolution of standard practices and procedures. Define specifications for significant new projects and specify, design and develop software according to those specifications. You will perform professional software development tasks associated with the developing, designing and debugging of software applications or operating systems. Design and build distributed, scalable, and fault-tolerant software systems. Build cloud services on top of the modern OCI infrastructure. Participate in the entire software lifecycle, from design to development, to quality assurance, and to production. Invest in the best engineering and operational practices upfront to ensure our software quality bar is high. Optimize data processing pipelines for orders of magnitude higher throughput and faster latencies. Leverage a plethora of internal tooling at OCI to develop, build, deploy, and troubleshoot software. Qualifications 7+ years of experience in the software industry working on design, development and delivery of highly scalable products and services. 
Understanding of the entire product development lifecycle, including understanding and refining the technical specifications, HLD and LLD of world-class products and services; refining the architecture by providing feedback and suggestions; developing and reviewing code; driving DevOps; and managing releases and operations. Strong knowledge of Java or JVM-based languages. Experience with multi-threading and parallel processing. Strong knowledge of big data technologies like Spark, Hadoop MapReduce, Crunch, etc. Past experience building scalable, performant, and secure services/modules. Understanding of microservices architecture and API design. Experience with container platforms. Good understanding of testing methodologies. Experience with CI/CD technologies. Experience with observability tools like Splunk, New Relic, etc. Good understanding of versioning tools like Git/SVN. Career Level - IC4

About Us: As a world leader in cloud solutions, Oracle uses tomorrow's technology to tackle today's challenges. We've partnered with industry leaders in almost every sector, and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That's why we're committed to growing an inclusive workforce that promotes opportunities for all. Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs. We're committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States.
Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans’ status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.
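As an aside on the "Spark, Hadoop MapReduce" qualification above: the map/reduce model those frameworks implement can be illustrated in plain Python (a toy sketch of the programming model, not how the frameworks are actually invoked):

```python
from collections import defaultdict
from itertools import chain

def map_phase(lines):
    # Map: emit a (word, 1) pair for every word in every input line.
    return chain.from_iterable(
        ((w.lower(), 1) for w in line.split()) for line in lines
    )

def reduce_phase(pairs):
    # Reduce: sum the counts for each key (word), as the shuffle+reduce
    # stage of a MapReduce job would do per key.
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

lines = ["big data big pipelines", "data pipelines at scale"]
counts = reduce_phase(map_phase(lines))
print(counts["big"], counts["data"], counts["pipelines"])  # 2 2 2
```

In Spark or Hadoop the same logic is distributed across partitions; the per-key reduction is what makes the computation parallelizable.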

Posted 2 weeks ago

Apply

5.0 - 9.0 years

15 - 25 Lacs

Bengaluru

Work from Office

Primary Skill Set: Data Engineering, Python, PySpark, Cloud (AWS/GCP), Scala. Primary Skills: Snowflake, Cloud (AWS, GCP), Scala, Python, Spark, Big Data, and SQL.

Qualification: Bachelor's or Master's degree.

Job Responsibility: Strong development experience in Snowflake, Cloud (AWS, GCP), Scala, Python, Spark, Big Data, and SQL. Work closely with stakeholders, including product managers and designers, to align technical solutions with business goals. Maintain code quality through reviews and make architectural decisions that impact scalability and performance. Perform root cause analysis for any critical defects; address technical challenges, optimize workflows, and resolve issues efficiently. Expert in Agile and Waterfall program/project implementation. Manage strategic and tactical relationships with program stakeholders. Successfully execute projects within strict deadlines while managing intense pressure. Good understanding of the SDLC (Software Development Life Cycle). Identify potential technical risks and implement mitigation strategies. Excellent verbal, written, and interpersonal communication abilities, coupled with strong problem-solving, facilitation, and analytical skills. Cloud management activities: a good understanding of cloud architecture/containerization and application management on AWS and Kubernetes.

Posted 2 weeks ago

Apply

7.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job Description Building off our Cloud momentum, Oracle has formed a new organization - Health Data Intelligence. This team will focus on product development and product strategy for Oracle Health, while building out a complete platform supporting modernized, automated healthcare. This is a net new line of business, constructed with an entrepreneurial spirit that promotes an energetic and creative environment. We are unencumbered and will need your contribution to make it a world class engineering center with the focus on excellence. Oracle Health Data Analytics has a rare opportunity to play a critical role in how Oracle Health products impact and disrupt the healthcare industry by transforming how healthcare and technology intersect. Career Level - IC4 Responsibilities As a member of the software engineering division, you will take an active role in the definition and evolution of standard practices and procedures. Define specifications for significant new projects and specify, design and develop software according to those specifications. You will perform professional software development tasks associated with the developing, designing and debugging of software applications or operating systems. Design and build distributed, scalable, and fault-tolerant software systems. Build cloud services on top of the modern OCI infrastructure. Participate in the entire software lifecycle, from design to development, to quality assurance, and to production. Invest in the best engineering and operational practices upfront to ensure our software quality bar is high. Optimize data processing pipelines for orders of magnitude higher throughput and faster latencies. Leverage a plethora of internal tooling at OCI to develop, build, deploy, and troubleshoot software. Qualifications 7+ years of experience in the software industry working on design, development and delivery of highly scalable products and services. 
Understanding of the entire product development lifecycle, including understanding and refining the technical specifications, HLD and LLD of world-class products and services; refining the architecture by providing feedback and suggestions; developing and reviewing code; driving DevOps; and managing releases and operations. Strong knowledge of Java or JVM-based languages. Experience with multi-threading and parallel processing. Strong knowledge of big data technologies like Spark, Hadoop MapReduce, Crunch, etc. Past experience building scalable, performant, and secure services/modules. Understanding of microservices architecture and API design. Experience with container platforms. Good understanding of testing methodologies. Experience with CI/CD technologies. Experience with observability tools like Splunk, New Relic, etc. Good understanding of versioning tools like Git/SVN. Career Level - IC4

About Us: As a world leader in cloud solutions, Oracle uses tomorrow's technology to tackle today's challenges. We've partnered with industry leaders in almost every sector, and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That's why we're committed to growing an inclusive workforce that promotes opportunities for all. Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs. We're committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States.
Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans’ status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.

Posted 2 weeks ago

Apply

3.0 - 5.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

About Lowe’s Lowe’s is a FORTUNE® 100 home improvement company serving approximately 16 million customer transactions a week in the United States. With total fiscal year 2024 sales of more than $83 billion, Lowe’s operates over 1,700 home improvement stores and employs approximately 300,000 associates. Based in Mooresville, N.C., Lowe’s supports the communities it serves through programs focused on creating safe, affordable housing, improving community spaces, helping to develop the next generation of skilled trade experts and providing disaster relief to communities in need. For more information, visit Lowes.com. Lowe’s India, the Global Capability Center of Lowe’s Companies Inc., is a hub for driving our technology, business, analytics, and shared services strategy. Based in Bengaluru with over 4,500 associates, it powers innovations across omnichannel retail, AI/ML, enterprise architecture, supply chain, and customer experience. From supporting and launching homegrown solutions to fostering innovation through its Catalyze platform, Lowe’s India plays a pivotal role in transforming home improvement retail while upholding strong commitment to social impact and sustainability. For more information, visit Lowes India About Team The Customer Insights team uses data and analytics capabilities to provide strategic insights around customer and competitive landscape to help drive effective market strategy for Lowe’s. The team works closely with various key functions across the business to provide insights and recommendations across different business areas. The team is responsible to track, report and analyze various customer and market related metrics and generate actionable insights on key customer segments, customer experience vis-à-vis Lowe’s and our market position along with opportunities to help inform an effective Go To Market Strategy and win market share across key markets, customer segments and product categories. 
Job Summary The primary responsibility is to report, analyze and provide insights on Customer Experience at Lowe’s across selling channels, customer segments and product categories. The individual will apply analytical methods to combine internal and external data and analyze trends in Customer Experience Metrics and the factors that play a key role driving improvement in those metrics. This position is responsible for following best practices in turning business questions into data analysis, analyzing results and identifying insights for decision making; determine additional research/analytics that may be needed to enhance the solution; and coordinate with cross-functional teams to ensure project continuity and strategic alignment. The individual will also have to proactively take up initiatives to apply modern tools and techniques to improve efficiency, accuracy and overall quality of insights offered to stakeholders. Roles & Responsibilities Core Responsibilities: Analyze Customer feedback, LTR, NPS data to understand in-market trends and where to focus to improve Customer experience. Work with our US (Mooresville) team to assist them in defining various reporting / analysis needs and building appropriate methodologies to provide actionable insights on Experience Metrics. Identifying the appropriate univariate and multivariate analysis to identify key customer trends and insights – Segmentation, Bayesian Networks, Factor analysis etc. Synthesize disparate sources of data—primary and secondary to develop cohesive stories, trends and insights that will drive strategic decision making across the enterprise. Leverage available information across workstreams to help connect dots and provide holistic insights on Customer Experience trends and Strategy. 
Work with the data operations teams to enhance data capabilities and develop tools to improve ease of access to data and BI for the broader organization.

Years of Experience: 3-5 years' hands-on experience with Customer Experience Analytics / Customer Analytics / Customer Insights.

Education Qualification & Certifications (optional): Required minimum qualification: Master's degree in Economics / Statistics / Analytics or MBA in Marketing.

Primary Skills (must have): Hands-on experience in SQL, Teradata, Hadoop, Python. Hands-on analytics experience building statistical/mathematical models and multivariate analyses such as Segmentation, Logistic Regression, Bayesian Networks, Factor Analysis, Conjoint Analysis, etc. Ability to apply analytical tools and techniques to extract insights from structured and unstructured data. Consulting skills: ability to impact business decisions through analytics and research. Hands-on experience creating executive-level, audience-ready presentations to tell impactful stories. Excellent communication skills to connect with people from diverse backgrounds and experience.

Secondary Skills (desired): Experience working with text data would be an advantage. Experience working with Customer Experience or Voice of Customer metrics is good to have. Familiarity with the retail industry and key concepts.

Lowe's is an equal opportunity employer and administers all personnel practices without regard to race, color, religious creed, sex, gender, age, ancestry, national origin, mental or physical disability or medical condition, sexual orientation, gender identity or expression, marital status, military or veteran status, genetic information, or any other category protected under federal, state, or local law. Starting rate of pay may vary based on factors including, but not limited to, position offered, location, education, training, and/or experience.
For information regarding our benefit programs and eligibility, please visit https://talent.lowes.com/us/en/benefits.
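As an aside on the NPS metric this role analyzes: Net Promoter Score is computed from 0-10 likelihood-to-recommend ratings as the percentage of promoters (scores 9-10) minus the percentage of detractors (scores 0-6). A minimal sketch with hypothetical survey data:

```python
def nps(scores):
    """Net Promoter Score from 0-10 likelihood-to-recommend ratings:
    % promoters (9-10) minus % detractors (0-6), in percentage points."""
    if not scores:
        raise ValueError("no responses")
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100.0 * (promoters - detractors) / len(scores)

# Hypothetical batch of survey responses.
sample = [10, 9, 9, 8, 7, 6, 10, 3, 9, 8]
print(nps(sample))  # 30.0
```

Passives (7-8) count toward the denominator but neither add nor subtract, which is why NPS can range from -100 to +100.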

Posted 2 weeks ago

Apply

7.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

About the Role: We are seeking a highly skilled and experienced Data Analyst with a strong background in the Banking, Financial Services, and Insurance (BFSI) domain. The ideal candidate will be proficient in Python, SQL, and SAS and possess a keen analytical mindset to derive insights from complex data sets and support data-driven decision-making. Location: Chennai, Hyderabad, Mumbai, Delhi, Pune.

Key Responsibilities: Perform data extraction, transformation, and analysis using SQL, Python, and SAS. Conduct exploratory and statistical analysis to identify trends, patterns, and actionable insights. Work closely with business and technical stakeholders to understand data requirements and translate them into effective solutions. Build and maintain dashboards, reports, and data models to support business operations and strategic initiatives. Collaborate with cross-functional teams in an Agile environment to deliver data solutions aligned with organizational goals. Ensure data quality, accuracy, and consistency across systems and reports. Provide domain-specific insights related to BFSI operations, products, and customer behavior.

Mandatory Skills & Experience: Minimum 7 years of experience as a Data Analyst. Proficient in Python, SQL, and SAS; hands-on experience is a must. Strong experience working in the BFSI (Banking, Financial Services, and Insurance) domain. Excellent understanding of data analysis, data modeling, and reporting techniques. Ability to handle large datasets, perform data wrangling, and build meaningful visualizations. Experience working in Agile teams using tools such as JIRA and Confluence is a plus. Strong communication and stakeholder management skills.

Preferred Qualifications: Bachelor's or Master's degree in Computer Science, Statistics, Mathematics, or a related field. Knowledge of tools like Power BI, Tableau, or other BI tools is an advantage. Exposure to GCP, Hadoop, or cloud data platforms is a plus.
About the company Capco, a Wipro company, is a global technology and management consultancy specializing in driving digital transformation in the financial services industry. With a growing client portfolio comprising of over 100 global organizations, Capco operates at the intersection of business and technology by combining innovative thinking with unrivalled industry knowledge to deliver end-to-end data-driven solutions and fast-track digital initiatives for banking and payments, capital markets, wealth and asset management, insurance, and the energy sector. Capco’s cutting-edge ingenuity is brought to life through its Innovation Labs and award-winning Be Yourself At Work culture and diverse talent. Our approach is tailor-made to fit with each client’s problem with an emphasis on building long-term strategic partnerships that foster collaboration and trust. We have the people, the vision, and the passion. Capco is committed to providing clients with practical solutions. We offer a globally integrated service with offices in leading financial centers across the Americas, Europe and Asia Pacific.
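As an aside on the "data quality, accuracy, and consistency" responsibility above: a common first step in BFSI data wrangling is validating and typing raw records before any analysis. A minimal stdlib-only sketch with hypothetical transaction data:

```python
from datetime import datetime

# Hypothetical raw transaction records, as an analyst might receive
# them before loading into a reporting model.
raw = [
    {"txn_id": "T1", "amount": "150.25", "date": "2024-03-01"},
    {"txn_id": "T2", "amount": "bad",    "date": "2024-03-02"},
    {"txn_id": "T3", "amount": "-20.00", "date": "2024-13-01"},
]

def clean(records):
    """Split records into typed valid rows and rejected IDs: a basic
    data-quality gate before downstream analysis or reporting."""
    valid, rejects = [], []
    for r in records:
        try:
            amount = float(r["amount"])
            date = datetime.strptime(r["date"], "%Y-%m-%d").date()
        except ValueError:
            # Unparseable amount or impossible date (e.g. month 13).
            rejects.append(r["txn_id"])
            continue
        valid.append({"txn_id": r["txn_id"], "amount": amount, "date": date})
    return valid, rejects

valid, rejects = clean(raw)
print([v["txn_id"] for v in valid], rejects)  # ['T1'] ['T2', 'T3']
```

Keeping a reject list, rather than silently dropping bad rows, is what makes reconciliation between systems and reports possible.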

Posted 2 weeks ago

Apply

3.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job Description: Oracle Cloud Infrastructure (OCI) is a strategic growth area for Oracle. It is a comprehensive cloud service offering in the enterprise software industry, spanning Infrastructure as a Service (IaaS), Platform as a Service (PaaS), and Software as a Service (SaaS). OCI is currently building a future-ready Gen2 cloud Data Science service platform. At the core of this platform lies the Cloud AI Cloud Service.

What OCI AI Cloud Services are: a set of services on the public cloud, powered by ML and AI, that meet enterprise modernization needs and work out of the box. These services and models can be easily specialized for specific customers/domains by building on existing OCI services. Key points: enables customers to add AI capabilities to their apps and workflows easily via APIs or containers; usable without needing to build AI expertise in-house; covers key gaps in Decision Support and NLP for public clouds and enterprise, in NLU, NLP, Vision, and Conversational AI.

Your Opportunity: As we innovate to provide a single collaborative ML environment for data-science professionals, we will be extremely happy to have you join us and shape the very future of our Machine Learning platform by building an AI Cloud service. We are addressing exciting challenges at the intersection of artificial intelligence and innovative cloud infrastructure. We are building cloud services in Computer Vision for image/video and document analysis, Decision Support (anomaly detection, time series forecasting, fraud detection, content moderation, risk prevention, predictive analytics), Natural Language Processing (NLP), and Speech that work out of the box for enterprises. Our product vision includes the ability for enterprises to customize the services for their business and train them to specialize in their data by creating micro models that enhance the global AI models.
What You'll Do: Develop scalable infrastructure, including microservices and a backend, that automates training, deployment, and optimization of ML model inference. Build a core of AI services such as Vision, Speech, Language, Decision, and others. Brainstorm and design various POCs using AI services for new or existing enterprise problems. Collaborate with fellow data scientists/SW engineers to build out other parts of the infrastructure, effectively communicating your needs, understanding theirs, and addressing external and internal stakeholder product challenges. Lead research and development efforts to explore new tools, frameworks, and methodologies to improve backend development processes. Experiment with ML models in Python/C++ using machine learning libraries (PyTorch, ONNX, TensorRT, Triton, TensorFlow, JAX), etc. Leverage cloud technology: Oracle Cloud (OCI), AWS, GCP, Azure, or similar.

Qualifications: Master's degree or equivalent experience (preferred) in computer science, statistics or mathematics, artificial intelligence, machine learning, computer vision, operations research, or a related technical field. 3+ years for PhD or equivalent experience, 5+ years for Master's, or demonstrated ability designing, implementing, and deploying machine learning models in production environments. Practical experience in design, implementation, and production deployment of distributed systems using microservices architecture and APIs, using common frameworks like Spring Boot (Java). Practical experience working in a cloud environment (Oracle Cloud (OCI), AWS, GCP, Azure) and with containerization (Docker, Kubernetes). Working knowledge of current techniques, approaches, and inference optimization strategies in machine learning models. Experience with performance tuning, scalability, and load balancing techniques. Expert in at least one high-level language such as Java/C++ (Java preferred).
Expert in at least one scripting language such as Python, JavaScript, or Shell. Deep understanding of data structures and algorithms, and excellent problem-solving skills. Experience with, or willingness to learn and work in, Agile and iterative development and DevOps processes. Strong drive to learn and master new technologies and techniques. You enjoy a fast-paced work environment.

Additional Preferred Qualifications: Experience with Cloud Native framework tools and products is a plus. Experience in computer vision tasks like image classification, object detection, segmentation, text detection and recognition, information extraction from documents, etc. An impressive set of GitHub projects or contributions to open-source technologies is a plus. Hands-on experience with horizontally scalable data stores such as Hadoop and other NoSQL technologies like Cassandra is a plus.

Our vision is to provide an immersive AI experience on Oracle Cloud. Aggressive as it might sound, our growth journey is fueled by highly energetic, technology-savvy engineers like YOU who are looking to grow with us to meet the demands of building a powerful next-generation platform. Are you ready to do something big? Career Level - IC3

About Us: As a world leader in cloud solutions, Oracle uses tomorrow's technology to tackle today's challenges. We've partnered with industry leaders in almost every sector, and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That's why we're committed to growing an inclusive workforce that promotes opportunities for all. Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs.
We’re committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States. Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans’ status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.
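As an aside on the model-inference work this posting describes: the final step of a classification service typically converts model logits into probabilities via softmax and returns the top label with its confidence. A toy, framework-free sketch (the real services would use PyTorch/ONNX/Triton; the labels here are hypothetical):

```python
import math

def softmax(logits):
    """Numerically stable softmax: subtract the max logit before
    exponentiating so large logits cannot overflow."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def classify(logits, labels):
    """Toy inference handler: map raw model logits to (label, confidence)."""
    probs = softmax(logits)
    best = max(range(len(probs)), key=probs.__getitem__)
    return labels[best], probs[best]

label, conf = classify([2.0, 1.0, 0.1], ["cat", "dog", "document"])
print(label)  # cat
```

Inference optimization (TensorRT, ONNX Runtime, batching) changes how the logits are produced, but this probability-and-argmax step at the service boundary stays the same.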

Posted 2 weeks ago

Apply

7.0 years

0 Lacs

Trivandrum, Kerala, India

On-site

Job Description Building off our Cloud momentum, Oracle has formed a new organization - Health Data Intelligence. This team will focus on product development and product strategy for Oracle Health, while building out a complete platform supporting modernized, automated healthcare. This is a net new line of business, constructed with an entrepreneurial spirit that promotes an energetic and creative environment. We are unencumbered and will need your contribution to make it a world class engineering center with the focus on excellence. Oracle Health Data Analytics has a rare opportunity to play a critical role in how Oracle Health products impact and disrupt the healthcare industry by transforming how healthcare and technology intersect. Career Level - IC4 Responsibilities As a member of the software engineering division, you will take an active role in the definition and evolution of standard practices and procedures. Define specifications for significant new projects and specify, design and develop software according to those specifications. You will perform professional software development tasks associated with the developing, designing and debugging of software applications or operating systems. Design and build distributed, scalable, and fault-tolerant software systems. Build cloud services on top of the modern OCI infrastructure. Participate in the entire software lifecycle, from design to development, to quality assurance, and to production. Invest in the best engineering and operational practices upfront to ensure our software quality bar is high. Optimize data processing pipelines for orders of magnitude higher throughput and faster latencies. Leverage a plethora of internal tooling at OCI to develop, build, deploy, and troubleshoot software. Qualifications 7+ years of experience in the software industry working on design, development and delivery of highly scalable products and services. 
Understanding of the entire product development lifecycle, including understanding and refining the technical specifications, HLD and LLD of world-class products and services; refining the architecture by providing feedback and suggestions; developing and reviewing code; driving DevOps; and managing releases and operations. Strong knowledge of Java or JVM-based languages. Experience with multi-threading and parallel processing. Strong knowledge of big data technologies like Spark, Hadoop MapReduce, Crunch, etc. Past experience building scalable, performant, and secure services/modules. Understanding of microservices architecture and API design. Experience with container platforms. Good understanding of testing methodologies. Experience with CI/CD technologies. Experience with observability tools like Splunk, New Relic, etc. Good understanding of versioning tools like Git/SVN. Career Level - IC4

About Us: As a world leader in cloud solutions, Oracle uses tomorrow's technology to tackle today's challenges. We've partnered with industry leaders in almost every sector, and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That's why we're committed to growing an inclusive workforce that promotes opportunities for all. Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs. We're committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States.
Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans’ status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.
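The multi-threading and parallel-processing experience called for above can be illustrated with a small sketch. This is not Oracle code; it is a generic, hypothetical example of fanning I/O-bound work out across a thread pool in Python (the function names are made up for illustration):

```python
from concurrent.futures import ThreadPoolExecutor

def fetch_record(record_id: int) -> int:
    # Hypothetical I/O-bound task; a real service would call a database or API.
    return record_id * record_id

def process_batch(record_ids):
    # Fan the batch out across worker threads and collect results in input order.
    with ThreadPoolExecutor(max_workers=4) as pool:
        return list(pool.map(fetch_record, record_ids))

print(process_batch(range(5)))  # [0, 1, 4, 9, 16]
```

For CPU-bound work, a `ProcessPoolExecutor` would be the usual substitute, since threads in CPython share one interpreter lock.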

Posted 2 weeks ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Eviden, part of the Atos Group, with an annual revenue of circa €5 billion, is a global leader in data-driven, trusted, and sustainable digital transformation. As a next-generation digital business with worldwide leading positions in digital, cloud, data, advanced computing, and security, it brings deep expertise to all industries in more than 47 countries. By uniting unique high-end technologies across the full digital continuum with 47,000 world-class talents, Eviden expands the possibilities of data and technology, now and for generations to come.
Role Overview
The Senior Tech Lead - AWS Data Engineering leads the design, development, and optimization of data solutions on the AWS platform. The jobholder has a strong background in data engineering, cloud architecture, and team leadership, with a proven ability to deliver scalable and secure data systems.
Responsibilities
Lead the design and implementation of AWS-based data architectures and pipelines.
Architect and optimize data solutions using AWS services such as S3, Redshift, Glue, EMR, and Lambda.
Provide technical leadership and mentorship to a team of data engineers.
Collaborate with stakeholders to define project requirements and ensure alignment with business goals.
Ensure best practices in data security, governance, and compliance.
Troubleshoot and resolve complex technical issues in AWS data environments.
Stay updated on the latest AWS technologies and industry trends.
Key Technical Skills & Responsibilities
Overall 10+ years of experience in IT.
Minimum 5-7 years in the design and development of cloud data platforms using AWS services.
Must have experience designing and developing data lake / data warehouse / data analytics solutions using AWS services such as S3, Lake Formation, Glue, Athena, EMR, Lambda, and Redshift.
Must be familiar with AWS access control and data security features such as VPC, IAM, Security Groups, and KMS.
Must be proficient in Python and PySpark for data pipeline building.
Must have data modeling experience, including S3 data organization.
Must understand Hadoop components, NoSQL, graph, and time-series databases, and the AWS services available for those technologies.
Must have experience working with structured, semi-structured, and unstructured data.
Must have experience with streaming data collection and processing; Kafka experience is preferred.
Experience migrating data warehouse / big data applications to AWS is preferred.
Must be able to use Gen AI services (like Amazon Q) for productivity gains.
Eligibility Criteria
Bachelor’s degree in Computer Science, Data Engineering, or a related field.
Extensive experience with AWS data services and tools.
AWS certification (e.g., AWS Certified Data Analytics - Specialty).
Experience with machine learning and AI integration in AWS environments.
Strong understanding of data modeling, ETL/ELT processes, and cloud integration.
Proven leadership experience in managing technical teams.
Excellent problem-solving and communication skills.
Our Offering
Global cutting-edge IT projects that shape the future of digital and have a positive impact on the environment.
Wellbeing programs and work-life balance, with integration and passion-sharing events.
Attractive salary and company initiative benefits.
Courses and conferences.
Hybrid work culture.
Let’s grow together.
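The S3 data-organization requirement above usually means laying object keys out in Hive-style partitions so that Glue and Athena can prune partitions instead of scanning a whole table. A minimal sketch of that layout (the table, column, and file names are made up for illustration):

```python
from datetime import date

def partition_key(table: str, event_date: date, region: str, filename: str) -> str:
    # Hive-style partitioning: table/col=value/.../file under an S3 prefix
    # lets query engines skip partitions whose key values don't match a filter.
    return (f"{table}/event_date={event_date.isoformat()}/"
            f"region={region}/{filename}")

print(partition_key("orders", date(2024, 5, 1), "eu", "part-0000.parquet"))
# orders/event_date=2024-05-01/region=eu/part-0000.parquet
```

Choosing partition columns that match the most common query filters (typically date, then a low-cardinality dimension) is the main design decision here.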

Posted 2 weeks ago

Apply

7.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Job Description
Building on our Cloud momentum, Oracle has formed a new organization: Health Data Intelligence. This team focuses on product development and product strategy for Oracle Health while building out a complete platform supporting modernized, automated healthcare. This is a net-new line of business, built with an entrepreneurial spirit that promotes an energetic and creative environment. We are unencumbered and need your contribution to make it a world-class engineering center with a focus on excellence. Oracle Health Data Analytics has a rare opportunity to play a critical role in how Oracle Health products impact and disrupt the healthcare industry by transforming how healthcare and technology intersect.
Career Level - IC4
Responsibilities
As a member of the software engineering division, you will take an active role in the definition and evolution of standard practices and procedures. You will define specifications for significant new projects and specify, design, and develop software according to those specifications, performing professional software development tasks associated with developing, designing, and debugging software applications or operating systems.
Design and build distributed, scalable, and fault-tolerant software systems.
Build cloud services on top of modern OCI infrastructure.
Participate in the entire software lifecycle: design, development, quality assurance, and production.
Invest in the best engineering and operational practices upfront to keep our software quality bar high.
Optimize data processing pipelines for orders-of-magnitude higher throughput and lower latencies.
Leverage the extensive internal tooling at OCI to develop, build, deploy, and troubleshoot software.
Qualifications
7+ years of experience in the software industry working on the design, development, and delivery of highly scalable products and services.
Understanding of the entire product development lifecycle, including understanding and refining technical specifications, the HLD and LLD of world-class products and services, refining the architecture by providing feedback and suggestions, developing and reviewing code, driving DevOps, and managing releases and operations.
Strong knowledge of Java or JVM-based languages.
Experience with multi-threading and parallel processing.
Strong knowledge of big data technologies such as Spark, Hadoop MapReduce, and Crunch.
Experience building scalable, performant, and secure services/modules.
Understanding of microservices architecture and API design.
Experience with container platforms.
Good understanding of testing methodologies.
Experience with CI/CD technologies.
Experience with observability tools such as Splunk and New Relic.
Good understanding of version control tools such as Git/SVN.
Career Level - IC4
About Us
As a world leader in cloud solutions, Oracle uses tomorrow’s technology to tackle today’s challenges. We’ve partnered with industry leaders in almost every sector, and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That’s why we’re committed to growing an inclusive workforce that promotes opportunities for all. Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs. We’re committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States.
Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans’ status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.

Posted 2 weeks ago

Apply

10.0 years

0 Lacs

Bengaluru, Karnataka, India

Remote

About Tide
At Tide, we are building a business management platform designed to save small businesses time and money. We provide our members with business accounts and related banking services, as well as a comprehensive set of connected administrative solutions, from invoicing to accounting. Launched in 2017, Tide is now used by over 1 million small businesses across the world and is available to UK, Indian and German SMEs. Headquartered in central London, with offices in Sofia, Hyderabad, Delhi, Berlin and Belgrade, Tide employs over 2,000 people. Tide is rapidly growing, expanding into new products and markets and always looking for passionate and driven people. Join us in our mission to empower small businesses and help them save time and money.
About The Role
As Staff Data Scientist for the business area, you will work closely with the Business Team, Product Managers, the Data Governance team, Analysts, Scientists and Data Engineers to deliver Company, Business and Product OKRs (Objectives and Key Results). You will also look into data initiatives that drive efficiency in the end-to-end (e2e) process, from data ingestion to insight generation, including data science / machine learning models for decisioning. This role is an excellent opportunity for anyone interested in helping build and embed data initiatives into several products in a rapidly scaling environment. You will be able to influence our roadmap, learn about best practices and quickly see the impact of your work.
As a Staff Data Scientist You’ll:
Develop and plan the roadmap for our domain analytics and data engineering & science team.
Run scrum ceremonies with our Product/Business team.
Triage requests, create the work breakdown structure and assign it to the respective Engineers/Scientists.
Work with Engineers, Scientists and the governance team to identify the challenges they face and find solutions to these problems.
Ensure stakeholders are updated and informed about changes in our domain-specific data needs.
Build and track metrics for the performance of our Engineering & Science team, and feed back to Product and Business Teams.
Deal with ambiguity and propose innovative solutions without getting blocked.
What Are We Looking For
You have 10+ years of experience in software development or machine learning, with 4+ years of product management experience and at least 2 years as a Product Owner embedding data initiatives into products, especially Data Science and Machine Learning.
You can prioritise Data Science and Machine Learning product roadmaps for the respective businesses based on OKRs and priorities.
You have a deep understanding of managing technical products, with a background in data.
You have a high-level understanding of big-data technologies such as Spark, SparkML and Hadoop, and strong knowledge of cloud platforms (AWS or other).
You’ve delivered at a fast-growing, product-focused company before as a Data Manager, Data Lead or Data Program Manager (products where the customer is retail or small business, as opposed to internal-facing tools).
You’re organised, pragmatic and capable of engaging, guiding and leading cross-functional teams or managing large-scale enterprise products.
You have technical knowledge and experience, and strong empathy for a developer audience.
You’re a self-starter who can work comfortably in a fast-moving company where priorities can change and processes may need to be created from scratch with minimal guidance.
You have significant experience working with varied stakeholders.
You have good technical knowledge of SQL and are strong in Python programming.
You have a good understanding of how performance optimization works in the end-to-end data pipeline, including ML/DS inferencing.
You have excellent leadership skills: you have managed a team of data scientists before and coached them to become better versions of themselves.
OUR TECH STACK (you don’t have to excel in all of these, but should be willing to learn them):
Databricks on AWS
Python
Snowflake
Tecton - feature store
Fiddler - model observability platform
What You Will Get In Return
Make work, work for you! We are embracing new ways of working and support flexible working arrangements. With our Working Out of Office (WOO) policy our colleagues can work remotely from home or anywhere in their assigned Indian state. Additionally, you can work from a different country or Indian state for 90 days of the year. Plus, you’ll get:
Competitive salary
Self & family health insurance
Term & life insurance
OPD benefits
Mental wellbeing through Plumm
Learning & development budget
WFH setup allowance
15 days of privilege leave
12 days of casual leave
12 days of sick leave
3 paid days off for volunteering or L&D activities
Stock options
TIDEAN WAYS OF WORKING
At Tide, we champion a flexible workplace model that supports both in-person and remote work to cater to the specific needs of our different teams. While remote work is supported, we believe in the power of face-to-face interactions to foster team spirit and collaboration. Our offices are designed as hubs for innovation and team-building, where we encourage regular in-person gatherings to foster a strong sense of community.
TIDE IS A PLACE FOR EVERYONE
At Tide, we believe that we can only succeed if we let our differences enrich our culture. Our Tideans come from a variety of backgrounds and experience levels.
We consider everyone irrespective of their ethnicity, religion, sexual orientation, gender identity, family or parental status, national origin, veteran, neurodiversity or differently-abled status. We celebrate diversity in our workforce as a cornerstone of our success. Our commitment to a broad spectrum of ideas and backgrounds is what enables us to build products that resonate with our members’ diverse needs and lives. We are One Team and foster a transparent and inclusive environment, where everyone’s voice is heard. Your personal data will be processed by Tide for recruitment purposes and in accordance with Tide's Recruitment Privacy Notice.

Posted 2 weeks ago

Apply

0 years

0 Lacs

Telangana, India

On-site

Overview
Job Summary: We are seeking a highly skilled Databricks Platform Operations Engineer to join our team, responsible for daily monitoring and resolution of data load issues, platform optimization, capacity planning, and governance management. This role is pivotal in ensuring the stability, scalability, and security of our Databricks environment while acting as a technical architect for platform best practices. The ideal candidate will bring a strong operational background, potentially with earlier experience as a Linux, Hadoop, or Spark administrator, and possess deep expertise in managing cloud-based data platforms.
Location: Hyderabad/Bangalore
Shift: 24x7
Work Mode: Work from Office
Key Responsibilities
Data Load Monitoring & Issue Resolution (primary responsibility)
Monitor data ingestion and processing dashboards daily to identify, diagnose, and resolve data load and pipeline issues promptly.
Act as the primary responder to data pipeline failures, collaborating with data engineering teams for rapid troubleshooting and remediation.
Ensure data availability, reliability, and integrity through proactive incident management and validation.
Maintain detailed logs and reports on data load performance and incident resolution.
Platform Optimization & Capacity Planning
Continuously optimize Databricks cluster configurations, job execution, and resource allocation for cost efficiency and performance.
Conduct capacity planning to anticipate future resource needs and scaling requirements based on workload trends.
Analyze platform usage patterns and recommend infrastructure enhancements to support business growth.
Databricks Governance & Security
Implement and enforce data governance policies within Databricks, including access control, data lineage, and compliance standards.
Manage user permissions and roles using Azure AD, AWS IAM, or equivalent systems to uphold security and governance best practices.
Collaborate with security and compliance teams to ensure adherence to organizational policies and regulatory requirements.
Technical Architecture & Collaboration
Serve as a Databricks platform architect, providing guidance on environment setup, best practices, and integration with other data systems.
Work closely with data engineers, data scientists, governance teams, and business stakeholders to align platform capabilities with organizational goals.
Develop and maintain comprehensive documentation covering platform architecture, operational procedures, and governance frameworks.
Operational Excellence & Automation
Troubleshoot and resolve platform and job-related issues in collaboration with internal teams and Databricks support.
Automate routine administrative and monitoring tasks using scripting languages (Python, Bash, PowerShell) and infrastructure-as-code tools (Terraform, ARM templates).
Participate in on-call rotations and incident management processes to ensure continuous platform availability.
Required Qualifications
Experience administering Databricks or comparable cloud-based big data platforms.
Experience with Jenkins pipeline scripting.
Demonstrated experience in daily monitoring and troubleshooting of data pipelines and load processes.
Strong expertise in Databricks platform optimization, capacity planning, governance, and architecture.
Background experience as a Linux, Hadoop, or Spark administrator is highly desirable.
Proficiency with cloud platforms (Azure, AWS, or GCP) and their integration with Databricks.
Experience managing user access and permissions with Azure Active Directory, AWS IAM, or similar identity management tools.
Solid understanding of data governance principles, including RBAC, data lineage, security, and compliance.
Proficient in scripting languages such as Python, Bash, or PowerShell for automation and operational tasks.
Excellent troubleshooting, problem-solving, communication, and collaboration skills.
Preferred Skills
Experience with infrastructure-as-code tools like Terraform or ARM templates.
Familiarity with data catalog and governance tools such as Azure Purview.
Working knowledge of Apache Spark and SQL to support platform administration and governance monitoring.
Experience designing and implementing data lakehouse architectures.
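The daily monitoring and remediation duties described above often come down to wrapping flaky load steps in automated retries with backoff before escalating an incident. A generic sketch of that pattern (not Databricks-specific; the step function and delays are illustrative):

```python
import time

def run_with_retries(load_step, max_attempts=3, base_delay=0.01):
    # Retry a flaky data-load step with exponential backoff, re-raising
    # the final error so the failure can be escalated to an on-call engineer.
    for attempt in range(1, max_attempts + 1):
        try:
            return load_step()
        except Exception:
            if attempt == max_attempts:
                raise
            time.sleep(base_delay * 2 ** (attempt - 1))

attempts = []
def flaky_load():
    # Hypothetical load step that fails twice, then succeeds.
    attempts.append(1)
    if len(attempts) < 3:
        raise RuntimeError("transient load failure")
    return "loaded"

print(run_with_retries(flaky_load))  # loaded
```

In practice the delays would be seconds to minutes, and the final re-raise would feed the platform's alerting rather than crash a script.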

Posted 2 weeks ago

Apply

0 years

0 Lacs

India

On-site

Sanctity AI is a Netherlands-based startup founded by an IIT alum, specializing in ethical, safe, and impactful artificial intelligence. Our agile team is deeply focused on critical areas like AI alignment, responsible LLM training, prompt orchestration, and advanced agent infrastructure. In a landscape where many talk ethics, we build and deploy solutions that genuinely embody ethical AI principles. Sanctity AI is positioned at the forefront of solving real-world alignment challenges, shaping the future of trustworthy artificial intelligence. We leverage proprietary algorithms, rigorous ethical frameworks, and cutting-edge research to deliver AI solutions with unparalleled transparency, robustness, and societal impact. Sanctity AI represents a rare opportunity in the rapidly evolving AI ecosystem, committed to sustainable innovation and genuine human-AI harmony.
The Role
As an AI/ML Intern reporting directly to the founder, you’ll go beyond just coding. You’ll own whole pipelines, from data wrangling to deploying cutting-edge ML models in production. You’ll also get hands-on experience with large language models (LLMs), prompt engineering, semantic search, and retrieval-augmented generation. Whether it’s spinning up APIs in FastAPI, containerizing solutions with Docker, or exploring vector and graph databases like Pinecone and Neo4j, you’ll be right at the heart of our AI innovation.
What You’ll Tackle
Data to Insights: Dive into heaps of raw data and turn it into actionable insights that shape real decisions.
Model Building & Deployment: Use Scikit-learn, XGBoost, LightGBM, and advanced deep learning frameworks (TensorFlow, PyTorch, Keras) to develop state-of-the-art models. Then push them to production, scaling on AWS, GCP, or other cloud platforms.
LLM & Prompt Engineering: Fine-tune and optimize large language models. Experiment with prompt strategies and incorporate RAG (Retrieval-Augmented Generation) for more insightful outputs.
Vector & Graph Databases: Implement solutions using Pinecone, Neo4j, or similar technologies for advanced search and data relationships.
Microservices & Big Data: Leverage FastAPI (or similar frameworks) to build robust APIs. If you love large-scale data processing, dabble in Apache Spark, Hadoop, or Kafka to handle the heavy lifting.
Iterative Improvement: Observe model performance, gather metrics, and keep refining until the results shine.
Who You Are
Python Pro: You write clean, efficient Python code using libraries like Pandas, NumPy, and Scikit-learn.
Passionate About AI/ML: You’ve got a solid grasp of algorithms and can’t wait to explore deep learning or advanced NLP.
LLM Enthusiast: You’re familiar with training or fine-tuning large language models and love the challenge of prompt engineering.
Cloud & Containers Savvy: You’ve at least toyed with AWS, GCP, or similar, and have some experience with Docker or other containerization tools.
Data-Driven & Detail-Oriented: You enjoy unearthing insights in noisy datasets and take pride in well-documented, maintainable code.
Curious & Ethical: You believe AI should be built responsibly and love learning about new ways to do it better.
Languages: You can fluently communicate complex technical ideas in English. Fluency in Dutch, Spanish or French is a plus.
Math Wizard: You have a strong grip on advanced mathematics and statistical modeling. This is a core requirement.
Why Join Us?
Real-World Impact: Your work will address real-world and industry challenges, problems that genuinely need AI solutions.
Mentorship & Growth: Team up daily with founders and seasoned AI pros, accelerating your learning and skill-building.
Experimentation Culture: We encourage big ideas and bold experimentation. Want to try a new approach? Do it.
Leadership Path: Show us your passion and skills, and you could move into a core founding team member role, shaping our future trajectory.
Interested?
Send over your résumé, GitHub repos, or any project links that showcase your passion and talent. We can’t wait to see how you think, build, and innovate. Let’s team up to create AI that isn’t just powerful—but also responsibly built for everyone.
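The semantic search and RAG work mentioned above rests on nearest-neighbour lookup over embedding vectors; stores like Pinecone do this at scale with approximate indexes. A toy in-memory version with made-up three-dimensional vectors shows the core idea:

```python
import math

def cosine(a, b):
    # Cosine similarity: direction of the vectors, ignoring magnitude.
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def top_match(query, index):
    # Return the document id whose embedding is closest to the query vector.
    return max(index, key=lambda doc_id: cosine(query, index[doc_id]))

# Hypothetical document embeddings; real ones come from an embedding model.
index = {"refunds": [0.9, 0.1, 0.0], "shipping": [0.1, 0.9, 0.2]}
print(top_match([0.8, 0.2, 0.1], index))  # refunds
```

In a RAG pipeline, the text of the top-matching documents would then be stuffed into the LLM prompt as context.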

Posted 2 weeks ago

Apply

10.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Company Description
Phonologies manages telephony infrastructure for contact center applications and chatbots. Our platform helps businesses answer millions of customer support queries using automated voicebots, improving customer interactions and operational efficiency. Leading pharmacy chains, Fortune 500 companies, and North America's largest carrier rely on our solutions to reduce the burden on live agents and lower costs. Phonologies is based in India and operates globally.
Role Description
This is a full-time on-site role for a Lead Data Engineer & Architect, located in Pune. The Lead Data Engineer & Architect will be responsible for designing and implementing data architecture, developing and maintaining data pipelines, conducting data analysis, and collaborating with various teams to improve data-driven decision-making. The role also includes leading data engineering projects and ensuring data quality and security.
Qualifications
10+ years in enterprise data engineering and architecture
Expert in ETL, orchestration, and streaming pipelines
Skilled in Hadoop, Spark, Azure, Kafka, Kubernetes
Built MLOps and AutoML-ready production pipelines
Delivered telecom, banking, and public sector solutions
Leads cross-functional teams with strong client focus
Certified in data platforms and AI leadership

Posted 2 weeks ago

Apply

10.0 years

0 Lacs

Greater Delhi Area

Remote

About Tide
At Tide, we are building a business management platform designed to save small businesses time and money. We provide our members with business accounts and related banking services, as well as a comprehensive set of connected administrative solutions, from invoicing to accounting. Launched in 2017, Tide is now used by over 1 million small businesses across the world and is available to UK, Indian and German SMEs. Headquartered in central London, with offices in Sofia, Hyderabad, Delhi, Berlin and Belgrade, Tide employs over 2,000 people. Tide is rapidly growing, expanding into new products and markets and always looking for passionate and driven people. Join us in our mission to empower small businesses and help them save time and money.
About The Role
As Staff Data Scientist for the business area, you will work closely with the Business Team, Product Managers, the Data Governance team, Analysts, Scientists and Data Engineers to deliver Company, Business and Product OKRs (Objectives and Key Results). You will also look into data initiatives that drive efficiency in the end-to-end (e2e) process, from data ingestion to insight generation, including data science / machine learning models for decisioning. This role is an excellent opportunity for anyone interested in helping build and embed data initiatives into several products in a rapidly scaling environment. You will be able to influence our roadmap, learn about best practices and quickly see the impact of your work.
As a Staff Data Scientist You’ll:
Develop and plan the roadmap for our domain analytics and data engineering & science team.
Run scrum ceremonies with our Product/Business team.
Triage requests, create the work breakdown structure and assign it to the respective Engineers/Scientists.
Work with Engineers, Scientists and the governance team to identify the challenges they face and find solutions to these problems.
Ensure stakeholders are updated and informed about changes in our domain-specific data needs.
Build and track metrics for the performance of our Engineering & Science team, and feed back to Product and Business Teams.
Deal with ambiguity and propose innovative solutions without getting blocked.
What Are We Looking For
You have 10+ years of experience in software development or machine learning, with 4+ years of product management experience and at least 2 years as a Product Owner embedding data initiatives into products, especially Data Science and Machine Learning.
You can prioritise Data Science and Machine Learning product roadmaps for the respective businesses based on OKRs and priorities.
You have a deep understanding of managing technical products, with a background in data.
You have a high-level understanding of big-data technologies such as Spark, SparkML and Hadoop, and strong knowledge of cloud platforms (AWS or other).
You’ve delivered at a fast-growing, product-focused company before as a Data Manager, Data Lead or Data Program Manager (products where the customer is retail or small business, as opposed to internal-facing tools).
You’re organised, pragmatic and capable of engaging, guiding and leading cross-functional teams or managing large-scale enterprise products.
You have technical knowledge and experience, and strong empathy for a developer audience.
You’re a self-starter who can work comfortably in a fast-moving company where priorities can change and processes may need to be created from scratch with minimal guidance.
You have significant experience working with varied stakeholders.
You have good technical knowledge of SQL and are strong in Python programming.
You have a good understanding of how performance optimization works in the end-to-end data pipeline, including ML/DS inferencing.
You have excellent leadership skills: you have managed a team of data scientists before and coached them to become better versions of themselves.
OUR TECH STACK (you don’t have to excel in all of these, but should be willing to learn them):
Databricks on AWS
Python
Snowflake
Tecton - feature store
Fiddler - model observability platform
What You Will Get In Return
Make work, work for you! We are embracing new ways of working and support flexible working arrangements. With our Working Out of Office (WOO) policy our colleagues can work remotely from home or anywhere in their assigned Indian state. Additionally, you can work from a different country or Indian state for 90 days of the year. Plus, you’ll get:
Competitive salary
Self & family health insurance
Term & life insurance
OPD benefits
Mental wellbeing through Plumm
Learning & development budget
WFH setup allowance
15 days of privilege leave
12 days of casual leave
12 days of sick leave
3 paid days off for volunteering or L&D activities
Stock options
TIDEAN WAYS OF WORKING
At Tide, we champion a flexible workplace model that supports both in-person and remote work to cater to the specific needs of our different teams. While remote work is supported, we believe in the power of face-to-face interactions to foster team spirit and collaboration. Our offices are designed as hubs for innovation and team-building, where we encourage regular in-person gatherings to foster a strong sense of community.
TIDE IS A PLACE FOR EVERYONE
At Tide, we believe that we can only succeed if we let our differences enrich our culture. Our Tideans come from a variety of backgrounds and experience levels.
We consider everyone irrespective of their ethnicity, religion, sexual orientation, gender identity, family or parental status, national origin, veteran, neurodiversity or differently-abled status. We celebrate diversity in our workforce as a cornerstone of our success. Our commitment to a broad spectrum of ideas and backgrounds is what enables us to build products that resonate with our members’ diverse needs and lives. We are One Team and foster a transparent and inclusive environment, where everyone’s voice is heard. Your personal data will be processed by Tide for recruitment purposes and in accordance with Tide's Recruitment Privacy Notice.

Posted 2 weeks ago

Apply

6.0 years

0 Lacs

Mumbai Metropolitan Region

Remote

Business Area: Professional Services
Seniority Level: Mid-Senior level
Job Description: At Cloudera, we empower people to transform complex data into clear and actionable insights. With as much data under management as the hyperscalers, we're the preferred data partner for the top companies in almost every industry. Powered by the relentless innovation of the open source community, Cloudera advances digital transformation for the world’s largest enterprises.
Team Description
Cloudera is seeking a Solutions Consultant to join its APAC Professional Services team. In this role you’ll have the opportunity to develop massively scalable solutions to solve complex data problems using CDP, NiFi, Spark and related Big Data technology. This is a client-facing role that combines consulting skills with deep technical design and development in the Big Data space, and it will give the successful candidate the opportunity to work across multiple industries and large customer organizations.
As the Solution Consultant you will:
- Work directly with customers to implement Big Data solutions at scale using the Cloudera Data Platform and Cloudera Dataflow
- Design and implement CDP platform architectures and configurations for customers
- Perform platform installation and upgrades for advanced secured cluster configurations
- Analyze complex distributed production deployments, and make recommendations to optimize performance
- Document and present complex architectures for the customers' technical teams
- Work closely with Cloudera teams at all levels to help ensure the success of project consulting engagements with customers
- Drive projects with customers to successful completion
- Write and produce technical documentation, blogs and knowledge-base articles
- Participate in the pre- and post-sales process, helping both the sales and product teams to interpret customers' requirements
- Keep current with the Hadoop Big Data ecosystem technologies
- Work across different time zones

We're excited about you if you have:
- 6+ years of Information Technology and System Architecture experience
- 4+ years of Professional Services (customer-facing) experience architecting large-scale storage, data center and/or globally distributed solutions
- 3+ years designing and deploying 3-tier architectures or large-scale Hadoop solutions
- Ability to understand big data use cases and recommend standard design patterns commonly used in Hadoop-based and streaming data deployments
- Knowledge of the data management ecosystem, including concepts of data warehousing, ETL, data integration, etc.
- Ability to understand and translate customer requirements into technical requirements
- Experience implementing data transformation and processing solutions
- Experience designing data queries against data in the HDFS environment using tools such as Apache Hive
- Experience setting up multi-node Hadoop clusters
- Experience configuring security (LDAP/AD, Kerberos/SPNEGO)
- Cloudera Software and/or HDP Certification (HDPCA / HDPCD) is a plus
- Strong experience implementing software and/or solutions in the enterprise Linux environment
- Strong understanding of various enterprise security solutions such as LDAP and/or Kerberos
- Strong understanding of network configuration, devices, protocols, speeds and optimizations
- Strong understanding of the Java ecosystem including debugging, logging, monitoring and profiling tools
- Excellent verbal and written communications

What you can expect from us:
- Generous PTO Policy
- Support for work-life balance with Unplugged Days
- Flexible WFH Policy
- Mental & Physical Wellness programs
- Phone and Internet Reimbursement program
- Access to Continued Career Development
- Comprehensive Benefits and Competitive Packages
- Paid Volunteer Time
- Employee Resource Groups

EEO/VEVRAA

Posted 2 weeks ago

Apply

9.0 - 12.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Job Description: About Us At Bank of America, we are guided by a common purpose to help make financial lives better through the power of every connection. Responsible Growth is how we run our company and how we deliver for our clients, teammates, communities and shareholders every day. One of the keys to driving Responsible Growth is being a great place to work for our teammates around the world. We’re devoted to being a diverse and inclusive workplace for everyone. We hire individuals with a broad range of backgrounds and experiences and invest heavily in our teammates and their families by offering competitive benefits to support their physical, emotional, and financial well-being. Bank of America believes both in the importance of working together and offering flexibility to our employees. We use a multi-faceted approach for flexibility, depending on the various roles in our organization. Working at Bank of America will give you a great career with opportunities to learn, grow and make an impact, along with the power to make a difference. Join us! Global Business Services Global Business Services delivers Technology and Operations capabilities to Lines of Business and Staff Support Functions of Bank of America through a centrally managed, globally integrated delivery model and globally resilient operations. Global Business Services is recognized for flawless execution, sound risk management, operational resiliency, operational excellence and innovation. In India, we are present in five locations and operate as BA Continuum India Private Limited (BACI), a non-banking subsidiary of Bank of America Corporation and the operating company for India operations of Global Business Services. Process Overview As a part of Global Risk Analytics, Enterprise Risk Analytics (ERA) is responsible for the development of cross-business holistic analytical models and tools.
Team responsibilities include:
- Financed Emissions: responsible for supporting the calculation of asset-level balance sheet Financed Emissions, which are integral to the Bank’s goal of achieving Net-zero greenhouse gas emissions by 2050.
- Financial Crimes Modelling & Analytics: responsible for enterprise-wide financial crimes and compliance surveillance model development and ongoing monitoring across all lines of business globally.
- Operational Risk: responsible for operational risk loss forecasting and capital model development for CCAR/stress testing and regulatory capital reporting/economic capital measurement purposes.
- Business Transformations: a central team of Project Managers and Quantitative S/W engineers partnering with coverage-area ERA teams with the end goal of onboarding ERA production processes on GCP/production platforms, as well as identifying risks/gaps in ERA processes which can be fixed with well-designed and controlled S/W solutions.
- Trade Surveillance Analytics: responsible for modelling and analytics supporting trade surveillance activities within risk.
- Advanced Analytics: responsible for driving research, development, and implementation of new enhanced risk metrics and providing quantitative support for loss forecasting and stress testing requirements, including process improvement and automation.

Job Description
The role will be responsible for independently conducting quantitative analytics and modeling projects.

Responsibilities
- Perform model development proof of concept, research model methodology, explore internal & external data sources, design model development data, and develop preliminary models
- Conduct complex data analytics on modeling data; identify, explain & address data quality issues; apply data exclusions; perform data transformation; and prepare data for model development
- Analyze portfolio definition, define model boundary, analyze model segmentation, develop Financed Emissions models for different asset classes, and analyze and benchmark model results
- Work with the Financed Emissions Data Team & Climate Risk Tech on the production process of model development & implementation data, including supporting data sourcing efforts, providing data requirements, performing data acceptance testing, etc.
- Work with the Financed Emissions Production & Reporting Team on model implementation, model production run analysis, and result analysis & visualization
- Work with the ERA Model Implementation team & GCP Tech on model implementation, including opining on implementation design, providing the implementation data model & requirements, performing model implementation result testing, etc.
- Work with Model Risk Management (MRM) on model reviews and obtain model approvals
- Work with GEG (Global Environmental Group) and FLU (Front Line Unit) on model requirements gathering & analysis, Climate Risk target setting, disclosure, analysis & reporting

Requirements
Education: B.E./B.Tech/M.E./M.Tech
Certifications (if any): NA
Experience Range: 9 - 12 years

Foundational Skills
- Advanced knowledge of SQL, SAS and Python
- Advanced Excel, VSCode, LaTeX, Tableau skills
- Experience in multiple data environments such as Oracle, Hadoop, and Teradata
- Knowledge of data architecture concepts, data models, and ETL processes
- Knowledge of climate risk, financial concepts & products
- Experience in extracting and combining data from multiple sources, and aggregating data for model development
- Experience in conducting quantitative analysis, performing model-driven analytics, and developing models
- Experience in documenting business requirements for data, models, implementation, etc.

Desired Skills
- Basics of Finance
- Basics of Climate Risk

Work Timings: 11:30 AM to 8:30 PM
Job Location: Chennai

Posted 2 weeks ago

Apply

5.0 - 10.0 years

15 - 25 Lacs

Hyderabad/Secunderabad, Bangalore/Bengaluru, Delhi / NCR

Hybrid

Genpact (NYSE: G) is a global professional services and solutions firm delivering outcomes that shape the future. Our 125,000+ people across 30+ countries are driven by our innate curiosity, entrepreneurial agility, and desire to create lasting value for clients. Powered by our purpose – the relentless pursuit of a world that works better for people – we serve and transform leading enterprises, including the Fortune Global 500, with our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI.

Inviting applications for the role of Lead Consultant – Data Engineer (AWS + Python, Spark, Kafka for ETL)!

Responsibilities
- Develop, deploy, and manage ETL pipelines using AWS services, Python, Spark, and Kafka.
- Integrate structured and unstructured data from various data sources into data lakes and data warehouses.
- Design and deploy scalable, highly available, and fault-tolerant AWS data processes using AWS data services (Glue, Lambda, Step Functions, Redshift).
- Monitor and optimize the performance of cloud resources to ensure efficient utilization and cost-effectiveness.
- Implement and maintain security measures to protect data and systems within the AWS environment, including IAM policies, security groups, and encryption mechanisms.
- Migrate application data from legacy databases to cloud-based solutions (Redshift, DynamoDB, etc.) for high availability at low cost.
- Develop application programs using Big Data technologies like Apache Hadoop and Apache Spark, with appropriate cloud-based services like Amazon AWS.
- Build data pipelines by building ETL processes (Extract-Transform-Load).
- Implement backup, disaster recovery, and business continuity strategies for cloud-based applications and data.
- Analyse business and functional requirements, which involves a review of existing system configurations and operating methodologies as well as understanding evolving business needs.
- Analyse requirements/user stories in business meetings, strategize the impact of requirements on different platforms/applications, and convert business requirements into technical requirements.
- Participate in design reviews to provide input on functional requirements, product designs, schedules, and/or potential problems.
- Understand current application infrastructure and suggest cloud-based solutions which reduce operational cost and require minimal maintenance while providing high availability with improved security.
- Perform unit testing on modified software to ensure that new functionality works as expected while existing functionality continues to work in the same way.
- Coordinate with release management and other supporting teams to deploy changes to the production environment.

Qualifications we seek in you!

Minimum Qualifications
- Experience in designing and implementing data pipelines, building data applications, and data migration on AWS
- Strong experience implementing data lakes using AWS services like Glue, Lambda, Step Functions, and Redshift
- Experience with Databricks is an added advantage
- Strong experience in Python and SQL
- Proven expertise in AWS services such as S3, Lambda, Glue, EMR, and Redshift
- Advanced programming skills in Python for data processing and automation
- Hands-on experience with Apache Spark for large-scale data processing
- Experience with Apache Kafka for real-time data streaming and event processing
- Proficiency in SQL for data querying and transformation
- Strong understanding of security principles and best practices for cloud-based environments
- Experience with monitoring tools and implementing proactive measures to ensure system availability and performance
- Excellent problem-solving skills and ability to troubleshoot complex issues in a distributed, cloud-based environment
- Strong communication and collaboration skills to work effectively with cross-functional teams

Preferred Qualifications/Skills
- Master's Degree in Computer Science, Electronics, or Electrical Engineering
- AWS Data Engineering & Cloud certifications, Databricks certifications
- Experience with multiple data integration technologies and cloud platforms
- Knowledge of Change & Incident Management processes

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values diversity and inclusion, respect and integrity, customer focus, and innovation. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook. Furthermore, please do note that Genpact does not charge fees to process job applications and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
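The Extract-Transform-Load flow this role centers on can be sketched in plain Python. This is a minimal, hypothetical illustration: the record fields ("id", "amount") and the in-memory `warehouse` list are stand-ins, and a production pipeline would read from S3/Kafka and write to Redshift via Glue or Spark.

```python
# Minimal ETL sketch: extract raw records, transform/validate them, load into a target.

def extract():
    # Stand-in for reading from S3, Kafka, or a source database.
    return [{"id": 1, "amount": "10.5"}, {"id": 2, "amount": "bad"}, {"id": 3, "amount": "7.25"}]

def transform(records):
    # Cast types and drop rows that fail validation (a simple data-quality gate).
    clean = []
    for rec in records:
        try:
            clean.append({"id": rec["id"], "amount": float(rec["amount"])})
        except ValueError:
            continue  # in practice, route bad rows to a dead-letter store
    return clean

def load(records, target):
    # Stand-in for writing to Redshift or a data lake table.
    target.extend(records)
    return len(records)

warehouse = []
loaded = load(transform(extract()), warehouse)
print(loaded)  # 2 rows survive validation
```

The same three-stage shape holds whether the stages are Python functions, Glue jobs, or Spark transformations; only the I/O endpoints change.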

Posted 2 weeks ago

Apply

5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

We're seeking a talented and experienced Big Data & AI Specialist to join our growing team. The ideal candidate will have a strong background in Python and PySpark, with a proven ability to work with large datasets and implement robust data solutions. Familiarity with Artificial Intelligence (AI) concepts and their application in real-world scenarios is also highly valued.

Responsibilities:
- Design, develop, and maintain scalable and efficient data pipelines using Python and PySpark for batch and real-time processing of large datasets.
- Implement and optimize data ingestion, transformation, and loading processes within a big data ecosystem.
- Collaborate with data scientists and other stakeholders to understand data requirements and translate them into technical solutions.
- Develop and deploy machine learning models and AI-driven solutions, leveraging your familiarity with AI concepts.
- Ensure data quality, integrity, and security across all data solutions.
- Troubleshoot and resolve performance issues and data-related problems.
- Stay up-to-date with the latest trends and technologies in big data, AI, and cloud platforms.
- Participate in code reviews and contribute to the overall technical excellence of the team.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Data Science, Engineering, or a related field.
- 5+ years of professional experience in big data technologies.
- Expert-level proficiency in Python for data manipulation, scripting, and application development.
- Strong hands-on experience with PySpark for big data processing and analysis.
- Experience with big data frameworks and tools such as Hadoop, Spark, Hive, Kafka, etc.
- Familiarity with Artificial Intelligence (AI) concepts, including machine learning algorithms, deep learning, natural language processing (NLP), or computer vision.
- Experience with cloud platforms (AWS, Azure, GCP) and their big data/AI services is a plus.
- Solid understanding of data warehousing concepts and ETL/ELT processes.
- Excellent problem-solving, analytical, and communication skills.
- Ability to work independently and as part of a collaborative team.
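The "ensure data quality and integrity" responsibility usually starts with simple post-ingestion checks. A plain-Python sketch follows; the field names and the checks chosen (null rate, duplicate keys) are illustrative assumptions, and at scale the same logic would run as PySpark aggregations over a DataFrame.

```python
# Post-ingestion data-quality report: count rows, null required fields,
# and duplicate primary keys in a batch of records.

def quality_report(records, key="id", required=("id", "value")):
    total = len(records)
    nulls = sum(1 for r in records for f in required if r.get(f) is None)
    keys = [r.get(key) for r in records]
    duplicates = len(keys) - len(set(keys))
    return {"rows": total, "null_fields": nulls, "duplicate_keys": duplicates}

batch = [
    {"id": 1, "value": 10},
    {"id": 2, "value": None},  # null required field
    {"id": 2, "value": 7},     # duplicate key
]
report = quality_report(batch)
print(report)  # {'rows': 3, 'null_fields': 1, 'duplicate_keys': 1}
```

A pipeline would typically fail or alert when these counts exceed agreed thresholds rather than silently loading the batch.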

Posted 2 weeks ago

Apply

0 years

0 Lacs

Goa

On-site

Job Description: RWaltz Software Services Group Inc. has multiple openings in Alpharetta, GA for the position of Senior Business Analyst. Job Type: Full-time. Work authorization: United States (Required). Hours per Week: 40.

Job responsibilities:
- Provide expertise to stakeholders to identify and analyze business needs to develop system requirements and translate them into technical specifications.
- Provide advanced knowledge of systems to ensure functionality meets stakeholder requirements, and serve as a resource for complex issues and solutions.
- Lead the evaluation, design, and implementation of new business systems; identify enhancements to existing systems; measure user adoption for system changes.
- Analyze, develop, and recommend business processes or plans to improve system efficiencies and business processes.
- Provide direction to implement new requirements and plans; provide input to system strategic planning.
- Work with functional leaders to help develop a business case for additional software development.
- Analyze data transfers from source to target platforms to understand the impact on subsequent support interfaces & downstream applications.
- Research and compare hardware and software needs and make recommendations for the company.
- Translate complex data into meaningful interpretations to help with effective action, both for business stakeholders and the end user.
- Perform end-to-end functional testing on the systems, provide key information to IT teams for improvements, and create a guide to system updates.

Applicants need to have the following qualifications:
- Must have a Bachelor's Degree or equivalent in Computer Science, Computer Information Systems, Management Information Sciences, Data Science, Statistics, Business, Engineering (C.S./Electrical/Electronic) or a related field, and 60 months of work experience in the same role or a related position with the same/similar job duties; OR
- Must have relevant experience in one or more of the following tools: Python, SQL, MS/Oracle SQL, Tableau, AWS, Microsoft Azure, Kafka, Hadoop, Jira, Power BI.
- Must be willing to travel to unanticipated work locations throughout the USA at company-paid expense.

To Apply: Please mail your resume to the HR Manager, RWaltz Software Services Inc., 5910 Shiloh Rd. E, Ste 123, Alpharetta, GA 30005, with the job title being applied for in the subject line. Applicants are required to be eligible to lawfully work in the U.S. immediately; the employer will not transfer or sponsor applicants for U.S. work authorization (such as an H-1B visa) for this opportunity. Direct Hires Only. No Recruiters or Solicitations.

Posted 2 weeks ago

Apply

0 years

0 Lacs

Goa

On-site

Job Description: RWaltz Software Services Group Inc. has multiple openings in Alpharetta, GA for the position of Business Analyst. Job Type: Full-time. Work authorization: United States (Required). Hours per Week: 40.

Job responsibilities:
- Work under supervision and assist in identifying and analyzing business needs to develop system requirements and translate those into technical specifications.
- Assist with the evaluation, design, and implementation of new business systems; identify enhancements to existing systems; measure user adoption for system changes.
- Help analyze and develop business processes or plans to improve system efficiencies and business processes.
- Provide input to system strategic planning.
- Analyze data transfers from source to target platforms to understand the impact on subsequent support interfaces & downstream applications.
- Research and compare hardware and software needs.
- Translate complex data into meaningful interpretations to help with effective action, both for business stakeholders and the end user.
- Perform end-to-end functional testing on the systems, provide key information to IT teams for improvements, and create a guide to system updates.

Applicants need to have the following qualifications:
- Must have a Master's Degree or equivalent in Computer Science, Computer Information Systems, Management Information Sciences, Data Science, Statistics, Business, Engineering (C.S./Electrical/Electronic) or a related field, and 12 months of work experience in the same role or a related position with the same/similar job duties.
- Must have relevant experience in one or more of the following tools: Python, SQL, MS/Oracle SQL, Tableau, AWS, Microsoft Azure, Kafka, Hadoop, Jira, Power BI.
- Must be willing to travel to unanticipated work locations throughout the USA at company-paid expense.

To Apply: Please mail your resume to the HR Manager, RWaltz Software Services Inc., 5910 Shiloh Rd. E, Ste 123, Alpharetta, GA 30005, with the job title being applied for in the subject line. Applicants are required to be eligible to lawfully work in the U.S. immediately; the employer will not transfer or sponsor applicants for U.S. work authorization (such as an H-1B visa) for this opportunity. Direct Hires Only. No Recruiters or Solicitations.

Posted 2 weeks ago

Apply

2.0 years

9 - 9 Lacs

Thiruvananthapuram

On-site

Equifax is seeking creative, high-energy and driven software engineers with hands-on development skills to work on a variety of meaningful projects. Our software engineering positions provide you the opportunity to join a team of talented engineers working with leading-edge technology. You are ideal for this position if you are a forward-thinking, committed, and enthusiastic software engineer who is passionate about technology.

What you'll do
- Perform general application development activities, including unit testing, code deployment to the development environment, and technical documentation.
- Work on one or more projects, making contributions to unfamiliar code written by team members.
- Participate in the estimation process, use case specifications, reviews of test plans and test cases, requirements, and project planning.
- Diagnose and resolve performance issues.
- Document code/processes so that any other developer is able to dive in with minimal effort.
- Develop and operate high-scale applications from the backend to the UI layer, focusing on operational excellence, security and scalability.
- Apply modern software development practices (serverless computing, microservices architecture, CI/CD, infrastructure-as-code, etc.).
- Work across teams to integrate our systems with existing internal systems, Data Fabric, and the CSA Toolset.
- Participate in technology roadmap and architecture discussions to turn business requirements and vision into reality.
- Participate in a tight-knit engineering team employing agile software development practices.
- Triage product or system issues and debug/track/resolve by analyzing the sources of issues and the impact on network or service operations and quality.
- Write, debug, and troubleshoot code in mainstream open source technologies.
- Lead efforts on Sprint deliverables, and solve problems of medium complexity

What experience you need
- Bachelor's degree or equivalent experience
- 2+ years experience working with software design and Java, Python and JavaScript programming languages and SQL
- 2+ years experience with software build management tools like Maven or Gradle
- 2+ years experience with HTML, CSS and frontend/web development
- 2+ years experience with software testing, performance, and quality engineering techniques and strategies
- 2+ years experience with cloud technology: GCP, AWS, or Azure

What could set you apart
- Knowledge or experience with Apache Beam for stream and batch data processing
- Familiarity with big data tools and technologies like Apache Kafka, Hadoop, or Spark
- Experience with containerization and orchestration tools (e.g., Docker, Kubernetes)
- Exposure to data visualization tools or platforms
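The Apache Beam item above refers to its unified model: the same transforms apply to bounded (batch) or unbounded (stream) input. The core idea can be approximated with plain-Python generators; this is only a conceptual sketch, not the Beam API, and the log lines are made up.

```python
# Generator-based sketch of a unified batch/stream transform: the same
# parse/filter steps work on a finite list (batch) or an endless source
# (stream), because records are consumed incrementally.

def parse(events):
    for line in events:
        yield line.strip().lower()

def keep_errors(events):
    for e in events:
        if "error" in e:
            yield e

batch = ["INFO boot\n", "ERROR disk full\n", "error net down\n"]
result = list(keep_errors(parse(batch)))
print(result)  # ['error disk full', 'error net down']
```

In Beam the same shape would be a `PCollection` piped through `ParDo`/`Map` transforms, with the runner deciding how to execute them over batch or streaming input.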

Posted 2 weeks ago

Apply

3.0 - 5.0 years

3 Lacs

Mananthavady

On-site

We are seeking a skilled and motivated Data Scientist with 3–5 years of hands-on experience in data analytics, machine learning, and business intelligence. The ideal candidate will be responsible for deriving actionable insights from data, building predictive models, and supporting data-driven decision-making across various business units.

Key Responsibilities:
- Analyze structured and unstructured datasets to extract insights and identify trends.
- Design and implement machine learning models for classification, regression, clustering, and recommendation.
- Collaborate with business stakeholders to understand objectives and translate them into data solutions.
- Perform data wrangling, preprocessing, feature engineering, and model validation.
- Build dashboards and reports using visualization tools like Power BI or Tableau.
- Present findings and recommendations to technical and non-technical audiences.
- Contribute to model deployment and monitoring processes.

Required Skills & Qualifications:
- Bachelor’s or Master’s degree in Data Science, Computer Science, Statistics, Mathematics, or a related field.
- 3–5 years of industry experience in a Data Scientist or similar role.
- Proficiency in programming languages such as Python or R.
- Strong experience with data manipulation tools like pandas and NumPy, and machine learning libraries like scikit-learn, XGBoost, or TensorFlow/PyTorch.
- Solid knowledge of SQL and database querying.
- Experience with data visualization tools like Power BI, Tableau, or Matplotlib/Seaborn.
- Familiarity with version control (e.g., Git) and basic software development practices.

Preferred Qualifications:
- Experience working with cloud platforms (AWS, Azure, or GCP).
- Exposure to big data tools (Spark, Hadoop) is a plus.
- Knowledge of NLP, time-series forecasting, or deep learning techniques is desirable.
- Strong problem-solving and communication skills.

What We Offer:
- A collaborative, innovative work environment.
- Opportunities to work on real-world data challenges across industries.
- Access to modern tools, cloud platforms, and machine learning infrastructure.
- Competitive salary and performance-based incentives.

Job Type: Full-time
Pay: From ₹30,000.00 per month
Benefits: Food provided
Schedule: Day shift
Work Location: In person
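The fit-predict-validate workflow a role like this involves can be sketched without any ML library, using a toy nearest-centroid classifier on made-up data. Real work would use scikit-learn, XGBoost, or similar; this only illustrates the shape of training and prediction.

```python
# Toy nearest-centroid classifier: training computes the mean feature
# vector per class; prediction assigns the class of the closest centroid.

def fit_centroids(X, y):
    sums, counts = {}, {}
    for features, label in zip(X, y):
        s = sums.setdefault(label, [0.0] * len(features))
        for i, v in enumerate(features):
            s[i] += v
        counts[label] = counts.get(label, 0) + 1
    return {lbl: [v / counts[lbl] for v in s] for lbl, s in sums.items()}

def predict(centroids, x):
    # Squared Euclidean distance is enough for ranking centroids.
    def dist(c):
        return sum((a - b) ** 2 for a, b in zip(x, c))
    return min(centroids, key=lambda lbl: dist(centroids[lbl]))

X_train = [[0.0, 0.0], [0.1, 0.2], [5.0, 5.0], [4.8, 5.2]]
y_train = ["low", "low", "high", "high"]
model = fit_centroids(X_train, y_train)
print(predict(model, [0.2, 0.1]))  # low
print(predict(model, [5.1, 4.9]))  # high
```

Validation in practice means holding out data the model never saw during `fit_centroids` and scoring predictions against the true labels.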

Posted 2 weeks ago

Apply

5.0 years

3 - 7 Lacs

Hyderābād

On-site

Job Title: Databricks Developer / Data Engineer
Duration: 12 months with possible extension
Location: Hyderabad, Telangana (Hybrid) – 1-2 days onsite at client location

Job Summary: We are seeking a highly skilled Databricks Developer / Data Engineer with 5+ years of experience in building scalable data pipelines, managing large datasets, and optimizing data workflows in cloud environments. The ideal candidate will have hands-on expertise in Azure Databricks, Azure Data Factory, and other Azure-native services, playing a key role in enabling data-driven decision-making across the organization.

Key Responsibilities:
- Design, develop, and maintain scalable ETL/ELT pipelines for data ingestion, transformation, and integration
- Work with both structured and unstructured data from a variety of internal and external sources
- Collaborate with data analysts, scientists, and engineers to ensure data quality, integrity, and availability
- Build and manage data lakes, data warehouses, and data models (Azure Databricks, Azure Data Factory, Snowflake, etc.)
- Optimize performance of large-scale batch and real-time processing systems
- Implement data governance, metadata management, and data lineage practices
- Monitor and troubleshoot pipeline issues; perform root cause analysis and proactive resolution
- Automate data validation and quality checks
- Ensure compliance with data privacy, security, and regulatory requirements
- Maintain thorough documentation of architecture, data workflows, and processes

Mandatory Qualifications:
- 5+ years of hands-on experience with: Azure Blob Storage, Azure Data Lake Storage, Azure SQL Database; Azure Logic Apps, Azure Data Factory, Azure Databricks, Azure ML; Azure DevOps Services, Azure API Management, Webhooks
- Intermediate-level proficiency in Python scripting and PySpark
- Basic understanding of Power BI and visualization functionalities

Technical Skills & Experience Required:
- Proficient in SQL and working with both relational and non-relational databases (e.g., PostgreSQL, MongoDB, Cassandra)
- Hands-on experience with Apache Spark, Hadoop, and Hive for big data processing
- Proficiency in building scalable data pipelines using Azure Data Factory and Azure Databricks
- Solid knowledge of cloud-native tools: Delta Lake, Azure ML, Azure DevOps
- Understanding of data modeling, OLAP/OLTP systems, and data warehousing best practices
- Experience with CI/CD pipelines, version control with Git, and working with Azure Repos
- Knowledge of data security, privacy policies, and compliance frameworks
- Excellent problem-solving, troubleshooting, and analytical skills
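The "monitor and troubleshoot pipeline issues" responsibility often reduces to bounded retries plus a failure log for root-cause analysis. A minimal sketch follows; the failing step is simulated, and the function names are hypothetical rather than any Databricks or Azure API.

```python
import time

# Wrap a flaky pipeline step in bounded retries, recording each failure
# so root-cause analysis has a trail to work from.

def run_with_retries(step, max_attempts=3, delay=0.0):
    failures = []
    for attempt in range(1, max_attempts + 1):
        try:
            return step(), failures
        except RuntimeError as exc:
            failures.append(f"attempt {attempt}: {exc}")
            time.sleep(delay)  # back off between attempts
    raise RuntimeError(f"step failed after {max_attempts} attempts: {failures}")

calls = {"n": 0}
def flaky_step():
    # Simulated transient failure: succeeds on the third call.
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient source timeout")
    return "load complete"

result, failures = run_with_retries(flaky_step)
print(result)         # load complete
print(len(failures))  # 2
```

Orchestrators such as Azure Data Factory provide retry policies natively; the point of the sketch is that failures should be recorded, bounded, and escalated rather than retried forever.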

Posted 2 weeks ago

Apply

0 years

0 Lacs

India

On-site

Why We Work at Dun & Bradstreet Dun & Bradstreet unlocks the power of data through analytics, creating a better tomorrow. Each day, we are finding new ways to strengthen our award-winning culture and accelerate creativity, innovation and growth. Our 6,000+ global team members are passionate about what we do. We are dedicated to helping clients turn uncertainty into confidence, risk into opportunity and potential into prosperity. Bold and diverse thinkers are always welcome. Come join us! Learn more at dnb.com/careers. D&B is looking for an experienced Senior Golang/Java Backend Developer to join our team in India and be instrumental in taking our products to the next level. In this role, you will work in close collaboration with a team of highly empowered, experienced developers who are building a high-performance, highly scaled global platform.

Responsibilities
- Conceive, build, and operate highly distributed systems deployed around the planet
- Employ cutting-edge technologies and techniques in a rapidly evolving domain
- Thrive in a progressive environment which relies on communication and initiative rather than process to deliver at high velocity
- Bring a "Product Owner" rather than a "Task Implementer" attitude
- Stay curious and keep improving your skill set

Desired Qualifications
- Experience building systems involving messaging and/or event-driven architectures
- Deep technical understanding of at least one of Core Java and Golang, and willingness to work with both
- Strong handle on concurrency challenges and design solutions
- Strong belief in Agile/Lean values
- Heavy emphasis on code testing and designing for testability
- Maturity and aptitude to operate in a high-freedom/high-responsibility environment
- Strong troubleshooting skills
- Experience with techops, supporting and troubleshooting large systems
- Exposure to devops automation such as Chef/Ansible
- Exposure to IaaS platforms such as AWS EC2, Rackspace, etc.
- Experience with Apache Cassandra, Hadoop, or other NoSQL databases
- Involvement in an open-source community

This position is internally titled Senior Software Engineer.

All Dun & Bradstreet job postings can be found at https://www.dnb.com/about-us/careers-and-people/joblistings.html and https://jobs.lever.co/dnb . Official communication from Dun & Bradstreet will come from an email address ending in @dnb.com. Notice to Applicants: Please be advised that this job posting page is hosted and powered by Lever. Your use of this page is subject to Lever's Privacy Notice and Cookie Policy, which governs the processing of visitor data on this platform.

Posted 2 weeks ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies