Jobs
Interviews

627 MapReduce Jobs - Page 15

Set up a job alert
JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

8.0 - 11.0 years

45 - 50 Lacs

Noida, Kolkata, Chennai

Work from Office

Dear Candidate, We are hiring a Julia Developer to build computational and scientific applications requiring speed and mathematical accuracy. Ideal for domains like finance, engineering, or AI research. Key Responsibilities: Develop applications and models using the Julia programming language. Optimize for performance, parallelism, and numerical accuracy. Integrate with Python or C++ libraries where needed. Collaborate with data scientists and engineers on simulations and modeling. Maintain well-documented and reusable codebases. Required Skills & Qualifications: Proficiency in Julia, with knowledge of multiple dispatch and the type system. Experience in numerical computing or scientific research. Familiarity with Plots.jl, Flux.jl, or DataFrames.jl. Understanding of Python, R, or MATLAB is a plus. Soft Skills: Strong troubleshooting and problem-solving skills. Ability to work independently and in a team. Excellent communication and documentation skills. Note: If interested, please share your updated resume and a preferred time for a discussion. If shortlisted, our HR team will contact you. Kandi Srinivasa, Delivery Manager, Integra Technologies
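
For candidates less familiar with the Python integration mentioned above, here is a minimal, illustrative sketch of calling Julia from Python. It assumes the juliacall bridge from PythonCall.jl is available (pip install juliacall plus a local Julia installation); the relative_error function is invented for illustration and is not part of the role.

```python
# Hedged sketch: calling a Julia function from Python via juliacall (PythonCall.jl).
# Assumes `pip install juliacall` and a local Julia installation.
from juliacall import Main as jl

# Define a small Julia function on the fly; multiple dispatch would let us add
# methods for other argument types later.
jl.seval("relative_error(x::Float64, y::Float64) = abs(x - y) / max(abs(x), abs(y))")

print(jl.relative_error(3.14159, 3.14))  # the Float64 result converts to a Python float
```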

Posted 1 month ago

Apply

6.0 years

0 Lacs

Hyderabad, Telangana, India

Remote

LivePerson (NASDAQ: LPSN) is the global leader in enterprise conversations. Hundreds of the world's leading brands — including HSBC, Chipotle, and Virgin Media — use our award-winning Conversational Cloud platform to connect with millions of consumers. We power nearly a billion conversational interactions every month, providing a uniquely rich data set and safety tools to unlock the power of Conversational AI for better customer experiences. At LivePerson, we foster an inclusive workplace culture that encourages meaningful connection, collaboration, and innovation. Everyone is invited to ask questions, actively seek new ways to achieve success, and reach their full potential. We are continually looking for ways to improve our products and make things better. This means spotting opportunities, solving ambiguities, and seeking effective solutions to the problems our customers care about. Overview: We are looking for an experienced Data Engineer to provide data engineering expertise and support to various analytical products of LivePerson, and assist in migrating our existing data processing ecosystem from Hadoop (Spark, MapReduce, Java, and Scala) to Databricks on GCP. The goal is to leverage Databricks' scalability, performance, and ease of use to enhance our current workflows. You will: Assessment and Planning: Review the existing Hadoop infrastructure, including Spark and MapReduce jobs. Analyze Java and Scala codebases for compatibility with Databricks. Identify dependencies, libraries, and configurations that may require modification. Propose a migration plan with clear timelines and milestones. Code Migration: Refactor Spark jobs to run efficiently on Databricks. Migrate MapReduce jobs where applicable or rewrite them using Spark DataFrame/Dataset API. Update Java and Scala code to comply with Databricks' runtime environment. Testing and Validation: Develop unit and integration tests to ensure parity between the existing and new systems. Compare performance metrics before and after migration. Implement error handling and logging consistent with best practices in Databricks. Optimization and Performance Tuning: Fine-tune Spark configurations for performance improvements on Databricks. Optimize data ingestion and transformation processes. Deployment and Documentation: Deploy migrated jobs to production in Databricks. Document changes, configurations, and processes thoroughly. Provide knowledge transfer to internal teams if required. Required skills: 6+ years of experience in Data Engineering with focus on building data pipelines, data platforms and ETL (Extract, transform, Load) processes on Hadoop and Databricks. Strong Expertise in Databricks (Spark on Databricks, Delta Lake, etc.) preferably on GCP. Strong expertise in the Hadoop ecosystem (Spark, MapReduce, HDFS) with solid foundations of Spark and its internals. Proficiency in Scala and Java. Strong SQL knowledge. Strong understanding of data engineering and optimization techniques. 
Solid understanding of data governance, data modeling, and enterprise-scale data lakehouse platforms. Experience with test frameworks like Great Expectations. Minimum Qualifications: Bachelor's degree in Computer Science or a related field. Certified Databricks Engineer - preferred. You should be an expert in: Databricks with Spark and its internals (3 years) - MUST. Data engineering in the Hadoop ecosystem (5 years) - MUST. Scala and Java (5 years) - MUST. SQL - MUST. Benefits: Health: Medical, Dental and Vision. Time away: vacation and holidays. Development: Access to internal professional development resources. Equal opportunity employer. Why you'll love working here: As leaders in enterprise customer conversations, we celebrate diversity, empowering our team to forge impactful conversations globally. LivePerson is a place where uniqueness is embraced, growth is constant, and everyone is empowered to create their own success. And, we're very proud to have earned recognition from Fast Company, Newsweek, and BuiltIn for being a top innovative, beloved, and remote-friendly workplace. Belonging at LivePerson: We are proud to be an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to age, ancestry, color, family or medical care leave, gender identity or expression, genetic information, marital status, medical condition, national origin, physical or mental disability, protected veteran status, race, religion, sex (including pregnancy), sexual orientation, or any other characteristic protected by applicable laws, regulations and ordinances. We also consider qualified applicants with criminal histories, consistent with applicable federal, state, and local law. We are committed to the accessibility needs of applicants and employees. We provide reasonable accommodations to job applicants with physical or mental disabilities. Applicants with a disability who require reasonable accommodation for any part of the application or hiring process should inform their recruiting contact upon initial connection. The talent acquisition team at LivePerson has recently been notified of a phishing scam targeting candidates applying for our open roles. Scammers have been posing as hiring managers and recruiters in an effort to access candidates' personal and financial information. This phishing scam is not isolated to LivePerson and has been documented in news articles and media outlets. Please note that any communication from our hiring teams at LivePerson regarding a job opportunity will only be made by a LivePerson employee with an @liveperson.com email address. LivePerson does not ask for personal or financial information as part of our interview process, including but not limited to your social security number, online account passwords, credit card numbers, passport information and other related banking information. If you have any questions or concerns, please feel free to contact recruiting-lp@liveperson.com
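
As a rough illustration of the MapReduce-to-Spark migration this posting describes, a classic word-count job re-expressed with the Spark DataFrame API might look like the sketch below. This is a generic example, not LivePerson code; the input and output paths and the Databricks-style dbfs:/ locations are placeholders.

```python
# Illustrative only: a word-count MapReduce job re-expressed with the Spark
# DataFrame API. Input path and output path are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("wordcount-df").getOrCreate()

lines = spark.read.text("dbfs:/data/input/")   # one row per line, column "value"
counts = (
    lines
    .select(F.explode(F.split(F.col("value"), r"\s+")).alias("word"))  # "map" step
    .where(F.col("word") != "")
    .groupBy("word")                                                   # shuffle / "reduce" step
    .count()
)
counts.write.mode("overwrite").parquet("dbfs:/data/output/word_counts")
```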

Posted 1 month ago

Apply

2.0 - 6.0 years

5 - 9 Lacs

Pune

Work from Office

Data Engineer (4 resources). Common skills: SQL, GCP BigQuery (BQ), ETL pipelines using Python/Airflow, experience with Spark/Hive/HDFS, and data modeling for data conversion. Prior experience working on an HR conversion/migration project is an additional skill needed along with the above. The Data Engineer should have HR domain knowledge; all other functional-area requirements will be provided by the customer (Uber).
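
For context on the "ETL pipelines using Python/Airflow" requirement, a minimal Airflow 2.x DAG sketch is shown below. It is generic and illustrative only; the DAG id, schedule, and the stubbed extract/transform/load callables are assumptions, not details from the posting.

```python
# Minimal, generic Airflow DAG sketch for a daily ETL step.
# Uses the `schedule` argument introduced in Airflow 2.4+.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract(**context):
    # Stub: pull source records (e.g., from the legacy HR system).
    return [{"id": 1, "amount": 10.0}]

def transform(**context):
    rows = context["ti"].xcom_pull(task_ids="extract")
    return [{**r, "amount_scaled": r["amount"] * 100} for r in rows]  # toy transformation

def load(**context):
    rows = context["ti"].xcom_pull(task_ids="transform")
    print(f"would load {len(rows)} rows into the warehouse (e.g., BigQuery)")

with DAG(
    dag_id="hr_conversion_etl",      # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    e = PythonOperator(task_id="extract", python_callable=extract)
    t = PythonOperator(task_id="transform", python_callable=transform)
    l = PythonOperator(task_id="load", python_callable=load)
    e >> t >> l
```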

Posted 1 month ago

Apply

4.0+ years

0 Lacs

Gurugram, Haryana, India

On-site

At American Express, our culture is built on a 175-year history of innovation, shared values and Leadership Behaviors, and an unwavering commitment to back our customers, communities, and colleagues. As part of Team Amex, you'll experience this powerful backing with comprehensive support for your holistic well-being and many opportunities to learn new skills, develop as a leader, and grow your career. Here, your voice and ideas matter, your work makes an impact, and together, you will help us define the future of American Express. How will you make an impact in this role? We are building an energetic, high-performance team with a nimble and creative mindset to drive our technology and products. American Express (AXP) is a powerful brand, a great place to work and has unparalleled scale. Join us for an exciting opportunity in the Marketing Data Technology (MarTech Data) team within American Express Technologies. This team specializes in creating and expanding a suite of data and insight solutions that power the customer marketing ecosystem. The team creates and manages various batch/real-time marketing data products that fuel the Customer Marketing Platforms. As part of the team, you will get numerous opportunities to use and learn big data and GCP cloud technologies. Job Responsibilities: Responsible for delivering features or software functionality independently and reliably. Develop technical design documentation. Functions as a core member of an agile team by contributing to software builds through consistent development practices with respect to tools, common components, and documentation. Performs hands-on ETL development for marketing data applications. Participate in code reviews and automated testing. Helps other junior members of the team deliver. Demonstrates analytical thinking - recommends improvements and best practices, and conducts experiments to prove or disprove them. Provides continuous support for ongoing application availability. Learns, understands, and participates fully in all team ceremonies, including work breakdown, estimation, and retrospectives. Willingness to learn new technologies and exploit them to their optimal potential, including a substantiated ability to innovate and take pride in quickly deploying working software. High energy, willingness to learn new technologies, and pride in how fast they develop working software. Minimum Qualifications: Bachelor's Degree with a minimum of 4+ years of overall software design and development experience. Expert in SQL and data warehousing concepts. Hands-on expertise with cloud platforms, ideally Google Cloud Platform (GCP). Working knowledge of data storage solutions like BigQuery or Cloud SQL and data engineering tools like Airflow or Cloud Workflows (see the short sketch after this posting). Experience with other GCP services like Cloud Storage, Pub/Sub, or Data Catalog. Familiarity with Agile or other rapid application development methods. Hands-on experience with one or more programming languages (Java, Python). Hands-on expertise with software development in Big Data (Hadoop, MapReduce, Spark, Hive). Experience with CI/CD pipelines, automated test frameworks, DevOps and source code management tools (XLR, Jenkins, Git, Sonar, Stash, Maven, Jira, Confluence, Splunk, etc.). Knowledge of various shell scripting tools and Ansible will be an added advantage. Strong communication and analytical skills, including effective presentation skills. We back you with benefits that support your holistic well-being so you can be and deliver your best.
This means caring for you and your loved ones' physical, financial, and mental health, as well as providing the flexibility you need to thrive personally and professionally: Competitive base salaries Bonus incentives Support for financial-well-being and retirement Comprehensive medical, dental, vision, life insurance, and disability benefits (depending on location) Flexible working model with hybrid, onsite or virtual arrangements depending on role and business need Generous paid parental leave policies (depending on your location) Free access to global on-site wellness centers staffed with nurses and doctors (depending on location) Free and confidential counseling support through our Healthy Minds program Career development and training opportunities American Express is an equal opportunity employer and makes employment decisions without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran status, disability status, age, or any other status protected by law. Offer of employment with American Express is conditioned upon the successful completion of a background verification check, subject to applicable laws and regulations.
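
The short sketch below illustrates the kind of BigQuery access referred to in the qualifications above, using the google-cloud-bigquery client. It is generic and not American Express code; the project, dataset, table, and parameter names are placeholders.

```python
# Hedged sketch: querying a day of marketing events from BigQuery.
# Project, dataset, and table names are placeholders.
import datetime
from google.cloud import bigquery

client = bigquery.Client(project="my-gcp-project")

query = """
    SELECT customer_id, COUNT(*) AS events
    FROM `my-gcp-project.marketing.events`
    WHERE event_date = @run_date
    GROUP BY customer_id
"""
job = client.query(
    query,
    job_config=bigquery.QueryJobConfig(
        query_parameters=[
            bigquery.ScalarQueryParameter("run_date", "DATE", datetime.date(2024, 1, 1))
        ]
    ),
)
for row in job.result():
    print(row.customer_id, row.events)
```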

Posted 1 month ago

Apply

15.0 - 20.0 years

10 - 14 Lacs

Hyderabad

Work from Office

Project Role: Application Lead. Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact. Must-have skills: Capital Markets Regulatory Compliance. Good-to-have skills: MicroStrategy Business Intelligence, Microsoft Power Business Intelligence (BI), Capital Markets Audit. Minimum 5 year(s) of experience is required. Educational Qualification: 15 years full time education. Summary: As a member of the Business Acceptance Unit, you will participate in Business Acceptance Testing for ECAG Regulatory Reporting and Analytics applications, and you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure that application requirements are met, overseeing the development process, and providing guidance to team members. You will also engage in problem-solving activities, ensuring that the applications align with regulatory compliance standards while fostering a productive and inclusive work environment. Roles & Responsibilities: - Expected to be an SME. - Collaborate with and manage the team to perform. - Responsible for team decisions. - Engage with multiple teams and contribute to key decisions. - Provide solutions to problems for their immediate team and across multiple teams. - Facilitate knowledge-sharing sessions to enhance team capabilities. - Monitor project progress and ensure timely delivery of application features. - Create and maintain functional specifications for the data warehouse applications. - Participate in discussions related to project planning, functionality, and review of functional specifications. - Create test plans and test narratives, and define the test scope based on functional specifications and user stories. - Develop manual test cases to test software changes in data warehouse applications. - Create feature files in Gherkin format for test automation (see the sketch following this posting). - Write Python scripts for test automation. - Create test data scenarios and execute test cases. - Create and maintain the regression test suite. - Periodically report the test results and create test statistics. - Follow up on identified bugs and retest the software. Professional & Technical Skills: - Must-have: Proficiency in Capital Markets Regulatory Compliance. - Good-to-have: Experience with MicroStrategy Business Intelligence, Capital Markets Audit, Microsoft Power Business Intelligence (BI). - Strong understanding of regulatory frameworks and compliance requirements in capital markets. - Experience in application design and development methodologies. - Ability to analyze complex regulatory requirements and translate them into actionable application features. - Sound understanding of test methodology and agile software development methodology. - Functional knowledge of derivatives and OTC clearing. - Experience with collaboration tools such as Jira and GitHub. - Experience in testing data warehouse / reporting applications. - Experience using statistical computing languages (R, Python, SQL, etc.) to manipulate data and draw insights from large data sets. - Experience with distributed data/computing tools (Map/Reduce, Hadoop, Hive, Spark, etc.) would be an advantage. - Experience in visualizing/presenting data for stakeholders using Zeppelin, Power BI, or MicroStrategy will be an advantage.
Additional Information: - The candidate should have a minimum of 5 years of experience in Capital Markets Regulatory Compliance. - This position is based at our Hyderabad office. - 15 years of full-time education is required. Qualification: 15 years full time education
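
The sketch below illustrates the Gherkin feature files and Python test automation mentioned in the posting above, using the behave library. The scenario and step logic are invented placeholders, not ECAG specifics.

```python
# features/steps/report_steps.py -- behave step definitions for a feature file like:
#
#   Feature: Regulatory report completeness
#     Scenario: Daily trade report contains all booked trades
#       Given 120 trades are booked for the reporting date
#       When the daily regulatory report is generated
#       Then the report contains 120 trade records
#
from behave import given, when, then

@given("{count:d} trades are booked for the reporting date")
def step_book_trades(context, count):
    context.booked = count  # in a real test this would stage data in the warehouse

@when("the daily regulatory report is generated")
def step_generate_report(context):
    context.report_rows = context.booked  # stub: would trigger and read the real report

@then("the report contains {count:d} trade records")
def step_check_report(context, count):
    assert context.report_rows == count
```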

Posted 1 month ago

Apply

3.0 - 8.0 years

13 - 18 Lacs

Hyderabad

Work from Office

Project Role: Quality Engineering Lead (Test Lead). Project Role Description: Leads a team of quality engineers through multi-disciplinary team planning and ecosystem integration to accelerate delivery and drive quality across the application lifecycle. Applies business and functional knowledge to develop end-to-end testing strategies through the use of quality processes and methodologies. Applies testing methodologies, principles and processes to define and implement key metrics to manage and assess the testing process, including test execution and defect resolution. Must-have skills: Capital Markets Regulatory Compliance. Good-to-have skills: MicroStrategy Business Intelligence, Microsoft Power Business Intelligence (BI), Capital Markets Audit. Minimum 7.5 year(s) of experience is required. Educational Qualification: 15 years full time education. Summary: As an Application Lead in the Business Acceptance Unit, you will participate in Business Acceptance Testing for the ECAG Regulatory Reporting and Analytics application, and you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various stakeholders to gather requirements, overseeing the development process, and ensuring that the applications meet the necessary compliance standards. You will also engage in problem-solving discussions with your team, providing guidance and support to ensure successful project outcomes. Your role will require you to stay updated on industry trends and best practices to enhance application performance and compliance. Roles & Responsibilities: - Expected to perform independently and become an SME. - Active participation/contribution in team discussions is required. - Contribute to providing solutions to work-related problems. - Facilitate communication between technical teams and stakeholders to ensure alignment on project goals. - Mentor junior team members, providing them with the necessary support and guidance to enhance their skills. - Create and maintain functional specifications for the data warehouse applications. - Participate in discussions related to project planning, functionality, and review of functional specifications. - Create test plans and test narratives, and define the test scope based on functional specifications and user stories. - Develop manual test cases to test software changes in data warehouse applications (see the parity-test sketch following this posting). - Create feature files in Gherkin format for test automation. - Write Python scripts for test automation. - Create test data scenarios and execute test cases. - Create and maintain the regression test suite. - Periodically report the test results and create test statistics. - Follow up on identified bugs and retest the software. Professional & Technical Skills: - Must-have: Proficiency in Capital Markets Regulatory Compliance. - Good-to-have: Experience with MicroStrategy Business Intelligence, Capital Markets Audit, Microsoft Power Business Intelligence (BI). - Strong understanding of regulatory frameworks and compliance requirements in capital markets. - Experience in application design and development processes. - Ability to analyze complex problems and develop effective solutions. - Functional knowledge of derivatives and OTC clearing. - Experience with collaboration tools such as Jira and GitHub. - Experience in testing data warehouse / reporting applications. - Experience using statistical computing languages (R, Python, SQL, etc.) to manipulate data and draw insights from large data sets.
Experience with distributed data/computing tools (Map/Reduce, Hadoop, Hive, Spark, etc.) would be an advantage. Experience in visualizing/presenting data for stakeholders using Zeppelin, Power BI, or MicroStrategy will be an advantage. Additional Information: - The candidate should have a minimum of 3 years of experience in Capital Markets Regulatory Compliance. - This position is based at our Hyderabad office. - 15 years of full-time education is required. Qualification: 15 years full time education
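
Data warehouse test automation of the kind described above often reduces to parity checks between source and target tables. The generic pytest sketch below uses an in-memory SQLite database as a stand-in for the real systems; table names and checks are illustrative assumptions.

```python
# Illustrative pytest parity check: row counts and sums must match between a
# source table and the loaded warehouse table. SQLite stands in for the real DBs.
import sqlite3
import pytest

@pytest.fixture
def conn():
    c = sqlite3.connect(":memory:")
    c.executescript("""
        CREATE TABLE src_trades (id INTEGER, notional REAL);
        CREATE TABLE dwh_trades (id INTEGER, notional REAL);
        INSERT INTO src_trades VALUES (1, 100.0), (2, 250.5);
        INSERT INTO dwh_trades VALUES (1, 100.0), (2, 250.5);
    """)
    return c

def test_row_count_parity(conn):
    src = conn.execute("SELECT COUNT(*) FROM src_trades").fetchone()[0]
    tgt = conn.execute("SELECT COUNT(*) FROM dwh_trades").fetchone()[0]
    assert src == tgt

def test_notional_sum_parity(conn):
    src = conn.execute("SELECT SUM(notional) FROM src_trades").fetchone()[0]
    tgt = conn.execute("SELECT SUM(notional) FROM dwh_trades").fetchone()[0]
    assert src == pytest.approx(tgt)
```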

Posted 1 month ago

Apply

3.0 - 8.0 years

10 - 14 Lacs

Hyderabad

Work from Office

Project Role: Application Lead. Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact. Must-have skills: Capital Markets Regulatory Compliance. Good-to-have skills: MicroStrategy Business Intelligence, Microsoft Power Business Intelligence (BI), Capital Markets Audit. Minimum 3 year(s) of experience is required. Educational Qualification: 15 years full time education. Summary: As an Application Lead in the Business Acceptance Unit, you will participate in Business Acceptance Testing for the ECAG Regulatory Reporting and Analytics application, and you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various stakeholders to gather requirements, overseeing the development process, and ensuring that the applications meet the necessary compliance standards. You will also engage in problem-solving discussions with your team, providing guidance and support to ensure successful project outcomes. Your role will require you to stay updated on industry trends and best practices to enhance application performance and compliance. Roles & Responsibilities: - Expected to perform independently and become an SME. - Active participation/contribution in team discussions is required. - Contribute to providing solutions to work-related problems. - Facilitate communication between technical teams and stakeholders to ensure alignment on project goals. - Mentor junior team members, providing them with the necessary support and guidance to enhance their skills. - Create and maintain functional specifications for the data warehouse applications. - Participate in discussions related to project planning, functionality, and review of functional specifications. - Create test plans and test narratives, and define the test scope based on functional specifications and user stories. - Develop manual test cases to test software changes in data warehouse applications. - Create feature files in Gherkin format for test automation. - Write Python scripts for test automation. - Create test data scenarios and execute test cases. - Create and maintain the regression test suite. - Periodically report the test results and create test statistics. - Follow up on identified bugs and retest the software. Professional & Technical Skills: - Must-have: Proficiency in Capital Markets Regulatory Compliance. - Good-to-have: Experience with MicroStrategy Business Intelligence, Capital Markets Audit, Microsoft Power Business Intelligence (BI). - Strong understanding of regulatory frameworks and compliance requirements in capital markets. - Experience in application design and development processes. - Ability to analyze complex problems and develop effective solutions. - Functional knowledge of derivatives and OTC clearing. - Experience with collaboration tools such as Jira and GitHub. - Experience in testing data warehouse / reporting applications. - Experience using statistical computing languages (R, Python, SQL, etc.) to manipulate data and draw insights from large data sets. - Experience with distributed data/computing tools (Map/Reduce, Hadoop, Hive, Spark, etc.) would be an advantage. - Experience in visualizing/presenting data for stakeholders using Zeppelin, Power BI, or MicroStrategy will be an advantage. Additional Information: - The candidate should have a minimum of 3 years of experience in Capital Markets Regulatory Compliance. - This position is based at our Hyderabad office. - 15 years of full-time education is required.
Qualification: 15 years full time education

Posted 1 month ago

Apply

7.0 - 12.0 years

10 - 14 Lacs

Bengaluru

Work from Office

Project Role: Application Lead. Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact. Must-have skills: PySpark. Good-to-have skills: NA. Minimum 7.5 year(s) of experience is required. Educational Qualification: 15 years full time education. Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your day will involve overseeing the application development process, collaborating with team members, and making key decisions to ensure project success. Roles & Responsibilities: - Expected to be an SME. - Collaborate with and manage the team to perform. - Responsible for team decisions. - Engage with multiple teams and contribute to key decisions. - Provide solutions to problems for their immediate team and across multiple teams. - Lead the application development process effectively. - Ensure timely delivery of projects. - Provide guidance and mentorship to team members. Professional & Technical Skills: - Must-have: Proficiency in PySpark. - Strong understanding of big data processing. - Experience with data manipulation and transformation. - Hands-on experience in building scalable applications. - Knowledge of cloud platforms and services. Additional Information: - The candidate should have a minimum of 7.5 years of experience in PySpark. - This position is based at our Bengaluru office. - A 15 years full-time education is required. Qualification: 15 years full time education
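
For a flavour of the PySpark data transformation work this role calls for, here is a small generic sketch (not project code) that keeps only the latest record per key using a window function; the column names are invented.

```python
# Generic PySpark sketch: deduplicate to the latest record per account using a window.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.window import Window

spark = SparkSession.builder.appName("latest-per-key").getOrCreate()

df = spark.createDataFrame(
    [("A1", "2024-01-01", 100), ("A1", "2024-02-01", 120), ("B7", "2024-01-15", 80)],
    ["account_id", "as_of_date", "balance"],
)

w = Window.partitionBy("account_id").orderBy(F.col("as_of_date").desc())
latest = (
    df.withColumn("rn", F.row_number().over(w))
      .where(F.col("rn") == 1)
      .drop("rn")
)
latest.show()
```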

Posted 1 month ago

Apply

4.0+ years

7 - 9 Lacs

Gurgaon

On-site

At American Express, our culture is built on a 175-year history of innovation, shared values and Leadership Behaviors, and an unwavering commitment to back our customers, communities, and colleagues. As part of Team Amex, you'll experience this powerful backing with comprehensive support for your holistic well-being and many opportunities to learn new skills, develop as a leader, and grow your career. Here, your voice and ideas matter, your work makes an impact, and together, you will help us define the future of American Express. How will you make an impact in this role? We are building an energetic, high-performance team with a nimble and creative mindset to drive our technology and products. American Express (AXP) is a powerful brand, a great place to work and has unparalleled scale. Join us for an exciting opportunity in the Marketing Data Technology (MarTech Data) team within American Express Technologies. This team specializes in creating and expanding a suite of data and insight solutions that power the customer marketing ecosystem. The team creates and manages various batch/real-time marketing data products that fuel the Customer Marketing Platforms. As part of the team, you will get numerous opportunities to use and learn big data and GCP cloud technologies. Job Responsibilities: Responsible for delivering features or software functionality independently and reliably. Develop technical design documentation. Functions as a core member of an agile team by contributing to software builds through consistent development practices with respect to tools, common components, and documentation. Performs hands-on ETL development for marketing data applications. Participate in code reviews and automated testing. Helps other junior members of the team deliver. Demonstrates analytical thinking - recommends improvements and best practices, and conducts experiments to prove or disprove them. Provides continuous support for ongoing application availability. Learns, understands, and participates fully in all team ceremonies, including work breakdown, estimation, and retrospectives. Willingness to learn new technologies and exploit them to their optimal potential, including a substantiated ability to innovate and take pride in quickly deploying working software. High energy, willingness to learn new technologies, and pride in how fast they develop working software. Minimum Qualifications: Bachelor's Degree with a minimum of 4+ years of overall software design and development experience. Expert in SQL and data warehousing concepts. Hands-on expertise with cloud platforms, ideally Google Cloud Platform (GCP). Working knowledge of data storage solutions like BigQuery or Cloud SQL and data engineering tools like Airflow or Cloud Workflows. Experience with other GCP services like Cloud Storage, Pub/Sub, or Data Catalog. Familiarity with Agile or other rapid application development methods. Hands-on experience with one or more programming languages (Java, Python). Hands-on expertise with software development in Big Data (Hadoop, MapReduce, Spark, Hive). Experience with CI/CD pipelines, automated test frameworks, DevOps and source code management tools (XLR, Jenkins, Git, Sonar, Stash, Maven, Jira, Confluence, Splunk, etc.). Knowledge of various shell scripting tools and Ansible will be an added advantage. Strong communication and analytical skills, including effective presentation skills. We back you with benefits that support your holistic well-being so you can be and deliver your best.
This means caring for you and your loved ones' physical, financial, and mental health, as well as providing the flexibility you need to thrive personally and professionally: Competitive base salaries Bonus incentives Support for financial-well-being and retirement Comprehensive medical, dental, vision, life insurance, and disability benefits (depending on location) Flexible working model with hybrid, onsite or virtual arrangements depending on role and business need Generous paid parental leave policies (depending on your location) Free access to global on-site wellness centers staffed with nurses and doctors (depending on location) Free and confidential counseling support through our Healthy Minds program Career development and training opportunities American Express is an equal opportunity employer and makes employment decisions without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran status, disability status, age, or any other status protected by law. Offer of employment with American Express is conditioned upon the successful completion of a background verification check, subject to applicable laws and regulations.

Posted 1 month ago

Apply

1.0 - 5.0 years

0 Lacs

Bengaluru, Karnataka

On-site

Category: Engineering. Experience: Sr. Associate. Primary Address: Bangalore, Karnataka. Overview: Voyager (94001), India, Bangalore, Karnataka - Senior Associate, Data Engineer. Do you love building and pioneering in the technology space? Do you enjoy solving complex business problems in a fast-paced, collaborative, inclusive, and iterative delivery environment? At Capital One India, you'll be part of a big group of makers, breakers, doers and disruptors, who solve real problems and meet real customer needs. We are seeking Data Engineers who are passionate about marrying data with emerging technologies. As a Capital One Data Engineer, you'll have the opportunity to be on the forefront of driving a major transformation within Capital One. What You'll Do: Collaborate with and across Agile teams to design, develop, test, implement, and support technical solutions in full-stack development tools and technologies. Work with a team of developers with deep experience in machine learning, distributed microservices, and full stack systems. Utilize programming languages like Java, Scala, Python and open-source RDBMS and NoSQL databases and cloud-based data warehousing services such as Redshift and Snowflake. Share your passion for staying on top of tech trends, experimenting with and learning new technologies, participating in internal & external technology communities, and mentoring other members of the engineering community. Collaborate with digital product managers, and deliver robust cloud-based solutions that drive powerful experiences to help millions of Americans achieve financial empowerment. Perform unit tests and conduct reviews with other team members to make sure your code is rigorously designed, elegantly coded, and effectively tuned for performance. Basic Qualifications: Bachelor's Degree. At least 1.5 years of experience in application development (internship experience does not apply). At least 1 year of experience in big data technologies. Preferred Qualifications: 3+ years of experience in application development including Python, SQL, Scala, or Java. 1+ years of experience with a public cloud (AWS, Microsoft Azure, Google Cloud). 2+ years of experience with distributed data/computing tools (MapReduce, Hadoop, Hive, EMR, Kafka, Spark, Gurobi, or MySQL). 1+ years of experience working on real-time data and streaming applications (see the streaming sketch following this posting). 1+ years of experience with NoSQL implementations (Mongo, Cassandra). 1+ years of data warehousing experience (Redshift or Snowflake). 2+ years of experience with UNIX/Linux including basic commands and shell scripting. 1+ years of experience with Agile engineering practices. ***At this time, Capital One will not sponsor a new applicant for employment authorization for this position. No agencies please. Capital One is an equal opportunity employer (EOE, including disability/vet) committed to non-discrimination in compliance with applicable federal, state, and local laws. Capital One promotes a drug-free workplace. Capital One will consider for employment qualified applicants with a criminal history in a manner consistent with the requirements of applicable laws regarding criminal background inquiries, including, to the extent applicable, Article 23-A of the New York Correction Law; San Francisco, California Police Code Article 49, Sections 4901-4920; New York City's Fair Chance Act; Philadelphia's Fair Criminal Records Screening Act; and other applicable federal, state, and local laws and regulations regarding criminal background inquiries.
If you have visited our website in search of information on employment opportunities or to apply for a position, and you require an accommodation, please contact Capital One Recruiting at 1-800-304-9102 or via email at RecruitingAccommodation@capitalone.com. All information you provide will be kept confidential and will be used only to the extent required to provide needed reasonable accommodations. For technical support or questions about Capital One's recruiting process, please send an email to Careers@capitalone.com. Capital One does not provide, endorse nor guarantee and is not liable for third-party products, services, educational tools or other information available through this site. Capital One Financial is made up of several different entities. Please note that any position posted in Canada is for Capital One Canada, any position posted in the United Kingdom is for Capital One Europe and any position posted in the Philippines is for Capital One Philippines Service Corp. (COPSSC).
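
For the real-time and streaming qualifications listed in the posting above, a generic Spark Structured Streaming sketch that consumes a Kafka topic is shown below. It is illustrative only; the broker address, topic, and output paths are placeholders, and it assumes the spark-sql-kafka connector package is on the classpath.

```python
# Generic sketch: consume a Kafka topic with Spark Structured Streaming and land it as Parquet.
# Requires the spark-sql-kafka connector to be available to the Spark runtime.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("kafka-stream-demo").getOrCreate()

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")   # placeholder broker
    .option("subscribe", "transactions")                # placeholder topic
    .load()
    .select(F.col("value").cast("string").alias("payload"),
            F.col("timestamp"))
)

query = (
    events.writeStream
    .format("parquet")
    .option("path", "/tmp/stream_out")                  # placeholder sink
    .option("checkpointLocation", "/tmp/stream_chk")
    .trigger(processingTime="1 minute")
    .start()
)
query.awaitTermination()
```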

Posted 1 month ago

Apply

1.0 - 5.0 years

0 Lacs

Bengaluru, Karnataka

On-site

Voyager (94001), India, Bangalore, Karnataka Senior Associate- Data Engineer Do you love building and pioneering in the technology space? Do you enjoy solving complex business problems in a fast-paced, collaborative, inclusive, and iterative delivery environment? At Capital One India, you'll be part of a big group of makers, breakers, doers and disruptors, who solve real problems and meet real customer needs. We are seeking Data Engineers who are passionate about marrying data with emerging technologies. As a Capital One Data Engineer, you’ll have the opportunity to be on the forefront of driving a major transformation within Capital One. What You’ll Do: Collaborate with and across Agile teams to design, develop, test, implement, and support technical solutions in full-stack development tools and technologies Work with a team of developers with deep experience in machine learning, distributed microservices, and full stack systems Utilize programming languages like Java, Scala, Python and Open Source RDBMS and NoSQL databases and Cloud based data warehousing services such as Redshift and Snowflake Share your passion for staying on top of tech trends, experimenting with and learning new technologies, participating in internal & external technology communities, and mentoring other members of the engineering community Collaborate with digital product managers, and deliver robust cloud-based solutions that drive powerful experiences to help millions of Americans achieve financial empowerment Perform unit tests and conduct reviews with other team members to make sure your code is rigorously designed, elegantly coded, and effectively tuned for performance Basic Qualifications: Bachelor’s Degree At least 1.5 years of experience in application development (Internship experience does not apply) At least 1 year of experience in big data technologies Preferred Qualifications: 3+ years of experience in application development including Python, SQL, Scala, or Java 1+ years of experience with a public cloud (AWS, Microsoft Azure, Google Cloud) 2+ years experience with Distributed data/computing tools (MapReduce, Hadoop, Hive, EMR, Kafka, Spark, Gurobi, or MySQL) 1+ years experience working on real-time data and streaming applications 1+ years of experience with NoSQL implementation (Mongo, Cassandra) 1+ years of data warehousing experience (Redshift or Snowflake) 2+ years of experience with UNIX/Linux including basic commands and shell scripting 1+ years of experience with Agile engineering practices ***At this time, Capital One will not sponsor a new applicant for employment authorization for this position. No agencies please. Capital One is an equal opportunity employer (EOE, including disability/vet) committed to non-discrimination in compliance with applicable federal, state, and local laws. Capital One promotes a drug-free workplace. Capital One will consider for employment qualified applicants with a criminal history in a manner consistent with the requirements of applicable laws regarding criminal background inquiries, including, to the extent applicable, Article 23-A of the New York Correction Law; San Francisco, California Police Code Article 49, Sections 4901-4920; New York City’s Fair Chance Act; Philadelphia’s Fair Criminal Records Screening Act; and other applicable federal, state, and local laws and regulations regarding criminal background inquiries. 
If you have visited our website in search of information on employment opportunities or to apply for a position, and you require an accommodation, please contact Capital One Recruiting at 1-800-304-9102 or via email at RecruitingAccommodation@capitalone.com. All information you provide will be kept confidential and will be used only to the extent required to provide needed reasonable accommodations. For technical support or questions about Capital One's recruiting process, please send an email to Careers@capitalone.com Capital One does not provide, endorse nor guarantee and is not liable for third-party products, services, educational tools or other information available through this site. Capital One Financial is made up of several different entities. Please note that any position posted in Canada is for Capital One Canada, any position posted in the United Kingdom is for Capital One Europe and any position posted in the Philippines is for Capital One Philippines Service Corp. (COPSSC).

Posted 1 month ago

Apply

0 years

0 Lacs

Indore, Madhya Pradesh, India

On-site

Attention, all data folks! We are hiring a Sr/Lead Data Engineer in Indore, MP (hybrid model). Below is the JD for reference. Sr/Lead Data Engineer - Indore, MP (Hybrid) - Full Time. [Key Responsibilities]: Gather and assemble large, complex sets of data that meet non-functional and functional business requirements. Skills: SQL, Python, R, Data Modeling, Data Warehousing, AWS (S3, Athena). Create new data pipelines or enhance existing pipelines to accommodate non-standard data formats from customers. Skills: ETL Tools (e.g., Apache NiFi, Talend), Python (Pandas, PySpark), AWS Glue, JSON, XML, YAML. Identify, design, and implement internal process improvements, including re-designing infrastructure for greater scalability, optimizing data delivery, and automating manual processes. Skills: Apache Airflow, Terraform, Kubernetes, AWS Lambda, CI/CD pipelines, Docker. Build and maintain the infrastructure required for optimal extraction, transformation, and loading (ETL) of data from various data sources using AWS and SQL technologies. Skills: SQL, AWS Redshift, AWS RDS, EMR (Elastic MapReduce), Snowflake. Use existing methods or develop new tools/methods to analyze the data and perform the required data sanity validations to ensure completeness and accuracy as per technical and functional requirements (see the validation sketch following this posting). Skills: Python (NumPy, Pandas), Data Validation Tools, Tableau, Power BI. Work with stakeholders including Customer Onboarding, Delivery, Product, and other functional teams, assisting them with any data-related technical or infrastructure-related issues. Skills: Stakeholder Communication, JIRA, Agile Methodologies. Provide actionable insights into key data metrics (volumes, trends, outliers, etc.), highlight any challenges/improvements, and provide recommendations and solutions to relevant stakeholders. Skills: Data Analysis, Data Visualization Tools (Tableau, Looker), Advanced Excel. Coordinate with the Technical Program Manager (TPM) to prioritize discovered issues in the Data Sanity Report and own utility communications. Skills: Project Management Tools, Reporting Tools, Clear Documentation Practices. [About Ccube] Ccube: Pioneering Data-Driven Solutions in the Cloud. Ccube is a specialized firm that delivers measurable results across a wide range of industries by focusing exclusively on Data and Artificial Intelligence within Cloud environments. We leverage cutting-edge technologies and innovative strategies to help our clients harness the power of data and achieve their business objectives. Core Competencies: Strategic Planning and Design of Data Systems: We collaborate with our clients to develop comprehensive data strategies and design robust data systems that align with their business goals. Our team of experts provides guidance on data architecture, data governance, and data management best practices. Development and Unification of Data Frameworks: We build and integrate data frameworks that enable seamless data flow and analysis. Our solutions facilitate data ingestion, data transformation, and data storage, ensuring data is readily available for business intelligence and decision-making. Advanced Data Analysis and Artificial Intelligence Applications: We employ sophisticated data analysis techniques and artificial intelligence algorithms to extract valuable insights from data. Our solutions include predictive modeling, machine learning, and natural language processing, enabling our clients to make data-driven decisions and optimize their operations.
Cloud Computing, Data Operations, and Machine Learning Operations: We leverage the scalability and flexibility of cloud computing to deliver efficient and cost-effective data solutions. Our team of experts manages data operations and machine learning operations, ensuring seamless integration and optimal performance. Organizational Principles at Ccube: At Ccube, we are guided by a set of core principles that shape our culture and drive our success: Efficiency: We strive to maximize efficiency by optimizing resource utilization and streamlining processes. Client Satisfaction: We are committed to providing exceptional service and exceeding our clients' expectations. Innovation: We embrace innovation and continuously explore new technologies and approaches to deliver cutting-edge solutions. Humility: We maintain professional modesty and recognize that there is always room for improvement. Employee Advantages: Ccube offers a stimulating and rewarding work environment with numerous benefits for our employees: Dynamic Startup Environment: We provide a fast-paced and entrepreneurial environment where employees can learn, grow, and make a significant impact. Career Growth Opportunities: We offer ample opportunities for career advancement and professional development. Performance-Based Incentives: We reward high-performing employees with competitive compensation and performance-based bonuses. Equity Participation: We offer equity participation options to eligible employees, providing them with ownership and a stake in the company's success. Professional Development Reimbursement: We encourage continuous learning and reimburse employees for eligible professional development expenses. Join Ccube and be part of a team that is shaping the future of data and AI in the cloud. Powered by JazzHR oE7x9xRa5q
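
The data sanity validations described in the posting above typically reduce to checks like the following minimal pandas sketch. The columns and rules are made-up examples, not Ccube's actual utilities.

```python
# Minimal data-sanity sketch: nulls, duplicates, and range checks on a pandas frame.
import pandas as pd

df = pd.DataFrame({
    "employee_id": [101, 102, 102, 104],
    "hire_date": pd.to_datetime(["2021-01-04", "2022-06-01", "2022-06-01", None]),
    "salary": [55000, 72000, 72000, -10],
})

report = {
    "row_count": len(df),
    "null_counts": df.isna().sum().to_dict(),
    "duplicate_ids": int(df["employee_id"].duplicated().sum()),
    "salary_out_of_range": int((df["salary"] <= 0).sum()),
}
print(report)   # would feed into a Data Sanity Report / stakeholder summary
```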

Posted 1 month ago

Apply

8.0 - 11.0 years

35 - 37 Lacs

Kolkata, Ahmedabad, Bengaluru

Work from Office

Dear Candidate, We are hiring a Data Engineer to build and maintain data pipelines for our analytics platform. Perfect for engineers focused on data processing and scalability. Key Responsibilities: Design and implement ETL processes Manage data warehouses and ensure data quality Collaborate with data scientists to provide necessary data Optimize data workflows for performance Required Skills & Qualifications: Proficiency in SQL and Python Experience with data pipeline tools like Apache Airflow Familiarity with big data technologies (Spark, Hadoop) Bonus: Knowledge of cloud data services (AWS Redshift, Google BigQuery) Soft Skills: Strong troubleshooting and problem-solving skills. Ability to work independently and in a team. Excellent communication and documentation skills. Note: If interested, please share your updated resume and preferred time for a discussion. If shortlisted, our HR team will contact you. Kandi Srinivasa Delivery Manager Integra Technologies
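
As a simple illustration of the ETL work described above, the sketch below extracts from a CSV file, applies one cleaning rule, and loads the result into a SQL table. The file name, table name, and rule are assumptions; SQLite stands in for a real warehouse.

```python
# Toy ETL sketch: extract from CSV, apply a simple transform, load into a SQL table.
import sqlite3
import pandas as pd

def run_etl(csv_path: str = "orders.csv") -> int:
    orders = pd.read_csv(csv_path)                        # extract
    orders["order_date"] = pd.to_datetime(orders["order_date"])
    clean = orders.dropna(subset=["customer_id"])         # transform: drop orphan rows
    with sqlite3.connect("warehouse.db") as conn:         # load (SQLite stands in for the DW)
        clean.to_sql("fact_orders", conn, if_exists="append", index=False)
    return len(clean)

if __name__ == "__main__":
    print(f"loaded {run_etl()} rows")
```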

Posted 1 month ago

Apply

3.0 - 6.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Location: Bangalore. Experience: 3-7 Yrs. Notice Period: Immediate to 15 Days. Job Description: We are looking for an energetic, high-performing and highly skilled Quality Assurance Engineer to help shape our technology and product roadmap. You will be part of the fast-paced, entrepreneurial Enterprise Personalization portfolio focused on delivering the next generation of global marketing capabilities. This team is responsible for global campaign tracking of new account acquisition and bounty payments, and leverages transformational technologies such as SQL, Hadoop, Spark, PySpark, HDFS, MapReduce, Hive, HBase, Kafka & Java. Focus: Provides domain expertise to engineers on Automation, Testing and Quality Assurance (QA) methodologies and processes, crafts and executes test scripts, assists in preparation of test strategies, sets up and maintains test data & environments, and logs results. 3 - 6 years of hands-on software testing experience in developing test cases and test plans with extensive knowledge of automated testing and architecture. Expert knowledge of testing frameworks and test automation design patterns like TDD, BDD, etc. Expertise in developing software test cases for Hive, Spark, and SQL written in PySpark SQL and Scala. Hands-on experience with performance and load testing tools such as JMeter, pytest or similar. Experience with industry-standard tools for defect tracking, source code management, test case management, test automation, and other management and monitoring tools. Experience working with Agile methodology. Experience with a cloud platform (GCP). Experience in designing, developing, testing, debugging, and operating resilient distributed systems using Big Data clusters. Good sense for software quality, clean code principles, test-driven development and an agile mindset. High engagement, self-organization, strong communication skills and team spirit. Experience with building and adopting new test frameworks. Bonus skills: testing machine learning/data mining. Roles & Responsibilities: Responsible for testing and quality assurance of large data processing pipelines using PySpark and SQL. Develops and tests software, including ongoing refactoring of code, and drives continuous improvement in code structure and quality. Functions as a platform SME who drives quality and automation strategy at the application level, identifies new opportunities and drives Software Engineers to deliver the highest quality code. Delivers on capabilities for the portfolio automation strategy and executes against the test and automation strategy defined at the portfolio level. Works with engineers to drive improvements in code quality via manual and automated testing. Involved in the review of the user story backlog and requirements specifications for completeness and weaknesses in function, performance, reliability, scalability, testability, usability, and security and compliance testing, and provides recommendations. Plans and defines the testing approach, providing advice on prioritization of testing activity in support of identified risks in project schedules or test scenarios.
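
PySpark transformations of the kind this role tests are commonly exercised against a local SparkSession from pytest; a generic sketch follows. The transformation under test (a bounty-eligibility flag) is a made-up example, not the team's actual pipeline.

```python
# Generic sketch: unit-testing a PySpark transformation with pytest and a local session.
import pytest
from pyspark.sql import SparkSession, functions as F

def add_bounty_flag(df):
    """Toy transformation under test: flag campaigns eligible for a bounty payment."""
    return df.withColumn("bounty_eligible", F.col("new_accounts") >= 1)

@pytest.fixture(scope="session")
def spark():
    return SparkSession.builder.master("local[2]").appName("qa-tests").getOrCreate()

def test_add_bounty_flag(spark):
    df = spark.createDataFrame([("C1", 2), ("C2", 0)], ["campaign_id", "new_accounts"])
    result = {r["campaign_id"]: r["bounty_eligible"] for r in add_bounty_flag(df).collect()}
    assert result == {"C1": True, "C2": False}
```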

Posted 1 month ago

Apply

7.0 - 12.0 years

9 - 14 Lacs

Mumbai

Work from Office

The Senior Spark Tech Lead will be responsible for integrating and maintaining the Quantexa platform, a Spark-based software product provided by a UK fintech, into our existing systems to enhance our anti-money laundering capabilities. This role requires deep expertise in Spark development, as well as the ability to analyze and understand the underlying data. Additionally, the candidate should have an interest in exploring open-source applications distributed by Apache, Kubernetes, OpenSearch and Oracle, and should be able to work as a Scrum Master. Responsibilities - Direct Responsibilities: Integrate and upgrade the Quantexa tool with our existing systems for enhanced anti-money laundering measures. Develop and maintain Spark-based applications deployed on Kubernetes clusters. Conduct data analysis to understand and interpret underlying data structures. Collaborate with cross-functional teams to ensure seamless integration and functionality. Stay updated with the latest trends and best practices in Spark development and Kubernetes. Contributing Responsibilities: Take complete ownership of project activities and understand each task in detail. Ensure that the team delivers on time without delays and that deliveries meet high quality standards. Handle estimation, planning and scheduling of the project. Ensure all internal timelines are respected and the project is on track. Work with the team to develop robust software adhering to the timelines and following all standard guidelines. Act proactively to ensure smooth team operations and effective collaboration. Make sure the team adheres to all compliance processes and intervene if required. Assign tasks to the team and track them until completion. Report status proactively to management. Identify risks in the project and highlight them to the Manager; create contingency and backup plans as necessary; create mitigation plans. Make decisions independently based on the situation. Play the role of mentor and coach team members as and when required to meet target goals. Gain functional knowledge of the applications worked upon and create knowledge repositories for future reference. Arrange knowledge-sharing sessions to enhance the team's functional capability. Evaluate new tools and come up with POCs. Provide timely feedback on the team to upper management. Technical & Behavioral Competencies - Key Responsibilities: as listed under Direct Responsibilities above. Required Qualifications: 7+ years of experience in development. Extensive experience in Hadoop, Spark, and Scala development (5 years minimum). Strong analytical skills and experience in data analysis (SQL), data processing (such as ETL), parsing, data mapping and handling real-life data quality issues (see the profiling sketch following this posting). Excellent problem-solving abilities and attention to detail. Strong communication and collaboration skills. Experience in Agile development. High-quality coding skills, incl. code control, unit testing, design, and documentation (code, test). Experience with tools such as Sonar. Experience with Git and Jenkins.
Specific Qualifications (if required): Experience with development and deployment of Spark applications on Kubernetes clusters. Hands-on development experience (Java, Scala, etc.) via system integration projects; Python and Elastic optional. Skills Referential - Behavioural Skills: Ability to collaborate / teamwork; Adaptability; Creativity & innovation / problem solving; Attention to detail / rigor. Transversal Skills: Analytical ability; Ability to develop and adapt a process; Ability to develop and leverage networks. Education Level: Bachelor's Degree or equivalent. Experience Level: At least 7 years. Fluent in English. Team player. Strong analytical skills. Quality oriented and well organized. Willing to work under pressure and mission oriented. Excellent oral and written communication skills, motivational skills, results-oriented.
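
A common first step when handling the real-life data quality issues mentioned in this posting is profiling null counts per column; the generic PySpark sketch below shows one way to do it. The source path and columns are placeholders.

```python
# Generic sketch: per-column null counts for a quick data-quality profile.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("dq-profile").getOrCreate()
df = spark.read.parquet("/data/transactions")            # placeholder source

null_profile = df.select(
    [F.sum(F.col(c).isNull().cast("int")).alias(c) for c in df.columns]
)
null_profile.show()
```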

Posted 1 month ago

Apply

6.0 - 10.0 years

13 - 17 Lacs

Bengaluru

Work from Office

About Persistent: We are a trusted Digital Engineering and Enterprise Modernization partner, combining deep technical expertise and industry experience to help our clients anticipate what's next. Our offerings and proven solutions create a unique competitive advantage for our clients by giving them the power to see beyond and rise above. We work with many industry-leading organizations across the world, including 12 of the 30 most innovative US companies, 80% of the largest banks in the US and India, and numerous innovators across the healthcare ecosystem. Our growth trajectory continues, as we reported $1,231M annual revenue (16% Y-o-Y). Along with our growth, we've onboarded over 4900 new employees in the past year, bringing our total employee count to over 23,500 people located in 19 countries across the globe. Persistent Ltd. is dedicated to fostering diversity and inclusion in the workplace. We invite applications from all qualified individuals, including those with disabilities, and regardless of gender or gender preference. We welcome diverse candidates from all backgrounds. For more details please log in to www.persistent.com. About The Position: We are looking for a Big Data Lead who will be responsible for the management of data sets that are too big for traditional database systems to handle. You will create, design, and implement data processing jobs in order to transform the data into a more usable format. You will also ensure that the data is secure and complies with industry standards to protect the company's information. What You'll Do: Manage customer priorities across projects and requests. Assess customer needs using a structured requirements process (gathering, analyzing, documenting, and managing changes) to prioritize immediate business needs and advise on options, risks and cost. Design and implement software products (Big Data related), including data models and visualizations. Demonstrate participation with the teams you work in. Deliver good solutions against tight timescales. Be proactive, suggest new approaches and develop your capabilities. Share what you are good at while learning from others to improve the team overall. Show that you have a certain level of understanding of a number of technical skills, attitudes and behaviors. Deliver great solutions. Be focused on driving value back into the business. Expertise You'll Bring: 6 years' experience in designing and developing enterprise application solutions for distributed systems. Understanding of Big Data Hadoop ecosystem components (Sqoop, Hive, Pig, Flume). Additional experience working with Hadoop, HDFS, cluster management, Hive, Pig and MapReduce (a tiny MapReduce example follows this posting), and Hadoop ecosystem frameworks such as HBase, Talend and NoSQL databases. Apache Spark or other streaming Big Data processing is preferred; Java or other Big Data technologies will be a plus. Benefits: Competitive salary and benefits package. Culture focused on talent development with quarterly promotion cycles and company-sponsored higher education and certifications. Opportunity to work with cutting-edge technologies. Employee engagement initiatives such as project parties, flexible work hours, and Long Service awards. Annual health check-ups. Insurance coverage: group term life, personal accident, and Mediclaim hospitalization for self, spouse, two children, and parents. Inclusive Environment: • We offer hybrid work options and flexible working hours to accommodate various needs and preferences.
•Our office is equipped with accessible facilities, including adjustable workstations, ergonomic chairs, and assistive technologies to support employees with physical disabilities. Let's unleash your full potential. See Beyond, Rise Above
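As a rough illustration of the kind of data processing job this role describes (transforming a large Hive-backed data set into a more usable format), a minimal PySpark sketch follows. The database, table, column names, and output path are hypothetical, not details from the posting.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Minimal sketch of a Hive-backed transformation job; all names are placeholders.
spark = (
    SparkSession.builder
    .appName("big-data-lead-sketch")
    .enableHiveSupport()
    .getOrCreate()
)

raw = spark.table("staging.transactions")   # hypothetical Hive table

daily_totals = (
    raw.filter(F.col("status") == "COMPLETED")
       .groupBy("account_id", F.to_date("created_at").alias("txn_date"))
       .agg(F.sum("amount").alias("total_amount"), F.count("*").alias("txn_count"))
)

# Persist the curated output in a columnar format, partitioned by date.
daily_totals.write.mode("overwrite").partitionBy("txn_date").parquet("/warehouse/curated/daily_totals")

spark.stop()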

Posted 1 month ago

Apply

4.0 - 8.0 years

8 - 12 Lacs

Pune

Work from Office

About Persistent
We are a trusted Digital Engineering and Enterprise Modernization partner, combining deep technical expertise and industry experience to help our clients anticipate what's next. Our offerings and proven solutions create a unique competitive advantage for our clients by giving them the power to see beyond and rise above. We work with many industry-leading organizations across the world, including 12 of the 30 most innovative US companies, 80% of the largest banks in the US and India, and numerous innovators across the healthcare ecosystem. Our growth trajectory continues, as we reported $1,231M annual revenue (16% Y-o-Y). Along with our growth, we've onboarded over 4,900 new employees in the past year, bringing our total employee count to more than 23,500 people located in 19 countries across the globe. Persistent Ltd. is dedicated to fostering diversity and inclusion in the workplace. We invite applications from all qualified individuals, including those with disabilities, and regardless of gender or gender preference. We welcome diverse candidates from all backgrounds. For more details, please visit www.persistent.com
About The Position
We are looking for a Full Stack Developer who is a skilled professional with exposure to cloud platforms, DevOps, and data visualization, preferably with networking domain (Cisco) product exposure. You will collaborate with internal teams to design, develop, deploy, and maintain software applications at scale. To succeed as a Full Stack Developer, you will be required to ensure the timely completion and approval of project deliverables. You will also be expected to recommend new technologies and techniques for application development.
What You'll Do
Design, develop, deploy, and maintain software applications at scale using Java / J2EE, JavaScript frameworks (Angular or React), and associated technologies
Deploy software using CI/CD tools such as Jenkins
Understand the technologies implemented, and interface with the project manager on status and technical issues
Solve and articulate simple and complex problems with application design, development, and user experiences
Collaborate with other developers and designers, and assist with technical matters when required
Expertise You'll Bring
Proven ability in Java / J2EE, Spring, Python, Docker, Kubernetes, and Microservices
Knowledge of the following distributed technologies will give you an added advantage: Apache Spark; MapReduce principles; Kafka (MSK); Apache Hadoop (AWS EMR)
Benefits
Competitive salary and benefits package
Culture focused on talent development with quarterly promotion cycles and company-sponsored higher education and certifications
Opportunity to work with cutting-edge technologies
Employee engagement initiatives such as project parties, flexible work hours, and Long Service awards
Annual health check-ups
Insurance coverage: group term life, personal accident, and Mediclaim hospitalization for self, spouse, two children, and parents
Inclusive Environment
We offer hybrid work options and flexible working hours to accommodate various needs and preferences.
Our office is equipped with accessible facilities, including adjustable workstations, ergonomic chairs, and assistive technologies to support employees with physical disabilities.
Let's unleash your full potential. See Beyond, Rise Above.
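For the distributed technologies listed as an added advantage above (Spark and Kafka/MSK), here is a minimal Spark Structured Streaming sketch. It assumes the Spark-Kafka connector package is available on the classpath; the broker address and topic name are placeholders, not values from the listing.

from pyspark.sql import SparkSession
from pyspark.sql.functions import col

# Minimal sketch: read a Kafka topic as a structured stream and echo it to the console.
spark = SparkSession.builder.appName("kafka-stream-sketch").getOrCreate()

events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "b-1.example.kafka.amazonaws.com:9092")  # placeholder broker
    .option("subscribe", "orders")                                               # placeholder topic
    .option("startingOffsets", "latest")
    .load()
    .select(col("key").cast("string"), col("value").cast("string"), col("timestamp"))
)

# Console sink for inspection; a real job would write to a durable sink such as Parquet on S3.
query = events.writeStream.format("console").outputMode("append").start()
query.awaitTermination()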

Posted 1 month ago

Apply

4.0 - 8.0 years

8 - 12 Lacs

Hyderabad

Work from Office

About Persistent
We are a trusted Digital Engineering and Enterprise Modernization partner, combining deep technical expertise and industry experience to help our clients anticipate what's next. Our offerings and proven solutions create a unique competitive advantage for our clients by giving them the power to see beyond and rise above. We work with many industry-leading organizations across the world, including 12 of the 30 most innovative US companies, 80% of the largest banks in the US and India, and numerous innovators across the healthcare ecosystem. Our growth trajectory continues, as we reported $1,231M annual revenue (16% Y-o-Y). Along with our growth, we've onboarded over 4,900 new employees in the past year, bringing our total employee count to more than 23,500 people located in 19 countries across the globe. Persistent Ltd. is dedicated to fostering diversity and inclusion in the workplace. We invite applications from all qualified individuals, including those with disabilities, and regardless of gender or gender preference. We welcome diverse candidates from all backgrounds. For more details, please visit www.persistent.com
About The Position
We are looking for a Full Stack Developer who is a skilled professional with exposure to cloud platforms, DevOps, and data visualization, preferably with networking domain (Cisco) product exposure. You will collaborate with internal teams to design, develop, deploy, and maintain software applications at scale. To succeed as a Full Stack Developer, you will be required to ensure the timely completion and approval of project deliverables. You will also be expected to recommend new technologies and techniques for application development.
What You'll Do
Design, develop, deploy, and maintain software applications at scale using Java / J2EE, JavaScript frameworks (Angular or React), and associated technologies
Deploy software using CI/CD tools such as Jenkins
Understand the technologies implemented, and interface with the project manager on status and technical issues
Solve and articulate simple and complex problems with application design, development, and user experiences
Collaborate with other developers and designers, and assist with technical matters when required
Expertise You'll Bring
Proven ability in Java / J2EE, Spring, Python, Docker, Kubernetes, and Microservices
Knowledge of the following distributed technologies will give you an added advantage: Apache Spark; MapReduce principles; Kafka (MSK); Apache Hadoop (AWS EMR)
Benefits
Competitive salary and benefits package
Culture focused on talent development with quarterly promotion cycles and company-sponsored higher education and certifications
Opportunity to work with cutting-edge technologies
Employee engagement initiatives such as project parties, flexible work hours, and Long Service awards
Annual health check-ups
Insurance coverage: group term life, personal accident, and Mediclaim hospitalization for self, spouse, two children, and parents
Inclusive Environment
We offer hybrid work options and flexible working hours to accommodate various needs and preferences.
Our office is equipped with accessible facilities, including adjustable workstations, ergonomic chairs, and assistive technologies to support employees with physical disabilities.
Let's unleash your full potential. See Beyond, Rise Above.

Posted 1 month ago

Apply

6.0 - 10.0 years

13 - 17 Lacs

Pune

Work from Office

About Persistent
We are a trusted Digital Engineering and Enterprise Modernization partner, combining deep technical expertise and industry experience to help our clients anticipate what's next. Our offerings and proven solutions create a unique competitive advantage for our clients by giving them the power to see beyond and rise above. We work with many industry-leading organizations across the world, including 12 of the 30 most innovative US companies, 80% of the largest banks in the US and India, and numerous innovators across the healthcare ecosystem. Our growth trajectory continues, as we reported $1,231M annual revenue (16% Y-o-Y). Along with our growth, we've onboarded over 4,900 new employees in the past year, bringing our total employee count to more than 23,500 people located in 19 countries across the globe. Persistent Ltd. is dedicated to fostering diversity and inclusion in the workplace. We invite applications from all qualified individuals, including those with disabilities, and regardless of gender or gender preference. We welcome diverse candidates from all backgrounds. For more details, please visit www.persistent.com
About The Position
We are looking for a Big Data Lead who will be responsible for the management of data sets that are too big for traditional database systems to handle. You will create, design, and implement data processing jobs to transform the data into a more usable format. You will also ensure that the data is secure and complies with industry standards to protect the company's information.
What You'll Do
Manage the customer's priorities across projects and requests
Assess customer needs using a structured requirements process (gathering, analyzing, documenting, and managing changes) to prioritize immediate business needs and advise on options, risks, and cost
Design and implement software products (Big Data related), including data models and visualizations
Participate actively in the teams you work in
Deliver good solutions against tight timescales
Be proactive, suggest new approaches, and develop your capabilities
Share what you are good at while learning from others to improve the team overall
Demonstrate a solid understanding of a range of technical skills, attitudes, and behaviors
Deliver great solutions
Stay focused on driving value back into the business
Expertise You'll Bring
6 years' experience in designing and developing enterprise application solutions for distributed systems
Understanding of Big Data Hadoop ecosystem components (Sqoop, Hive, Pig, Flume)
Additional experience with Hadoop, HDFS, and cluster management; Hive, Pig, and MapReduce; and Hadoop ecosystem frameworks such as HBase, Talend, and NoSQL databases
Apache Spark or other streaming Big Data processing is preferred; Java or other Big Data technologies will be a plus
Benefits
Competitive salary and benefits package
Culture focused on talent development with quarterly promotion cycles and company-sponsored higher education and certifications
Opportunity to work with cutting-edge technologies
Employee engagement initiatives such as project parties, flexible work hours, and Long Service awards
Annual health check-ups
Insurance coverage: group term life, personal accident, and Mediclaim hospitalization for self, spouse, two children, and parents
Inclusive Environment
We offer hybrid work options and flexible working hours to accommodate various needs and preferences.
Our office is equipped with accessible facilities, including adjustable workstations, ergonomic chairs, and assistive technologies to support employees with physical disabilities.
Let's unleash your full potential. See Beyond, Rise Above.
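As a small illustration of the MapReduce principles this role mentions, expressed with Spark's RDD API rather than classic Hadoop MapReduce, here is a hedged word-count sketch; the HDFS path is a placeholder.

from pyspark.sql import SparkSession

# Minimal sketch of the classic map/shuffle/reduce pattern using Spark's RDD API.
spark = SparkSession.builder.appName("mapreduce-pattern-sketch").getOrCreate()
sc = spark.sparkContext

counts = (
    sc.textFile("hdfs:///data/raw/logs/*.txt")      # placeholder HDFS path
      .flatMap(lambda line: line.split())            # map: emit one record per word
      .map(lambda word: (word, 1))                   # map: key/value pairs
      .reduceByKey(lambda a, b: a + b)               # reduce: sum counts per key
)

# Print the ten most frequent words.
for word, count in counts.takeOrdered(10, key=lambda kv: -kv[1]):
    print(word, count)

spark.stop()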

Posted 1 month ago

Apply

6.0 - 10.0 years

13 - 17 Lacs

Hyderabad

Work from Office

About Persistent
We are a trusted Digital Engineering and Enterprise Modernization partner, combining deep technical expertise and industry experience to help our clients anticipate what's next. Our offerings and proven solutions create a unique competitive advantage for our clients by giving them the power to see beyond and rise above. We work with many industry-leading organizations across the world, including 12 of the 30 most innovative US companies, 80% of the largest banks in the US and India, and numerous innovators across the healthcare ecosystem. Our growth trajectory continues, as we reported $1,231M annual revenue (16% Y-o-Y). Along with our growth, we've onboarded over 4,900 new employees in the past year, bringing our total employee count to more than 23,500 people located in 19 countries across the globe. Persistent Ltd. is dedicated to fostering diversity and inclusion in the workplace. We invite applications from all qualified individuals, including those with disabilities, and regardless of gender or gender preference. We welcome diverse candidates from all backgrounds. For more details, please visit www.persistent.com
About The Position
We are looking for a Big Data Lead who will be responsible for the management of data sets that are too big for traditional database systems to handle. You will create, design, and implement data processing jobs to transform the data into a more usable format. You will also ensure that the data is secure and complies with industry standards to protect the company's information.
What You'll Do
Manage the customer's priorities across projects and requests
Assess customer needs using a structured requirements process (gathering, analyzing, documenting, and managing changes) to prioritize immediate business needs and advise on options, risks, and cost
Design and implement software products (Big Data related), including data models and visualizations
Participate actively in the teams you work in
Deliver good solutions against tight timescales
Be proactive, suggest new approaches, and develop your capabilities
Share what you are good at while learning from others to improve the team overall
Demonstrate a solid understanding of a range of technical skills, attitudes, and behaviors
Deliver great solutions
Stay focused on driving value back into the business
Expertise You'll Bring
6 years' experience in designing and developing enterprise application solutions for distributed systems
Understanding of Big Data Hadoop ecosystem components (Sqoop, Hive, Pig, Flume)
Additional experience with Hadoop, HDFS, and cluster management; Hive, Pig, and MapReduce; and Hadoop ecosystem frameworks such as HBase, Talend, and NoSQL databases
Apache Spark or other streaming Big Data processing is preferred; Java or other Big Data technologies will be a plus
Benefits
Competitive salary and benefits package
Culture focused on talent development with quarterly promotion cycles and company-sponsored higher education and certifications
Opportunity to work with cutting-edge technologies
Employee engagement initiatives such as project parties, flexible work hours, and Long Service awards
Annual health check-ups
Insurance coverage: group term life, personal accident, and Mediclaim hospitalization for self, spouse, two children, and parents
Inclusive Environment
We offer hybrid work options and flexible working hours to accommodate various needs and preferences.
Our office is equipped with accessible facilities, including adjustable workstations, ergonomic chairs, and assistive technologies to support employees with physical disabilities.
Let's unleash your full potential. See Beyond, Rise Above.

Posted 1 month ago

Apply

4.0 - 8.0 years

8 - 12 Lacs

Bengaluru

Work from Office

About Persistent
We are a trusted Digital Engineering and Enterprise Modernization partner, combining deep technical expertise and industry experience to help our clients anticipate what's next. Our offerings and proven solutions create a unique competitive advantage for our clients by giving them the power to see beyond and rise above. We work with many industry-leading organizations across the world, including 12 of the 30 most innovative US companies, 80% of the largest banks in the US and India, and numerous innovators across the healthcare ecosystem. Our growth trajectory continues, as we reported $1,231M annual revenue (16% Y-o-Y). Along with our growth, we've onboarded over 4,900 new employees in the past year, bringing our total employee count to more than 23,500 people located in 19 countries across the globe. Persistent Ltd. is dedicated to fostering diversity and inclusion in the workplace. We invite applications from all qualified individuals, including those with disabilities, and regardless of gender or gender preference. We welcome diverse candidates from all backgrounds. For more details, please visit www.persistent.com
About The Position
We are looking for a Full Stack Developer who is a skilled professional with exposure to cloud platforms, DevOps, and data visualization, preferably with networking domain (Cisco) product exposure. You will collaborate with internal teams to design, develop, deploy, and maintain software applications at scale. To succeed as a Full Stack Developer, you will be required to ensure the timely completion and approval of project deliverables. You will also be expected to recommend new technologies and techniques for application development.
What You'll Do
Design, develop, deploy, and maintain software applications at scale using Java / J2EE, JavaScript frameworks (Angular or React), and associated technologies
Deploy software using CI/CD tools such as Jenkins
Understand the technologies implemented, and interface with the project manager on status and technical issues
Solve and articulate simple and complex problems with application design, development, and user experiences
Collaborate with other developers and designers, and assist with technical matters when required
Expertise You'll Bring
Proven ability in Java / J2EE, Spring, Python, Docker, Kubernetes, and Microservices
Knowledge of the following distributed technologies will give you an added advantage: Apache Spark; MapReduce principles; Kafka (MSK); Apache Hadoop (AWS EMR)
Benefits
Competitive salary and benefits package
Culture focused on talent development with quarterly promotion cycles and company-sponsored higher education and certifications
Opportunity to work with cutting-edge technologies
Employee engagement initiatives such as project parties, flexible work hours, and Long Service awards
Annual health check-ups
Insurance coverage: group term life, personal accident, and Mediclaim hospitalization for self, spouse, two children, and parents
Inclusive Environment
We offer hybrid work options and flexible working hours to accommodate various needs and preferences.
Our office is equipped with accessible facilities, including adjustable workstations, ergonomic chairs, and assistive technologies to support employees with physical disabilities.
Let's unleash your full potential. See Beyond, Rise Above.

Posted 1 month ago

Apply

8.0 years

0 Lacs

Bengaluru, Karnataka

Remote

Senior Software Engineer
Bangalore, Karnataka, India
Date posted: Jun 23, 2025 | Job number: 1822030 | Work site: Up to 50% work from home | Travel: 0-25% | Role type: Individual Contributor | Profession: Software Engineering | Discipline: Software Engineering | Employment type: Full-Time
Overview
Come join us in the Azure Core Economics team, explore your passions, and impact the world! If you join the Azure Core Economics team, you will join a group of economists, data scientists, and software engineers within Azure Engineering that tackles a variety of data-intensive, distributed computing challenges related to cloud economics that are of critical importance to Microsoft. Our team collaborates with a wide range of other teams at Microsoft on problems such as pricing optimization, demand estimation and forecasting, capacity management and planning, fraud detection, virtual machine bin packing, and others. Our software engineering team in India aids in the development of novel software products, improvements, and research incubations that are aligned to this core mission. As a Senior Software Engineer on the Azure Core Economics team, you will play a critical role in developing software for a variety of economics-related products in cloud computing that leverage large amounts of data to drive improvements to the Azure platform and its customers. This opportunity will allow you to gain experience working on cutting-edge problems at the intersection of applied economics, engineering, and data science, develop deep expertise in the economics of the cloud business, and become increasingly adept at developing engineering solutions for data-intensive products. The team is supportive of flexible work, and candidates may work from home up to 50% of the time. Microsoft's mission is to empower every person and every organization on the planet to achieve more. As employees we come together with a growth mindset, innovate to empower others, and collaborate to realize our shared goals. Each day we build on our values of respect, integrity, and accountability to create a culture of inclusion where everyone can thrive at work and beyond.
Qualifications
Required Qualifications: Bachelor's Degree in Computer Science or a related technical field AND 8+ years of technical engineering experience with coding in languages including, but not limited to, C, C++, C#, Java, JavaScript, or Python; OR Master's Degree in Computer Science or a related technical field AND 6+ years of technical engineering experience with coding in languages including, but not limited to, C, C++, C#, Java, JavaScript, or Python; OR equivalent experience.
Other Requirements: The ability to meet Microsoft, customer, and/or government security screening requirements is required for this role. These requirements include, but are not limited to, the following specialized security screening: Microsoft Cloud Background Check. This position will be required to pass the Microsoft Cloud Background Check upon hire/transfer and every two years thereafter.
Preferred Qualifications: Bachelor's Degree in Computer Science or a related technical field AND 8+ years of technical engineering experience with coding in languages including, but not limited to, C, C++, C#, Java, JavaScript, or Python; OR Master's Degree in Computer Science or a related technical field AND 6+ years of technical engineering experience with coding in languages including, but not limited to, C, C++, C#, Java, JavaScript, or Python; OR equivalent experience.
1+ years of experience in a technical leadership role
1+ years of relational database and SQL skills with working knowledge of data warehousing concepts, including technical architectures, infrastructure components, ETL/ELT, and reporting/analytic tools and environments (such as Cosmos DB, Apache Beam, Hadoop, Spark, Pig, Hive, MapReduce, and Flume)
1+ years of experience working within a cloud environment such as Microsoft Azure, Amazon Web Services, or Google Cloud Platform, including experience applying software development skills to economics-oriented and data-intensive products and problems, and experience deploying statistical or machine learning models at scale
Responsibilities
Leads by example within the team by producing extensible and maintainable code and applying metrics to drive the quality and stability of code
Drives efforts to ensure the correct processes are followed to achieve a high degree of security, privacy, safety, and accessibility
Drives identification of dependencies and the development of design documents for products, applications, services, and platforms
Identifies other teams and technologies that will be leveraged and how they will interact, and acts as a key contact for leadership to ensure alignment with partners' expectations
Leverages subject-matter expertise of product features and partners with appropriate stakeholders (e.g., economists, data scientists, engineers, researchers, program managers, and business planners) to drive project plans, release plans, and work items
Organizes work into smaller sets of tasks as part of an overall roadmap
Drives the refinement of products through data analytics and makes informed decisions in engineering products through data integration
Benefits/perks listed below may vary depending on the nature of your employment with Microsoft and the country where you work: industry-leading healthcare, educational resources, discounts on products and services, savings and investments, maternity and paternity leave, generous time away, giving programs, and opportunities to network and connect.
Microsoft is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to age, ancestry, citizenship, color, family or medical care leave, gender identity or expression, genetic information, immigration status, marital status, medical condition, national origin, physical or mental disability, political affiliation, protected veteran or military status, race, ethnicity, religion, sex (including pregnancy), sexual orientation, or any other characteristic protected by applicable local laws, regulations, and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application process, read more about requesting accommodations.
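As a loose illustration of "deploying statistical or machine learning models at scale" in a Spark environment (not Microsoft's actual stack or method), here is a minimal pandas UDF batch-scoring sketch. It assumes PySpark 3.x with pyarrow installed; the table name, columns, and coefficient values are stand-ins invented for the example.

import pandas as pd
from pyspark.sql import SparkSession
from pyspark.sql.functions import pandas_udf
from pyspark.sql.types import DoubleType

spark = SparkSession.builder.appName("batch-scoring-sketch").getOrCreate()

# Hypothetical parameters of a pre-fitted linear demand model.
COEF_PRICE, COEF_USAGE, INTERCEPT = -0.8, 0.3, 5.0

@pandas_udf(DoubleType())
def predicted_demand(price: pd.Series, usage: pd.Series) -> pd.Series:
    # Vectorized scoring: executes once per Arrow batch on each executor.
    return INTERCEPT + COEF_PRICE * price + COEF_USAGE * usage

scored = (
    spark.table("analytics.vm_usage")   # hypothetical input table
         .withColumn("demand_score", predicted_demand("price", "usage_hours"))
)
scored.write.mode("overwrite").parquet("/data/scored/vm_demand")   # hypothetical output path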

Posted 1 month ago

Apply

5.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Company Description
👋🏼 We're Nagarro. We are a Digital Product Engineering company that is scaling in a big way! We build products, services, and experiences that inspire, excite, and delight. We work at scale — across all devices and digital mediums, and our people exist everywhere in the world (17500+ experts across 39 countries, to be exact). Our work culture is dynamic and non-hierarchical. We're looking for great new colleagues. That's where you come in!
Job Description
REQUIREMENTS:
Total experience: 5+ years.
Hands-on experience in Data Engineering, Data Lakes, Data Mesh, or Data Warehousing/ETL environments.
Strong working knowledge of Python, SQL, Airflow, and PySpark.
Hands-on experience implementing projects applying SDLC practices.
Hands-on experience building data pipelines and data frameworks for unit testing, data lineage tracking, and automation.
Experience building and maintaining a cloud system.
Familiarity with databases like DB2 and Teradata.
Strong working knowledge of Apache Spark, Apache Kafka, Hadoop, and MapReduce.
Strong troubleshooting skills and the ability to design for scalability and flexibility.
Expertise in Spanner for high-availability, scalable database solutions.
Knowledge of data governance and security practices in cloud-based environments.
Problem-solving mindset with the ability to tackle complex data engineering challenges.
Familiarity with containerization technologies (Docker/Kubernetes).
Excellent communication and collaboration skills.
RESPONSIBILITIES:
Writing and reviewing great quality code.
Understanding the client's business use cases and technical requirements and converting them into a technical design that elegantly meets the requirements.
Mapping decisions with requirements and being able to translate the same to developers.
Identifying different solutions and narrowing down the best option that meets the client's requirements.
Defining guidelines and benchmarks for NFR considerations during project implementation.
Writing and reviewing design documents explaining the overall architecture, framework, and high-level design of the application for the developers.
Reviewing architecture and design on aspects like extensibility, scalability, security, design patterns, user experience, and NFRs, and ensuring that all relevant best practices are followed.
Developing and designing the overall solution for defined functional and non-functional requirements, and defining technologies, patterns, and frameworks to materialize it.
Understanding and relating technology integration scenarios and applying these learnings in projects.
Resolving issues raised during code review through exhaustive, systematic analysis of the root cause, and being able to justify the decisions taken.
Carrying out POCs to make sure that the suggested design/technologies meet the requirements.
Qualifications
Bachelor's or master's degree in Computer Science, Information Technology, or a related field.
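To illustrate the Python/Airflow/pipeline skills this role asks for, here is a minimal Airflow 2.x DAG sketch wiring extract, transform, and load steps. The DAG id, schedule, and task bodies are illustrative stubs, not anything from the posting.

from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract(**context):
    # In a real pipeline this would pull from a source system (DB2, Teradata, files, ...).
    return [{"id": 1, "amount": 10.0}, {"id": 2, "amount": 20.0}]

def transform(ti, **context):
    rows = ti.xcom_pull(task_ids="extract")
    return [{**r, "amount_doubled": r["amount"] * 2} for r in rows]  # placeholder business rule

def load(ti, **context):
    rows = ti.xcom_pull(task_ids="transform")
    print(f"Would write {len(rows)} rows to the target table")       # placeholder for a real sink

with DAG(
    dag_id="daily_sales_pipeline",       # hypothetical name
    start_date=datetime(2025, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    t1 = PythonOperator(task_id="extract", python_callable=extract)
    t2 = PythonOperator(task_id="transform", python_callable=transform)
    t3 = PythonOperator(task_id="load", python_callable=load)
    t1 >> t2 >> t3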

Posted 1 month ago

Apply

5.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Company Description
👋🏼 We're Nagarro. We are a Digital Product Engineering company that is scaling in a big way! We build products, services, and experiences that inspire, excite, and delight. We work at scale — across all devices and digital mediums, and our people exist everywhere in the world (17500+ experts across 39 countries, to be exact). Our work culture is dynamic and non-hierarchical. We're looking for great new colleagues. That's where you come in!
Job Description
REQUIREMENTS:
Total experience: 5+ years.
Excellent knowledge of, and experience in, Big Data engineering.
Strong hands-on experience with Apache Spark and Python.
Proficiency in GCP Pub/Sub, Hadoop/MapReduce, Hive, and data transformation tools.
Experience with relational databases (e.g., PostgreSQL) and NoSQL or streaming technologies (e.g., MongoDB, Kafka).
Solid understanding of SQL-like query languages: SQL, HQL, MQL, etc.
Hands-on experience building data pipelines for ETL/ELT processes.
Proficiency with CI/CD tools and version control systems like Git.
Familiarity with Agile methodologies (Scrum, Kanban).
Solid understanding of distributed computing, parallel processing, and Big Data best practices.
Strong problem-solving and debugging skills.
Experience working in Agile/Scrum environments.
Familiarity with data modeling, data warehousing, and building distributed systems.
Expertise in Spanner for high-availability, scalable database solutions.
Knowledge of data governance and security practices in cloud-based environments.
Problem-solving mindset with the ability to tackle complex data engineering challenges.
Strong communication and teamwork skills, with the ability to mentor and collaborate effectively.
RESPONSIBILITIES:
Writing and reviewing great quality code.
Understanding the client's business use cases and technical requirements and converting them into a technical design that elegantly meets the requirements.
Mapping decisions with requirements and being able to translate the same to developers.
Identifying different solutions and narrowing down the best option that meets the client's requirements.
Defining guidelines and benchmarks for NFR considerations during project implementation.
Writing and reviewing design documents explaining the overall architecture, framework, and high-level design of the application for the developers.
Reviewing architecture and design on aspects like extensibility, scalability, security, design patterns, user experience, and NFRs, and ensuring that all relevant best practices are followed.
Developing and designing the overall solution for defined functional and non-functional requirements, and defining technologies, patterns, and frameworks to materialize it.
Understanding and relating technology integration scenarios and applying these learnings in projects.
Resolving issues raised during code review through exhaustive, systematic analysis of the root cause, and being able to justify the decisions taken.
Carrying out POCs to make sure that the suggested design/technologies meet the requirements.
Qualifications
Bachelor's or master's degree in Computer Science, Information Technology, or a related field.
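Since this role calls out GCP Pub/Sub, here is a minimal publishing sketch using the google-cloud-pubsub client. It assumes Application Default Credentials are configured; the project and topic IDs are placeholders, not values from the listing.

import json
from google.cloud import pubsub_v1

project_id = "example-project"   # placeholder project
topic_id = "raw-events"          # placeholder topic

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path(project_id, topic_id)

# Publish a small JSON payload; publish() returns a future that resolves to the message ID.
payload = json.dumps({"order_id": 42, "amount": 19.99}).encode("utf-8")
future = publisher.publish(topic_path, payload, source="sketch")   # attributes are optional string key/values
print("Published message:", future.result())

A matching subscriber would typically use pubsub_v1.SubscriberClient with a streaming pull callback, acknowledging each message after it is handed off to the downstream pipeline.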

Posted 1 month ago

Apply

5.0 years

0 Lacs

Bengaluru

On-site

Company Description
We're Nagarro. We are a Digital Product Engineering company that is scaling in a big way! We build products, services, and experiences that inspire, excite, and delight. We work at scale — across all devices and digital mediums, and our people exist everywhere in the world (17500+ experts across 39 countries, to be exact). Our work culture is dynamic and non-hierarchical. We're looking for great new colleagues. That's where you come in!
Job Description
REQUIREMENTS:
Total experience: 5+ years.
Excellent knowledge of, and experience in, Big Data engineering.
Strong hands-on experience with Apache Spark and Python.
Proficiency in GCP Pub/Sub, Hadoop/MapReduce, Hive, and data transformation tools.
Experience with relational databases (e.g., PostgreSQL) and NoSQL or streaming technologies (e.g., MongoDB, Kafka).
Solid understanding of SQL-like query languages: SQL, HQL, MQL, etc.
Hands-on experience building data pipelines for ETL/ELT processes.
Proficiency with CI/CD tools and version control systems like Git.
Familiarity with Agile methodologies (Scrum, Kanban).
Solid understanding of distributed computing, parallel processing, and Big Data best practices.
Strong problem-solving and debugging skills.
Experience working in Agile/Scrum environments.
Familiarity with data modeling, data warehousing, and building distributed systems.
Expertise in Spanner for high-availability, scalable database solutions.
Knowledge of data governance and security practices in cloud-based environments.
Problem-solving mindset with the ability to tackle complex data engineering challenges.
Strong communication and teamwork skills, with the ability to mentor and collaborate effectively.
RESPONSIBILITIES:
Writing and reviewing great quality code.
Understanding the client's business use cases and technical requirements and converting them into a technical design that elegantly meets the requirements.
Mapping decisions with requirements and being able to translate the same to developers.
Identifying different solutions and narrowing down the best option that meets the client's requirements.
Defining guidelines and benchmarks for NFR considerations during project implementation.
Writing and reviewing design documents explaining the overall architecture, framework, and high-level design of the application for the developers.
Reviewing architecture and design on aspects like extensibility, scalability, security, design patterns, user experience, and NFRs, and ensuring that all relevant best practices are followed.
Developing and designing the overall solution for defined functional and non-functional requirements, and defining technologies, patterns, and frameworks to materialize it.
Understanding and relating technology integration scenarios and applying these learnings in projects.
Resolving issues raised during code review through exhaustive, systematic analysis of the root cause, and being able to justify the decisions taken.
Carrying out POCs to make sure that the suggested design/technologies meet the requirements.
Qualifications
Bachelor's or master's degree in Computer Science, Information Technology, or a related field.

Posted 1 month ago

Apply