
84 Bigdata Jobs

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

6.0 - 11.0 years

7 - 11 Lacs

chennai, bengaluru

Work from Office

Job Title: Developer
Work Location: Chennai / Bangalore
Skills Required: Digital: Big Data and Hadoop Ecosystems; Digital: Python; Digital: PySpark
Experience Range in Required Skills: 5-7 years

Specific activities required:
- Lead the implementation of infrastructure as code and provide strategic advice and recommendations for the development and advancement of Microsoft Azure technologies, based on research into public cloud trends.
- Integrate and automate the delivery of standardised Azure deployments using orchestration products such as Azure DevOps with Terraform, Azure ARM templates, and other modern deployment technologies.
- Act as the escalation point for level-three Azure issues, providing technical support and fault resolution as well as guidance and mentoring for operational run teams, both locally and remotely throughout the organization.
- Ensure business requirements are properly gathered and translated into appropriate solutions.
- Maintain and deliver all documentation for the design, development, build, and deployment methods used, ensuring all applicable code is stored and managed under source control.
- Provide guidance and assistance to all support teams.
- Provide complementary support and leadership in hardening and security testing.

Key competencies:
- Tertiary qualifications in a relevant discipline, with relevant Microsoft Azure certifications.
- Experience as a data engineer on Azure Cloud.
- Good knowledge of PySpark and Azure Data Factory (see the PySpark sketch below).
- Comprehensive knowledge of public cloud environments and industry trends.
- Significant experience supporting, designing, and developing public cloud solutions via infrastructure as code, including Terraform and ARM.
- Extensive DevOps experience.
- The ability to communicate effectively and work collaboratively with diverse team members.
- Demonstrated experience in security hardening and testing.
- Proven ability to create and update accurate documentation.
- Excellent verbal and written communication skills.
- Willingness and flexibility to work outside standard office hours, and on weekends as required.
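As an illustration of the PySpark-on-Azure work this listing names, here is a minimal sketch of a lake ingestion job. The storage account, containers, and column names are hypothetical assumptions, not details from the posting, and the cluster is assumed to be configured for ADLS Gen2 access.

```python
from pyspark.sql import SparkSession, functions as F

# Minimal sketch only: storage account, containers, and columns are
# illustrative placeholders, not details from the job posting.
spark = SparkSession.builder.appName("azure-ingest-sketch").getOrCreate()

# Read raw events from an ADLS Gen2 container (abfss:// is the ADLS Gen2 URI scheme).
raw = spark.read.parquet("abfss://raw@examplestore.dfs.core.windows.net/events/")

# Typical cleanse-and-aggregate step: drop malformed rows, count events per day/type.
daily = (
    raw.filter(F.col("event_type").isNotNull())
       .groupBy(F.to_date("event_ts").alias("event_date"), "event_type")
       .count()
)

# Write curated output back to the lake, partitioned for downstream consumers.
daily.write.mode("overwrite").partitionBy("event_date").parquet(
    "abfss://curated@examplestore.dfs.core.windows.net/daily_event_counts/"
)
```

In a setup like the one described, a job of this shape would typically be scheduled by Azure Data Factory or Azure DevOps pipelines, with the infrastructure itself provisioned through Terraform or ARM templates.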

Posted 1 day ago

4.0 - 8.0 years

0 Lacs

karnataka

On-site

Candidates should possess the following qualifications and skills:
- Experience: 4 to 7 years
- Proficiency in Big Data, PySpark, Hive, and Spark optimization
- Good to have: GCP

Please note the job reference number for this position is 13159.

Posted 3 days ago

8.0 - 10.0 years

15 - 25 Lacs

bengaluru

Work from Office

Dear Candidate,

We have a job opening for Lead Software Engineer - Bigdata in a banking-based product company at Bangalore.

Requirement details:
Location: Bangalore
Designation: Specialist Software Engineer - Genesys
Experience: 8 to 10 years
Expected notice period: Immediate

Profile required:
- At least 3 to 7 years of experience on a big data platform, and at least 2 years of experience implementing a DWH on Snowflake
- Proven experience with cloud platforms, preferably Azure, particularly data services
- Good understanding of distributed computing frameworks such as Apache Spark and Hadoop
- Analysis of existing data storage systems and development of Big Data solutions in Snowflake
- High proficiency in SQL and at least one of Scala/Python, plus experience migrating data from on-premise databases (preferably a Big Data platform) to Snowflake
- Expertise in building robust ELT/ETL processes and performance tuning of data pipelines in Snowflake, with the ability to troubleshoot issues quickly (a sketch follows this listing)
- Strong knowledge of integration concepts and design best practices
- Data modeling and data integration; advanced SQL skills for analysis and standardizing queries
- Proven experience managing and mentoring data engineering teams
- Excellent interpersonal skills, with the ability to work across teams and communicate effectively with technical and non-technical stakeholders
- Strong analytical and troubleshooting skills, with a proven ability to find solutions in complex data environments

If you are interested, kindly share your updated CV to ct1@convate.com with the below details:
1. Reason for job change
2. Current salary
3. Expected salary
4. Joining time needed

Request you to kindly refer any friends or colleagues who are relevant and interested in the opportunity shared.

About Convate Consultancy Recruitment Firm: Established in 2004, Convate (a team of 60 recruiters) is a leading international recruitment company with operations in Bangalore and Dubai. We specialize in IT/healthcare/engineering recruitment in India and the Middle East. Convate provides a learning-based work culture with strong opportunities to grow in the years to come.

Thanks and regards,
Akhila
Recruitment Specialist
Ct1@convate.com
Convate Consultancy Services Pvt Ltd
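To make the Snowflake ELT expectation concrete, here is a hedged sketch using the snowflake-connector-python package. The account, stage, and table names are invented for illustration, and a real pipeline would pull credentials from a secrets manager rather than hard-coding them.

```python
import snowflake.connector

# Hedged sketch of a Snowflake ELT step; every identifier below is a
# placeholder, not a system named in the listing.
conn = snowflake.connector.connect(
    account="example_account",
    user="etl_user",
    password="change-me",     # in practice, fetch from a secrets manager
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="STAGING",
)

cur = conn.cursor()
try:
    # Bulk-load staged files into a target table, the standard Snowflake
    # ingestion pattern for data migrated from an on-premise platform.
    cur.execute("""
        COPY INTO staging.transactions
        FROM @transactions_stage
        FILE_FORMAT = (TYPE = PARQUET)
        MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE
    """)
    # A simple transformation pushed down to Snowflake compute (the "T" in ELT).
    cur.execute("""
        INSERT INTO analytics.daily_totals
        SELECT txn_date, SUM(amount) FROM staging.transactions GROUP BY txn_date
    """)
finally:
    cur.close()
    conn.close()
```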

Posted 5 days ago

3.0 - 7.0 years

0 Lacs

hyderabad, telangana

On-site

You are being recruited by ANSR on behalf of their client, American Airlines, a company that aims to Care for People on Life's Journey by providing thousands of flights daily to over 350 destinations in more than 60 countries. American Airlines is transforming the way it delivers technology to its customers and team members globally. American's Tech Hub in Hyderabad, India, is a key technology office location where team members drive technical innovation and engineer digital products to enhance the customer and team member experience. By leveraging cutting-edge technology, American Airlines is focused on solving business problems and delivering industry-leading technology solutions.

As a member of the Information Technology Team within the Information Technology Division, you will participate in all phases of the development process, advocating for agile methodologies and test-driven development. Your role will involve collaborating with stakeholders to understand requirements, developing and maintaining enterprise services and applications, troubleshooting and debugging complex issues, and researching and implementing new technologies to enhance processes.

To be successful in this position, you should possess a Bachelor's degree in a relevant technical discipline or equivalent experience, along with at least 3 years of full Software Development Life Cycle (SDLC) experience. Mandatory skills include proficiency in C#, .NET, SQL Server or PostgreSQL, Angular or React, JavaScript, and cloud platforms such as Azure or AWS. Experience with EventHub, Kafka, or MQ is also required. Preferred qualifications include a Master's degree, 5 years of SDLC experience, and knowledge of the airline industry. Proficiency in full stack development, various programming languages and frameworks, cloud-based development, and security integrations is also essential for this role.

In return for your contributions, American Airlines offers travel perks, health benefits, wellness programs, a 401(k) program, and additional benefits such as the Employee Assistance Program and pet insurance. The company values inclusion and diversity, striving to create an inclusive work environment that meets the needs of its diverse workforce. Join American Airlines to embark on a rewarding journey where you can grow both personally and professionally while contributing to a world-class customer experience.

Posted 6 days ago

7.0 - 12.0 years

0 Lacs

chennai, tamil nadu

On-site

As an experienced professional with 7-12 years in the Data and Analytics domain, you will be responsible for leading multiple development projects end to end. Your expertise in SDLC/Agile methodologies, proficiency in ETL and analytical tools, and strong command of SQL will be essential for the successful execution of projects.

In this role, you will manage multiple teams of 25 to 30 members each, driving them toward timely deliverables. Your proficiency in project management activities, coupled with a proactive, go-getter attitude, will enable you to prioritize tasks effectively and ensure their completion.

A key aspect of your responsibilities will be interpreting the business needs of stakeholders and translating them into actionable insights. Your knowledge of Big Data and cloud technologies will be valuable in creating functional and technical documentation for Business Intelligence solutions. Furthermore, you will provide the thought leadership, best practices, and standards necessary to deliver effective solutions.

Your role will also entail redesigning existing data models and architecture to enhance performance, data quality, and governance. Your ability to translate complex functional, technical, and business requirements into architectural designs will be crucial. Additionally, you will conduct code reviews, offer best practices for data modeling, application design, and development, and provide technical product assistance and tuning to meet customer requirements.

Posted 1 week ago

6.0 - 11.0 years

0 - 0 Lacs

chennai, bengaluru

Hybrid

Role & responsibilities

Minimum of 6 years of software development experience in a professional environment and/or comparable experience, such as:
- Familiarity with Agile or other rapid application development methods
- Experience with design and coding in Java across one or more platforms, plus additional languages as appropriate
- Experience with Big Data processing and batch/streaming technologies such as Apache Spark, Kafka, Flink, and Beam, with Scala preferred as a programming language (see the streaming sketch after this list)
- Experience with RxJava and functional programming preferred
- Experience in full stack development using Java with React or Node preferred
- Experience with AWS or GCP cloud preferred
- Experience with AWS technologies such as EMR, MSK (Kafka), EKS (Kubernetes), DynamoDB, and Aurora DB preferred
- Experience with GCP technologies such as Dataflow (Apache Beam pipelines), BigQuery, Bigtable, Pub/Sub, and Dataproc preferred
- Backend experience including Apache Cassandra and relational databases such as Oracle and PostgreSQL a plus
- Hands-on expertise with application design, software development, and automated testing
- Experience with distributed (multi-tiered) systems, algorithms, and relational and NoSQL databases
- Confirmed experience with object-oriented design and coding in a variety of languages
- Experience managing and delivering applications and services using a cloud computing model across public, private, and hybrid cloud environments
- Bachelor's degree in computer science, computer science engineering, or related experience required; advanced degree preferred
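The listing prefers Scala for streaming work; as a language-neutral illustration, here is a minimal PySpark Structured Streaming sketch that consumes a Kafka topic. The broker address, topic name, and one-minute window are assumptions, and the job assumes the spark-sql-kafka connector package is on the classpath.

```python
from pyspark.sql import SparkSession, functions as F

# Hedged sketch of Spark-plus-Kafka streaming; broker, topic, and window
# size are illustrative assumptions, not details from the listing.
spark = SparkSession.builder.appName("kafka-stream-sketch").getOrCreate()

# Subscribe to a Kafka topic as an unbounded streaming DataFrame
# (requires the spark-sql-kafka connector package at submit time).
events = (
    spark.readStream.format("kafka")
         .option("kafka.bootstrap.servers", "broker1:9092")
         .option("subscribe", "payments")
         .load()
)

# Kafka delivers bytes; cast the value and count events per one-minute window.
counts = (
    events.selectExpr("CAST(value AS STRING) AS body", "timestamp")
          .groupBy(F.window("timestamp", "1 minute"))
          .count()
)

# Stream results to the console; a real job would write to a durable sink.
query = counts.writeStream.outputMode("complete").format("console").start()
query.awaitTermination()
```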

Posted 1 week ago

2.0 - 6.0 years

0 Lacs

karnataka

On-site

You are ready to gain the skills and experience necessary to advance within your role and further develop your career. At JPMorgan Chase, within the Liquidity Risk (LRI) team, the position of Software Engineer II - Big Data/Java/Scala offers you an exciting opportunity. In this role, you will play a crucial part in designing and implementing the next-generation buildout of a cloud-native liquidity risk management platform. The primary focus of our technology organization is to provide comprehensive solutions for managing the firm's liquidity risk and meeting regulatory reporting obligations in over 50 markets. The project entails the strategic development of advanced liquidity calculation engines, integrating AI and ML into liquidity risk processes, and introducing digital-first reporting capabilities. The platform's goal is to handle 40-60 million transactions and positions daily, assess the risk presented by current actual and model-based market conditions, create a multidimensional view of the corporate risk profile, and enable real-time analysis.

**Job Responsibilities:**
- Execute standard software solutions, design, development, and technical troubleshooting.
- Utilize tools in the Software Development Life Cycle toolchain to enhance automation value.
- Analyze large, diverse data sets to identify issues and contribute to decision-making in support of secure, stable application development.
- Learn and implement system processes, methodologies, and skills for developing secure, stable code and systems.
- Contribute to a team culture of diversity, equity, inclusion, and respect.
- Support the team's drive for continuous improvement of the development process and innovative solutions to meet business needs.
- Dedicate effort to aligning technology solutions with business goals.

**Required Qualifications, Capabilities, and Skills:**
- Formal training or certification in Java, Scala, Spark, and Big Data concepts, with at least 2 years of applied experience.
- Hands-on development expertise and comprehensive knowledge of Java, Scala, Spark, and related Big Data technologies.
- Practical experience in system design, application development, testing, and operational stability.
- Proficiency across the entire Software Development Life Cycle.
- Familiarity with agile practices such as CI/CD, application resiliency, and security.
- Growing understanding of software applications and technical processes within a technical discipline.
- Ability to collaborate closely with stakeholders to define requirements.
- Collaboration with partners across feature teams to develop reusable services that meet solution requirements.

**Preferred Qualifications, Capabilities, and Skills:**
- Experience in big data solutions with a track record of utilizing data analysis for driving solutions.
- Exposure to cloud technologies, particularly AWS.

Posted 1 week ago

6.0 - 11.0 years

18 Lacs

chennai, bengaluru

Work from Office

Candidate specification:
- Must have 6+ years of experience in ETL testing; expert in Python + SQL
- Python: strong Python programming skills for writing automation scripts and integrating with ETL processes (see the Pytest sketch below)
- Azure: experience with Azure Data Factory, Azure SQL, Azure Blob Storage, and other Azure data services (good to have)
- Automation tools: familiarity with test automation tools and frameworks (e.g., Pytest, Selenium)
- Data warehousing: experience working with data warehousing concepts and technologies

Contact person: Sheena Rakesh
Email: sheena@gojobs.biz
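As a sketch of the "Python automation scripts for ETL testing" requirement, here are two hypothetical Pytest checks built on SQLAlchemy. SQLite URLs are used so the snippet stays self-contained; an Azure SQL source would swap in an mssql+pyodbc URL. The table and column names are placeholders, not systems named in the listing.

```python
import pytest
import sqlalchemy as sa

# Placeholder connection URLs; a real suite would point at the actual
# source system and the loaded warehouse (e.g., Azure SQL).
SOURCE_URL = "sqlite:///source.db"
TARGET_URL = "sqlite:///target.db"

@pytest.fixture(scope="module")
def engines():
    # One engine per side so tests can compare source against target.
    return sa.create_engine(SOURCE_URL), sa.create_engine(TARGET_URL)

def test_row_counts_match(engines):
    # A load that drops or duplicates rows is the most common ETL defect.
    src, tgt = engines
    with src.connect() as s, tgt.connect() as t:
        src_count = s.execute(sa.text("SELECT COUNT(*) FROM stg_orders")).scalar()
        tgt_count = t.execute(sa.text("SELECT COUNT(*) FROM dw_orders")).scalar()
    assert src_count == tgt_count

def test_no_null_business_keys(engines):
    # Null keys break downstream joins, so fail fast here.
    _, tgt = engines
    with tgt.connect() as t:
        nulls = t.execute(
            sa.text("SELECT COUNT(*) FROM dw_orders WHERE order_id IS NULL")
        ).scalar()
    assert nulls == 0
```

Run with `pytest -v`; in a pipeline of the kind described, checks like these would typically execute after each Azure Data Factory load.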

Posted 1 week ago

5.0 - 9.0 years

0 Lacs

karnataka

On-site

You will be joining one of the elite units of Infosys Strategic Technology Group (STG), led by the Unit Technology Officer (CTO), as a full stack architect. As part of the Power Programmer initiative in Global Delivery, your primary responsibility will be to build a team of full stack developers. This team will focus on executing complex engineering projects, developing platforms, and creating marketplaces for clients by leveraging cutting-edge technologies.

Your role as a full stack architect will require you to stay ahead of the technology curve and continuously enhance your skills to become a polyglot developer. You will be expected to proactively address end-customer challenges by dedicating most of your time to designing and coding solutions for technology-oriented development projects. Working in an Agile mode, you will provide solutions with minimal system requirements. Collaboration with other Power Programmers, participation in the open source community, and engagement with tech user groups will be essential aspects of your role. You will also have the opportunity to engage in custom development of new platforms and solutions, work on large-scale digital platforms and marketplaces, and contribute to complex engineering projects using cloud-native architecture.

In this role, you will collaborate with innovative Fortune 500 companies, utilizing cutting-edge technologies to co-create and develop new products and platforms for clients. Additionally, you will be encouraged to contribute to open source projects and continuously upskill in the latest technology areas while incubating tech user groups. Your technical expertise should include proficiency in Big Data technologies such as Spark, Scala, Hive, Kafka, HBase, Oozie, and Sqoop. Your experience in functional programming with Scala and data processing with Spark Streaming will be highly valuable in this role.

Posted 2 weeks ago

0.0 - 4.0 years

0 Lacs

indore, madhya pradesh

On-site

As an intern at Login2Xplore, you will have the opportunity to work on various tasks related to software development and technology exploration. Your day-to-day responsibilities may include:
- Working on the development of software/modules, coding, and exploring new technologies.
- Creating technical design documents and manuals.
- Handling software installation, execution, and testing automation, and providing training and support.
- Engaging in research and development activities on upcoming technologies such as AI, ML, and Big Data.

Please note that both longer-duration and part-time internships are available for this profile.

Login2Xplore is a software product startup dedicated to creating powerful data structures using Java and other core technologies. Our focus is on generating fast, real-time information and analytics from the rapidly growing unmanaged data landscape worldwide. We are currently working on innovative products like PowerIndeX, a next-generation data indexing solution, and JsonPowerDB, a lightweight, high-performance, real-time document-oriented and key-value pair database with a web-service API.

Posted 2 weeks ago

8.0 - 12.0 years

0 Lacs

pune, maharashtra

On-site

We are seeking a skilled Senior Developer & Tech Lead who is dedicated to producing clean and effective code, constructing scalable systems, fostering engineering excellence, and overseeing a team of proficient developers in a dynamic Agile setting. This position is well suited to individuals with extensive practical experience in Java and Apache Spark, coupled with a solid understanding of object-oriented design principles.

Your responsibilities will include:
- Conducting thorough impact analyses for code modifications, considering dependencies across application components.
- Designing and implementing scalable, high-performance code using Java and Bigdata/Apache Spark.
- Crafting high-quality, sustainable code that is modular, testable, and aligned with SOLID principles and industry-standard design patterns.
- Writing robust unit tests using JUnit, with a focus on code coverage, business logic, readability, and reusability.
- Leading and participating in code reviews to ensure adherence to clean design/architecture and best engineering practices.
- Creating a culture of ownership and accountability where quality and collaboration are fundamental values.
- Mentoring and guiding developers through technical hurdles, and collaborating closely with cross-functional teams to deliver high-quality code swiftly and efficiently.

Qualifications:
- 8+ years of development experience in Java, Bigdata/Apache Spark, and object-oriented programming
- Familiarity with REST APIs, RDBMS databases, and Kafka messaging systems
- Exposure to microservices architecture and containerization tools such as Docker and Kubernetes
- Demonstrated leadership experience and mentoring skills in a fast-paced development environment
- Proficiency in the software development lifecycle (SDLC) and Agile methodologies
- Exceptional problem-solving abilities and critical thinking skills under pressure
- Excellent communication skills and the ability to collaborate effectively in cross-functional teams

Education:
- Bachelor's degree/University degree or equivalent experience
- Master's degree preferred

Note: If you require a reasonable accommodation due to a disability to use our search tools or apply for a career opportunity, please review Accessibility at Citi. Also refer to Citi's EEO Policy Statement and the Know Your Rights poster for further information.

Posted 2 weeks ago

5.0 - 9.0 years

0 Lacs

karnataka

On-site

You are a highly skilled and experienced Data Architect with expertise in cloud-based solutions. You will be responsible for designing, implementing, and optimizing data architecture to meet the organization's current and future needs. Your role will involve data modeling, transformation, and governance, with hands-on experience in modern cloud platforms and tools such as Snowflake, Spark, data lakes, and data warehouses. Collaboration with cross-functional teams and stakeholders is crucial, and you will establish and enforce standards and guidelines across data platforms to ensure consistency, scalability, and best practices.

You will architect and implement scalable, secure, and high-performance cloud data platforms that integrate data lakes, data warehouses, and databases. Developing comprehensive data models to support analytics, reporting, and operational needs will be a key responsibility. You will lead the design and execution of ETL/ELT pipelines to process and transform data efficiently using tools such as Talend, Matillion, SQL, Big Data/Hadoop, AWS EMR, and Apache Spark, and you will integrate diverse data sources into cohesive, reusable datasets for business intelligence and machine learning purposes.

You will establish, document, and enforce standards and guidelines for data architecture, data modeling, transformation, and governance across all data platforms, ensuring consistency and best practices in data storage, integration, and security throughout the organization. You will enforce data governance standards to ensure data quality, security, and compliance with regulatory requirements, implementing processes and tools to manage metadata, lineage, and data access controls.

Your expertise will be applied to Snowflake for advanced analytics and data storage needs, optimizing performance and cost efficiency, and to modern cloud platforms for managing data lakes and ensuring seamless integration with other services. You will collaborate with business stakeholders, data engineers, and analysts to gather requirements and translate them into technical designs, communicating architectural decisions, trade-offs, and progress effectively to both technical and non-technical audiences. Continuous improvement is part of the role: you will stay updated on emerging trends in cloud and data technologies, recommend innovations to enhance the organization's data capabilities, and optimize existing architectures to improve scalability, performance, and maintainability.

Your technical skills should include expertise in data modeling and data architecture design principles; Talend, Matillion, SQL, Big Data/Hadoop, AWS EMR, and Apache Spark; Snowflake and cloud-based data platforms; data lakes, data warehouses, and relational and NoSQL databases; data transformation techniques and ETL/ELT pipelines; DevOps/DataOps/MLOps tools; and standards and governance frameworks. You should have exceptional written and verbal communication skills to interact effectively with technical teams and business stakeholders. Ideally, you have 5+ years of experience in data architecture focusing on cloud technologies, a proven track record of delivering scalable, cloud-based data solutions, and a Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.

Preferred qualifications include certifications in Snowflake, AWS data services, any RDBMS/NoSQL, AI/ML, or data governance; familiarity with machine learning workflows and data pipelines; and experience working in Agile development environments.

Posted 2 weeks ago

10.0 - 15.0 years

20 - 35 Lacs

pune

Work from Office

We at Onix Datametica Solutions Private Limited are looking for a Bigdata Lead with a passion for cloud and knowledge of on-premise and cloud data implementations in the field of Big Data and Analytics, including but not limited to Teradata, Netezza, Exadata, Oracle, Cloudera, Hortonworks, and the like. Ideal candidates should have technical experience in migrations and the ability to help customers get value from Datametica's tools and accelerators. Check out more about us on our website below!
https://www.onixnet.com/

Job Description
Experience: 10 to 12 years
Location: Pune

- 6+ years of overall experience in developing, testing, and implementing Big Data projects using Hadoop, Spark, and Hive.
- Hands-on experience playing a lead role in Big Data projects: implementing one or more tracks within projects, identifying and assigning tasks within the team, and providing technical guidance to team members.
- Experience setting up Hadoop services, implementing ETL/ELT pipelines, and working with terabytes of data ingestion and processing from varied systems.
- Experience working in an onshore/offshore model, leading technical discussions with customers, mentoring and guiding teams on technology, and preparing HDD and LDD documents.

Required Skills and Abilities:
- Mandatory skills: Spark, Scala/PySpark, the Hadoop ecosystem including Hive, Sqoop, Impala, Oozie, and Hue, plus Java, Python, SQL, Flume, and Bash (shell scripting)
- Understanding of data governance concepts and experience implementing metadata capture, lineage capture, and business glossaries
- Experience implementing CI/CD pipelines and working experience with SCM tools such as Git, Bitbucket, etc.
- Ability to assign and manage tasks for team members, provide technical guidance, and work with architects on HDD, LDD, and POCs
- Hands-on experience writing data ingestion and data processing pipelines using Spark and SQL; experience implementing SCD Type 1 & 2, auditing, and exception-handling mechanisms (see the SCD sketch after this listing)
- Data warehousing project implementation with a Java- or Scala-based Hadoop programming background
- Proficiency with development methodologies such as waterfall and agile/Scrum
- Exceptional communication, organization, and time management skills
- Collaborative approach to decision-making and strong analytical skills
- Good to have: certifications in any of GCP, AWS, Azure, or Cloudera
- Ability to work on multiple projects simultaneously, prioritizing appropriately
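For the SCD Type 1 & 2 item above, here is a hedged PySpark sketch of the Type 2 pattern (close the superseded row, append a new current row). The dimension layout, paths, and the single tracked "address" attribute are assumptions chosen for illustration.

```python
from pyspark.sql import SparkSession, functions as F

# Sketch of SCD Type 2 in plain PySpark; paths, keys, and the tracked
# attribute are hypothetical, not taken from the job description.
spark = SparkSession.builder.appName("scd2-sketch").getOrCreate()

current = spark.read.parquet("/warehouse/dim_customer")     # existing dimension
incoming = spark.read.parquet("/staging/customer_updates")  # latest extract

# Compare incoming rows against the open (is_current = true) dimension rows.
open_rows = current.filter("is_current = true")
changed = (
    incoming.alias("n")
    .join(open_rows.alias("o"), "customer_id")
    .filter(F.col("n.address") != F.col("o.address"))
)

# Type 2, step 1: expire the superseded versions.
expired = (
    open_rows.join(changed.select("customer_id"), "customer_id", "leftsemi")
    .withColumn("is_current", F.lit(False))
    .withColumn("end_date", F.current_date())
)

# Type 2, step 2: build the new versions as the current records.
new_versions = (
    changed.select("customer_id", F.col("n.address").alias("address"))
    .withColumn("is_current", F.lit(True))
    .withColumn("start_date", F.current_date())
    .withColumn("end_date", F.lit(None).cast("date"))
)

# A real pipeline would union expired + new_versions with the untouched rows
# and rewrite the dimension atomically (or use a Delta Lake MERGE INTO).
```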

Posted 2 weeks ago

2.0 - 6.0 years

0 Lacs

karnataka

On-site

You are prepared to enhance your skills and expertise to progress in your current position and propel your career forward. A software engineering opportunity awaits you at JPMorgan Chase within the Liquidity Risk (LRI) team as a Software Engineer II specializing in Big Data, Java, and Scala. In this role, you will play a pivotal part in developing and implementing the next-generation cloud-native liquidity risk management platform. The technology team is dedicated to delivering comprehensive solutions for managing liquidity risk and fulfilling regulatory reporting requirements across various markets. The project will involve strategically developing advanced liquidity calculation engines, integrating AI and ML into liquidity risk processes, and introducing digital-first reporting capabilities. The platform aims to process 40-60 million transactions and positions daily, assess risks associated with current and model-based market scenarios, create a multidimensional view of the corporate risk profile, and enable real-time analysis.

Responsibilities:
- Implement standard software solutions, design, development, and technical troubleshooting
- Utilize tools in the Software Development Life Cycle toolchain to enhance automation benefits
- Analyze large, diverse data sets to identify issues and contribute to decision-making for secure application development
- Learn and apply system processes, methodologies, and skills for secure and stable code development
- Foster a team culture of diversity, equity, inclusion, and respect
- Contribute to the team's pursuit of continuous development process improvement and innovative solutions to meet business requirements
- Align technology solutions with business objectives effectively

Required Qualifications, Capabilities, and Skills:
- Formal training or certification in Java, Scala, Spark, and Big Data concepts, with a minimum of 2 years of practical experience
- Proficiency in Java, Scala, Spark, and related Big Data technologies through hands-on development experience
- Practical experience in system design, application development, testing, and operational stability
- Familiarity with the entire Software Development Life Cycle
- Exposure to agile practices such as CI/CD, application resiliency, and security
- Basic knowledge of software applications and technical processes within a technical domain
- Ability to collaborate closely with stakeholders to define requirements
- Collaboration with partners across feature teams to develop reusable services meeting solution requirements

Preferred Qualifications, Capabilities, and Skills:
- Experience in big data solutions with a track record of data-driven problem-solving
- Exposure to cloud technologies, particularly AWS

Posted 2 weeks ago

5.0 - 9.0 years

0 Lacs

noida, uttar pradesh

On-site

Job Description: As a Senior Big Data Cloud Quality Assurance Engineer, you will play a crucial role in ensuring the quality and performance of big data applications and services deployed in cloud environments. Your responsibilities will include designing and implementing test plans and test cases, conducting functional, performance, and scalability testing, identifying and tracking defects, collaborating with development teams, developing automated test scripts, analyzing test results, mentoring junior QA team members, and staying updated on industry trends. If you are a motivated professional with keen attention to detail who is looking to advance your career in big data quality assurance, this is an exciting opportunity for you.

Qualifications: You should have a Bachelor's degree in Computer Science, Information Technology, or a related field, along with a minimum of 5 years of experience in software testing, with at least 2 years focused on big data applications and cloud technologies. Proficiency in testing frameworks and tools, experience with big data technologies and cloud platforms, familiarity with programming languages, excellent analytical and problem-solving skills, strong communication skills, and the ability to work collaboratively in a team environment are essential.

Skills Required: ETL Testing, Big Data, Database Testing, API Testing, Selenium, SQL, Linux, Cloud Testing

Roles and Responsibilities:
1. Design and implement comprehensive test plans and test cases for big data applications in cloud environments.
2. Collaborate with data engineers and developers to understand system architecture and data flow.
3. Perform manual and automated testing for big data processing frameworks and tools.
4. Lead and mentor junior QA team members.
5. Identify and track defects, and verify fixes.
6. Develop and maintain automated test scripts.
7. Execute performance testing for scalability and reliability assessment.
8. Participate in design and code reviews.
9. Define acceptance criteria with stakeholders.
10. Stay updated on industry trends in big data and cloud technologies.
11. Ensure compliance with security and data governance policies.
12. Provide detailed reports on testing progress and outcomes.

Experience: 5 to 7 years

Posted 2 weeks ago

5.0 - 9.0 years

0 Lacs

karnataka

On-site

You are invited to join Carelon Global Solutions India as a Team Lead I (IND) in Bangalore. Carelon Global Solutions is part of Elevance Health, a renowned American healthcare company dedicated to improving lives and communities. As a Team Lead, you will be responsible for all design requirements for leader communications, reporting to the Manager, Communications. Your main duties will include understanding the design process, liaising with stakeholders, and coordinating with external agencies to ensure brand guidelines are followed.

To excel in this role, you should possess the following qualifications and skills:

Qualifications:
- B.Tech / B.E / MCA

Experience:
- 5+ years of experience conducting root cause analysis
- 6+ years of experience with SQL and NoSQL databases such as MySQL and Postgres
- 6+ years of experience with strong analytical skills and advanced SQL knowledge

Skills and Competencies:
- Profound understanding of Big Data core concepts and technologies, including Apache Spark, Kafka, Scala, Hive, and AWS
- Good understanding of business and operational processes
- Capable of problem/issue resolution and of thinking outside the box

Your responsibilities will include:
- Demonstrating solid experience with and understanding of core AWS services and Big Data technologies
- Playing the Scrum Master role for a software development team, supporting the team in a servant-leadership style and leading by example
- Leading and coordinating the activities of the production support team
- Managing incidents and problems effectively to ensure minimal impact on production
- Communicating the status and health of applications to business lines and management
- Performing advanced troubleshooting, analysis, and resolution of production issues using programming and query skills

At Carelon Global Solutions, we offer a world of limitless opportunities to our associates, with a strong focus on learning and development, an inspiring culture built on innovation and creativity, comprehensive rewards and recognition, competitive health and medical insurance coverage, best-in-class amenities and workspaces, and policies designed with associates at the center. We are an equal opportunity employer committed to diversity and inclusion. If you are a highly creative and meticulous individual with a passion for healthcare solutions, we welcome you to apply for this full-time position and be part of our dynamic team at Carelon Global Solutions India in Bangalore.

Posted 2 weeks ago

10.0 - 14.0 years

0 Lacs

karnataka

On-site

Transport is at the core of modern society. Imagine using your expertise to shape sustainable transport and infrastructure solutions for the future. If you seek to make a difference on a global scale, working with next-gen technologies and the sharpest collaborative teams, then we could be a perfect match.

Electromobility is changing the automotive world, and we are now looking for you who want to be part of making this change happen in our industry! The Electromobility organization is responsible for the complete development lifecycle of our electric powertrains, from advanced engineering through product development into the maintenance phase. The function has a truly purpose-driven leadership, and together we drive the Electromobility transition based on cutting-edge engineering and state-of-the-art research within the Volvo Group. By joining us, you'll be part of a global and diverse team of highly skilled professionals. We make our customers, the planet, and our future generations win.

As the Group Manager for Data Analytics, you will play a key role in developing the team and building the capability and capacity to deliver various programs within Electromobility. You will lead a talented team of engineers developing cutting-edge technologies and solutions that propel Electromobility forward. Reporting to the Director of the Simulation & Data section, you will have the opportunity to make a significant impact on our growth and success.

The team's purpose is to contribute to the data lifecycle during the product development journey in order to create world-class solutions for electromobility systems, delivering value to Volvo business units such as trucks, buses, construction equipment, and marine applications. By understanding customer usage and product performance, the team provides insights for product development, uptime, and continuous improvement. The team is involved in various phases of the product development journey, from data design and anomaly detection to building models, from crude to accurate, for failure prediction. The team develops and optimizes data pipelines, conveys insights through visualization tools, builds ML models to predict uptime or component life, and focuses on continuous improvement, collaborating with cross-functional teams across the organization, both locally and globally.

We seek a well-established leader with a proven track record of working in a global environment and pushing the boundaries to extract more value. You have extensive domain knowledge in data-driven product development, spanning big data, edge computing, AI/ML, and systems engineering. You thrive on teamwork, leveraging your skills to influence and empower colleagues to harness their full potential for maximum impact. Using strong communication skills at all levels, you are a natural speaking partner, fostering networks, building trust, and managing stakeholders with courage and integrity.

To be able to do this, you need to:
- Lead and inspire others through strategic focus and reasoning, asking bold questions and challenging the team to go beyond.
- Manage competence development and performance, and support team members on the journey from Aspire to Inspire.
- Attract, develop, and retain professionals to secure competences for future demands.
- Collaborate with cross-functional teams to minimize waste and improve efficiency.
- Balance capacity against workload, budgeting, and the infrastructure affecting the team.
- Excel in stakeholder management and a systems-thinking approach to problem-solving.
- Actively participate in all relevant decision forums.

We are committed to shaping the future landscape of efficient, safe, and sustainable transport solutions. Fulfilling our mission creates countless career opportunities for talents across the group's leading brands and entities. Applying to this job offers you the opportunity to join Volvo Group. Every day, you will be working with some of the sharpest and most creative brains in our field to leave our society in better shape for the next generation. We are passionate about what we do, and we thrive on teamwork. We are almost 100,000 people united around the world by a culture of care, inclusiveness, and empowerment.

Posted 2 weeks ago

8.0 - 12.0 years

0 Lacs

pune, maharashtra

On-site

As a Senior Engineer Full Stack at Deutsche Bank in Pune, India, you will be part of a cross-functional agile delivery team, collaborating with analysts, developers, and testers to provide cutting-edge technology solutions for the banking industry. Your role will involve approaching software development innovatively, utilizing the latest technologies and practices to deliver business value effectively. You will foster a team-oriented approach to engineering, promoting open code, open discussion, and a supportive environment. From initial analysis to production support, you will play a pivotal role in all stages of software delivery. Deutsche Bank offers a growth-oriented environment that emphasizes excellence, continuous learning, and mutual respect.

Your responsibilities will include leading the feature team, collaborating on requirements, analyzing stories, designing solutions, implementing and testing them, and providing production support. You will write high-quality code, practice test-driven development, and contribute to architectural decisions while ensuring reliable and supportable software. As a Vice President, your role will encompass people management, team leadership, mentoring, and fostering a culture of continuous improvement.

You are expected to have deep knowledge of Java, databases, and Angular, hands-on experience with the Google Cloud platform, and expertise in test-driven development and continuous integration. The ideal candidate will possess a strong understanding of web technologies, experience with Big Data/Hadoop technologies, proficiency in SQL and relational databases, and familiarity with agile practices. Additional desirable skills include behavior-driven development, knowledge of various data technologies, and architecture and design approaches that support rapid delivery. A degree in Engineering or Computer Science from an accredited college or university is required.

Deutsche Bank provides training, coaching, and a culture of continuous learning to support your career growth. The company promotes a positive, inclusive work environment where employees are empowered to excel together. For further information about Deutsche Bank and its teams, please visit our company website at https://www.db.com/company/company.htm. We are committed to fostering a culture of collaboration, responsibility, commercial thinking, and initiative, celebrating the successes of our people as a unified Deutsche Bank Group. We welcome applications from all individuals and strive to create a fair and inclusive workplace environment.

Posted 3 weeks ago

4.0 - 8.0 years

7 - 11 Lacs

chennai, bengaluru

Work from Office

Job Title: Big Data Developer
Location State: Karnataka, TN
Location City: Bangalore, Chennai
Experience Required: 4 to 8 year(s)
CTC Range: 7 to 11 LPA
Shift: Day shift
Work Mode: Onsite
Position Type: C2H
Openings: 2
Company Name: VARITE INDIA PRIVATE LIMITED

About The Client: The client is an Indian multinational technology company specializing in information technology services and consulting. Headquartered in Mumbai, it is part of the Tata Group and operates in 150 locations across 46 countries.

About The Job: Bigdata and Hadoop Ecosystems
Essential Job Functions: Bigdata and Hadoop Ecosystems
Qualifications: Bigdata and Hadoop Ecosystems

How to Apply: Interested candidates are invited to submit their resume using the apply online button on this job post.

About VARITE: VARITE is a global staffing and IT consulting company providing technical consulting and team augmentation services to Fortune 500 companies in the USA, UK, Canada, and India. VARITE is currently a primary and direct vendor to leading corporations in the verticals of networking, cloud infrastructure, hardware and software, digital marketing and media solutions, clinical diagnostics, utilities, gaming and entertainment, and financial services.

Equal Opportunity Employer: VARITE is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees. We do not discriminate on the basis of race, color, religion, sex, sexual orientation, gender identity or expression, national origin, age, marital status, veteran status, or disability status.

Unlock Rewards: Refer candidates and earn. If you're not available or interested in this opportunity, please pass it along to anyone in your network who might be a good fit and interested in our open positions. VARITE offers a candidate referral program: you'll receive a one-time referral bonus on the following scale if the referred candidate completes a three-month assignment with VARITE.
- 0-2 years' experience: INR 5,000
- 2-6 years' experience: INR 7,500
- 6+ years' experience: INR 10,000

Posted 3 weeks ago

7.0 - 12.0 years

15 - 27 Lacs

chennai

Hybrid

Senior Bigdata Developer (GCP - BigQuery, Dataflow, Dataproc, Spanner)
- Very good communication skills
- Self-starter and quick learner
- Willing to work from the office in hybrid mode
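For flavor on the BigQuery portion of this stack, here is a minimal sketch using the google-cloud-bigquery client library; the project, dataset, and query are illustrative assumptions, not details from the listing.

```python
from google.cloud import bigquery

# Hedged sketch: project, dataset, and table names are placeholders.
client = bigquery.Client(project="example-project")

query = """
    SELECT order_date, SUM(amount) AS total
    FROM `example-project.sales.orders`
    WHERE order_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 7 DAY)
    GROUP BY order_date
    ORDER BY order_date
"""

# BigQuery runs the SQL server-side; rows stream back through the client.
for row in client.query(query).result():
    print(row.order_date, row.total)
```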

Posted 3 weeks ago

3.0 - 7.0 years

20 - 25 Lacs

pune

Work from Office

About the Role: We are looking for a highly motivated Senior Software Engineer, Data Analytics to join our fast-paced engineering team. The ideal candidate takes full ownership of their work, thrives in cross-functional collaboration, and is passionate about building scalable, fault-tolerant big data systems. In this role, you will design and develop high-performance data platforms, mentor junior engineers, and contribute to delivering impactful analytics solutions that drive strategic business decisions.

What You'll Do:
- Design, build, and optimize scalable and fault-tolerant Big Data pipelines for batch and streaming workloads.
- Develop real-time streaming applications using Apache Spark Streaming or Flink.
- Work with Snowflake, Hadoop, Kafka, and Spark for large-scale data processing and analytics.
- Implement workflow orchestration using tools like Apache Airflow, Oozie, or Luigi (see the Airflow sketch after this listing).
- Develop backend services and REST APIs to serve analytics and data products.
- Collaborate with product managers, stakeholders, and cross-functional teams to deliver data-driven solutions.
- Ensure data quality, governance, and security across the data ecosystem.
- Guide and mentor junior engineers, providing technical leadership and best-practice recommendations.
- Perform code reviews, performance tuning, and troubleshooting of distributed system issues.
- Drive innovation by evaluating and implementing new tools, frameworks, and approaches for data engineering.

We'd Love for You to Have:
- 4-7 years of experience in Big Data & Analytics engineering.
- Strong programming skills in Java, Scala, or Python.
- Hands-on experience with Apache Spark, Hadoop, Kafka, and distributed data systems.
- Proficiency in SQL and experience with Snowflake (preferred) or other cloud data warehouses.
- Practical experience with workflow orchestration tools such as Airflow, Oozie, or Luigi.
- Strong foundation in data structures, algorithms, and distributed system design.
- Familiarity with cloud platforms (AWS, GCP, Azure) and related data services.
- Experience with containerization and orchestration (Docker, Kubernetes).
- Exposure to data observability, monitoring tools, and AI/ML integration with data pipelines.
- Experience mentoring and guiding team members.
- Proven track record of working on cross-team collaboration projects.
- Strong problem-solving skills with the ability to take ownership and deliver end-to-end solutions.

Qualifications: A bachelor's degree in engineering (CS/IT) or an equivalent degree from a well-known institute/university.
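As a sketch of the workflow orchestration bullet above, here is a minimal hypothetical Airflow 2.x DAG wiring an extract → transform → load chain. The task bodies are stubs and every name is an assumption, not part of the listing.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

# Stub callables; a real pipeline would trigger Spark jobs, Snowflake loads, etc.
def extract():
    print("pull raw events from Kafka / object storage")

def transform():
    print("run the Spark transformation job")

def load():
    print("load curated output into Snowflake")

with DAG(
    dag_id="daily_events_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",   # Airflow 2.4+ spelling; older versions use schedule_interval
    catchup=False,
) as dag:
    t1 = PythonOperator(task_id="extract", python_callable=extract)
    t2 = PythonOperator(task_id="transform", python_callable=transform)
    t3 = PythonOperator(task_id="load", python_callable=load)

    t1 >> t2 >> t3   # linear dependency chain: extract, then transform, then load
```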

Posted 3 weeks ago

5.0 - 9.0 years

13 - 23 Lacs

bengaluru

Hybrid

The Role
Develops and programs methods, automated processes, and systems to cleanse, integrate, and analyze structured and unstructured, diverse big data sources to generate actionable insights and solutions using machine learning and advanced analytics. Interprets and communicates insights and findings from analyses and experiments to other analysts, data scientists, team members, and business partners.

The Main Responsibilities
- Support the development of end-to-end analytics solutions by assisting in the design and implementation of solutions that cover the entire data science lifecycle, including data discovery, cleaning, exploratory data analysis, model building, and deployment.
- Assist with operationalizing models and participate in the iterative process of refining models and insights based on feedback and business requirements.
- Analyze data and build predictive, prescriptive, and advanced analytical models in areas including capacity planning, effect/anomaly detection, predictive asset failure/maintenance, workload optimization, customer segmentation, and business performance.
- Gain direct experience with modeling techniques such as clustering, regression, and time series forecasting, applying these techniques to generate actionable insights and recommendations (see the clustering sketch after this listing).
- Mine information for previously unknown patterns and insights hidden in these assets and leverage them for competitive advantage.
- Create compelling data visualizations and dashboards to effectively communicate findings to both technical and non-technical audiences, presenting insights in a clear, concise, and actionable manner.
- Collaborate within and across cross-functional teams, working closely with data engineers, data scientists, and business stakeholders to understand business problems, gather requirements, and communicate insights effectively. Contribute to collaborative problem-solving sessions and agile development processes.
- Develop and operationalize end-to-end machine learning pipelines on Databricks, including feature engineering, model training, evaluation, and deployment.
- Implement and manage MLOps practices, integrating Git for version control, CI/CD pipelines for model deployment, and automated monitoring of models in production.
- Develop and consume RESTful APIs for data integration, enabling seamless connectivity between analytics applications and external systems.
- Ensure reproducibility, auditability, and governance of data science models by adhering to enterprise MLOps standards and frameworks.
- Support analytics democratization by packaging models as reusable components and APIs for consumption across the enterprise.

What We Look for in a Candidate
- Able to apply techniques such as classification, clustering, regression, deep learning, association, anomaly detection, time series forecasting, hidden Markov models, and Bayesian inference to solve pragmatic business problems.
- Able to design working models and implement them on big data systems using MapReduce or Spark frameworks.
- Familiar with Hadoop, Pig, Hive, SCOPE, Cosmos, or similar technologies.
- Able to work within an agile, iterative DevOps development process.

Experience:
- 3+ years of experience delivering machine learning and advanced analytics solutions
- Experience with statistical programming environments like Python, R, SPSS, or IBM Watson Studio
- Experience building data models and performing complex queries using SQL
- Experience performance-tuning large datasets
- Experience building large data pipelines and/or web services
- Experience developing visualizations and dashboards using Power BI or similar tools
- Fluent in one or more object-oriented languages such as C#, C++, Scala, or Java, and scripting languages such as Python or Ruby

We are an equal opportunity employer committed to fair and ethical hiring practices. We do not charge any fees or accept any form of payment from candidates at any stage of the recruitment process. If anyone claims to offer employment opportunities in our company in exchange for money or any other benefit, please treat it as fraudulent and report it immediately.
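To illustrate the customer-segmentation modeling named above, here is a small, self-contained scikit-learn sketch on synthetic data. Real work would use actual usage and billing features, and the choice of four clusters is an arbitrary assumption.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in for real customer features such as monthly spend,
# tenure, and support-ticket counts; all values here are illustrative.
rng = np.random.default_rng(42)
X = rng.normal(size=(500, 3))

# Scale features so no single one dominates the distance metric.
X_scaled = StandardScaler().fit_transform(X)

# Fit k-means and assign each customer to one of four segments (k=4 is arbitrary).
model = KMeans(n_clusters=4, n_init=10, random_state=42)
segments = model.fit_predict(X_scaled)

print("cluster sizes:", np.bincount(segments))
```

In a pipeline of the kind described, a model like this would be trained on Spark- or Databricks-prepared features and its segment labels fed into dashboards or downstream prescriptive models.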

Posted 4 weeks ago

8.0 - 12.0 years

0 Lacs

pune, maharashtra

On-site

We are seeking a seasoned Senior Developer & Tech Lead who is enthusiastic about writing clean and efficient code, constructing scalable systems, promoting engineering excellence, and supervising a team of skilled developers in a fast-paced, Agile environment. This position is well suited to developers with extensive hands-on experience in Java and Apache Spark, coupled with a solid understanding of object-oriented design principles.

Your responsibilities will include conducting detailed impact analysis for code changes, designing and implementing scalable, high-performance code using Java and Bigdata/Apache Spark, and ensuring the code is high quality, maintainable, modular, and adherent to industry-standard design patterns and SOLID principles. You will also be responsible for writing robust unit tests using JUnit, leading code reviews to enforce clean design and best engineering practices, fostering an environment of ownership and accountability, and mentoring a team of developers through technical challenges.

As a Senior Developer & Tech Lead, you will collaborate closely with architects, quality engineers, DevOps, and product owners to deliver high-quality code at speed. You will work in a cross-functional Agile team, participating in daily stand-ups, sprint planning, retrospectives, and backlog grooming. Additionally, you will translate user stories into technical tasks and ensure timely delivery of high-quality solutions.

The ideal candidate has at least 8 years of development experience with a strong background in Java, Bigdata/Apache Spark, and object-oriented programming. Experience with REST APIs, RDBMS databases, and Kafka messaging systems is required, along with exposure to microservices architecture and containerization tools such as Docker and Kubernetes. Proven experience leading teams and mentoring developers in a fast-paced development environment is essential, as are a solid understanding of the software development lifecycle (SDLC) and Agile methodologies, excellent problem-solving skills, and the ability to think critically under pressure. Strong communication skills and the ability to collaborate effectively in cross-functional teams are highly valued.

A Bachelor's degree or equivalent experience is required; a Master's degree is preferred. If you are a person with a disability and require a reasonable accommodation to use our search tools and/or apply for a career opportunity, please review Accessibility at Citi. You can also view Citi's EEO Policy Statement and the Know Your Rights poster.

Posted 1 month ago

4.0 - 8.0 years

0 Lacs

pune, maharashtra

On-site

At Citi, we are not just building technology, we are building the future of banking. Encompassing a broad range of specialties, roles, and cultures, our teams are creating innovations used across the globe. Citi is constantly growing and progressing through its technology, with a laser focus on evolving the way things are done. As one of the world's most global banks, we are changing how the world does business. Shape your career with Citi.

We are currently looking for a high-caliber professional to join our team as Officer, Tableau Developer - C11 - Hybrid, based in Pune, India. Being part of our team means that we will provide you with the resources to meet your unique needs, empower you to make healthy decisions, and manage your financial well-being to help plan for your future. For instance:
- We provide programs and services for your physical and mental well-being, including access to telehealth options, health advocates, confidential counseling, and more. Coverage varies by country.
- We empower our employees to manage their financial well-being and help them plan for the future.
- We provide access to an array of learning and development resources to help broaden and deepen your skills and knowledge as your career progresses.

Responsibilities:
- Utilize knowledge of applications development procedures and concepts, and basic knowledge of other technical areas, to identify and define necessary system enhancements, including using script tools and analyzing/interpreting code.
- Consult with users, clients, and other technology groups on issues; recommend programming solutions; and install and support customer exposure systems.
- Apply fundamental knowledge of programming languages for design specifications.
- Analyze applications to identify vulnerabilities and security issues, and conduct testing and debugging.
- Serve as an advisor or coach to new or lower-level analysts.
- Identify problems, analyze information, and make evaluative judgments to recommend and implement solutions.
- Resolve issues by identifying and selecting solutions through the application of acquired technical experience, guided by precedents.
- Operate with a limited level of direct supervision, exercising independence of judgment and autonomy.
- Act as a subject matter expert to senior stakeholders and/or other team members.
- Appropriately assess risk when business decisions are made, demonstrating particular consideration for the firm's reputation and safeguarding Citigroup, its clients, and assets, by driving compliance with applicable laws, rules, and regulations, adhering to policy, applying sound ethical judgment regarding personal behavior, conduct, and business practices, and escalating, managing, and reporting control issues with transparency.

Qualifications:
- 4-8 years of relevant experience as a Tableau developer
- Strong in SQL
- Reporting tool: Tableau
- Programming skill: Python
- Database: Oracle / Big Data
- Good to have: experience in the financial services industry
- Intermediate-level experience in an applications development role
- Consistently demonstrates clear and concise written and verbal communication
- Demonstrated problem-solving and decision-making skills
- Ability to work under pressure and manage deadlines or unexpected changes in expectations or requirements

Education:
- Bachelor's degree/University degree or equivalent experience

If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, please review Accessibility at Citi. View Citi's EEO Policy Statement and the Know Your Rights poster.

Posted 1 month ago

15.0 - 19.0 years

0 Lacs

pune, maharashtra

On-site

The Financial Crimes & Fraud Prevention Analytics team at Citi is looking for a skilled individual to join as a C14 (people manager), reporting to the Director/Managing Director, AIM. This role involves leading a team of data scientists based in Pune/Bangalore, focusing on the development and implementation of Machine Learning (ML)/AI/Gen AI models for fraud prevention. The successful candidate will be responsible for designing, developing, and deploying generative-AI-based solutions, analyzing data to understand fraud patterns, and developing models to achieve overall business goals. Additionally, the individual will collaborate with the model implementation team, ensure model documentation, and address questions from model risk management (MRM) while adapting to changing business needs.

Key Responsibilities:
- Lead as Subject Matter Expert (SME) in ML/AI/Gen AI, demonstrating strong AI and ML concepts and the ability to articulate complex concepts to diverse audiences.
- Lead a team of data scientists in the development and implementation of ML/AI/Gen AI models, providing technical leadership and mentorship and ensuring 100% execution accuracy.
- Customize and fine-tune existing RAG frameworks or design new frameworks to meet project requirements.
- Establish governance frameworks for model development, deployment, and monitoring to meet MRM and Fair Lending guidelines.
- Oversee the end-to-end model development lifecycle and ensure timely deployment with high quality and no errors.
- Manage a team of 15+ data scientists, providing career development, conflict management, performance management, coaching, mentorship, and technical guidance.

Requirements:
- Minimum of 15 years of analytics experience in core model development using ML/AI/Gen AI techniques.
- Strong knowledge of current state-of-the-art ML/AI/Gen AI algorithms and their pros and cons.
- Experience with Big Data environments, Python, and SQL.
- Bachelor's or Master's degree in Computer Science, Data Science, Machine Learning, or a related field; a Ph.D. is a plus.
- At least 8 years of people management experience.
- Proven track record of building and deploying generative-model-based solutions in production environments.
- Excellent verbal and written communication skills, with the ability to influence business outcomes and decisions.
- Strong project management skills and the ability to define business requirements and create robust technical documentation.
- Strategic thinking and the ability to frame business problems, with excellent analytical and statistical skills.

If you are a person with a disability and need a reasonable accommodation to use Citi's search tools and/or apply for a career opportunity, please review Accessibility at Citi. For more information, see Citi's EEO Policy Statement and the Know Your Rights poster on the Citi website.

Posted 1 month ago