Get alerts for new jobs matching your selected skills, preferred locations, and experience range.
8 - 13 years
50 - 55 Lacs
Bengaluru
Work from Office
As an Engineering Manager, you will lead a team of engineers responsible for the development and implementation of our cloud-based data infrastructure. You will work closely with cross-functional teams to understand data requirements, design scalable solutions, and ensure the integrity and availability of our data. The ideal candidate will have a deep understanding of cloud technologies, data engineering best practices, and a proven track record of successfully delivering complex data projects.

Key Responsibilities:
- Hire, develop, and retain top engineering talent
- Build and nurture self-sustained, high-performing teams
- Provide mentorship and technical guidance to engineers, fostering continuous learning and development
- Lead the design, development, and deployment of scalable cloud-based data infrastructure and applications
- Drive end-to-end execution of complex data engineering projects
- Partner with Data Scientists, ML Engineers, and business stakeholders to understand data needs and translate them into scalable engineering solutions
- Align technical strategy with business goals through effective communication and collaboration
- Implement and enforce best practices for data security, privacy, and compliance with regulatory standards
- Optimize data storage, processing, and retrieval for improved performance and cost efficiency
- Continuously evaluate and improve the system architecture and workflows
- Stay current with emerging trends and technologies in cloud data engineering
- Recommend and adopt tools, frameworks, and platforms that enhance productivity and reliability

Qualifications:
- Bachelor's degree in Computer Science or a related field
- Minimum 8 years of experience in software development, with at least 2 years in a technical leadership or management role
- Proven experience as a full-stack developer, with a focus on cloud platforms
- Proficient in programming languages such as Python
- Strong hands-on expertise with Python frameworks (Django, Flask, or FastAPI), RESTful APIs, React.js, and modern JavaScript
- Experience with authentication and authorization (OAuth, JWT)
- Strong understanding of cloud services, preferably AWS, and experience building cloud-native platforms using containerization technologies such as Kubernetes, Docker, and Helm

Preferred Qualifications:
- Knowledge of data warehouse solutions (BigQuery, Snowflake, Druid) and big data technologies such as Spark, Kafka, Hive, Iceberg, Trino, and Flink
- Experience with big data technologies (Hadoop, Spark, etc.)
- Experience with streaming data technologies (Kafka, Kinesis)
- Experience building data streaming solutions using Apache Spark, Apache Storm, Flink, or Flume
- Familiarity with machine learning pipelines is an added advantage
- Proven ability to deliver complex, high-scale systems in a production environment
- Strong people management and cross-functional collaboration skills
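The "OAuth, JWT" requirement above refers to token-based authentication. As a minimal sketch of the token mechanics only (not any particular framework's API), here is a hand-rolled HS256 JWT signer/verifier using the Python standard library; in production you would use a maintained library such as PyJWT instead:

```python
import base64
import hashlib
import hmac
import json

def _b64url(data: bytes) -> str:
    # JWTs use unpadded URL-safe base64 for each segment.
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def sign_jwt(payload: dict, secret: bytes) -> str:
    # A JWT is header.payload.signature, each base64url-encoded.
    header = _b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    body = _b64url(json.dumps(payload).encode())
    signing_input = f"{header}.{body}".encode()
    sig = _b64url(hmac.new(secret, signing_input, hashlib.sha256).digest())
    return f"{header}.{body}.{sig}"

def verify_jwt(token: str, secret: bytes) -> bool:
    # Recompute the HMAC over header.payload and compare in constant time.
    header, body, sig = token.split(".")
    signing_input = f"{header}.{body}".encode()
    expected = _b64url(hmac.new(secret, signing_input, hashlib.sha256).digest())
    return hmac.compare_digest(sig, expected)

token = sign_jwt({"sub": "user-42"}, b"secret")
assert verify_jwt(token, b"secret")
assert not verify_jwt(token, b"wrong-secret")
```

The same verification step is what middleware in Django, Flask, or FastAPI performs on each incoming request before authorizing it.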
Posted 1 month ago
3 - 6 years
8 - 12 Lacs
Bengaluru
Work from Office
As a Software Engineer, you will work closely with cross-functional teams to understand business requirements, design scalable solutions, and ensure the integrity and availability of our data. The ideal candidate will have a deep understanding of cloud technologies, UI technologies, software engineering best practices, and a proven track record of successfully delivering complex projects.

Responsibilities:
- Lead the design and implementation of cloud-based data architectures
- Collaborate with data scientists, analysts, and business stakeholders to understand requirements
- Stay current with industry trends and emerging technologies in cloud engineering

Qualifications:
- Bachelor's degree in Computer Science or an equivalent field, with 3-6 years of hands-on programming experience
- Proven experience as a full-stack engineer, with a focus on cloud platforms
- Strong proficiency in cloud services such as AWS, Azure, or Google Cloud
- Expertise in building UI and data integration services
- Proficient in programming languages such as Python, Java, Scala, GoLang, and JavaScript
- Knowledge of data warehouse solutions (Redshift, BigQuery, Snowflake, Druid)
- Experience with streaming UI technologies
- Experience building data streaming solutions using Apache Spark, Storm, Flink, or Flume

Preferred Qualifications:
- Certification in cloud platforms
- Knowledge of machine learning and data science concepts
- Contributions to the open source community
Posted 1 month ago
7 - 11 years
50 - 60 Lacs
Mumbai, Delhi / NCR, Bengaluru
Work from Office
Role: Resident Solution Architect
Location: Remote

The Solution Architect at Koantek builds secure, highly scalable big data solutions to achieve tangible, data-driven outcomes, all while keeping simplicity and operational effectiveness in mind. This role collaborates with teammates, product teams, and cross-functional project teams to lead the adoption and integration of the Databricks Lakehouse Platform into the enterprise ecosystem and AWS/Azure/GCP architecture. This role is responsible for implementing securely architected big data solutions that are operationally reliable, performant, and deliver on strategic initiatives.

Specific requirements for the role include:
- Expert-level knowledge of data frameworks, data lakes, and open-source projects such as Apache Spark, MLflow, and Delta Lake
- Expert-level hands-on coding experience in Python, SQL, Spark/Scala, or PySpark
- In-depth understanding of Spark architecture, including Spark Core, Spark SQL, DataFrames, Spark Streaming, RDD caching, and Spark MLlib
- IoT/event-driven/microservices in the cloud; experience with private and public cloud architectures, their pros and cons, and migration considerations
- Extensive hands-on experience implementing data migration and data processing using AWS/Azure/GCP services
- Extensive hands-on experience with the industry technology stack for data management, data ingestion, capture, processing, and curation: Kafka, StreamSets, Attunity, GoldenGate, MapReduce, Hadoop, Hive, HBase, Cassandra, Spark, Flume, Impala, etc.
- Experience using Azure DevOps and CI/CD, as well as Agile tools and processes including Git, Jenkins, Jira, and Confluence
- Experience creating tables, partitioning, bucketing, and loading and aggregating data using Spark SQL/Scala
- Able to build ingestion into ADLS and enable a BI layer for analytics, with a strong understanding of data modeling and of defining conceptual, logical, and physical data models
- Proficient-level experience with architecture design, build, and optimization of big data collection, ingestion, storage, processing, and visualization

Responsibilities:
- Work closely with team members to lead and drive enterprise solutions, advising on key decision points, trade-offs, best practices, and risk mitigation
- Guide customers in transforming big data projects, including development and deployment of big data and AI applications
- Promote, emphasize, and leverage big data solutions to deploy performant systems that appropriately auto-scale, are highly available, fault-tolerant, self-monitoring, and serviceable
- Use a defense-in-depth approach in designing data solutions and AWS/Azure/GCP infrastructure
- Assist and advise data engineers in the preparation and delivery of raw data for prescriptive and predictive modeling
- Aid developers in identifying, designing, and implementing process improvements with automation tools to optimize data delivery
- Implement processes and systems to monitor data quality and security, ensuring production data is accurate and available for key stakeholders and the business processes that depend on it
- Employ change management best practices to ensure that data remains readily accessible to the business
- Implement reusable design templates and solutions to integrate, automate, and orchestrate cloud operational needs; experience with MDM using data governance solutions

Qualifications:
- Overall experience of 12+ years in the IT field
- Hands-on experience designing and implementing multi-tenant solutions using Azure Databricks for data governance, data pipelines for near real-time data warehousing, and machine learning solutions
- Design and development experience with scalable and cost-effective Microsoft Azure/AWS/GCP data architecture and related solutions
- Experience in software development, data engineering, or data analytics using Python, Scala, Spark, Java, or equivalent technologies
- Bachelor's or Master's degree in Big Data, Computer Science, Engineering, Mathematics, or a similar area of study, or equivalent work experience

Good to have:
- Advanced technical certifications: Azure Solutions Architect Expert; AWS Certified Data Analytics; DASCA Big Data Engineering and Analytics; AWS Certified Cloud Practitioner, Solutions Architect; Google Cloud Certified Professional

Location: Mumbai, Delhi / NCR, Bengaluru, Kolkata, Chennai, Hyderabad, Ahmedabad, Pune, Remote
Posted 1 month ago
1 - 5 years
3 - 7 Lacs
Allahabad, Noida
Work from Office
Feather Thread Corporation is looking for a Big Data Administrator to join our dynamic team and embark on a rewarding career journey.

Office Management:
- Oversee general office operations, including maintenance of office supplies, equipment, and facilities
- Manage incoming and outgoing correspondence, including mail, email, and phone calls
- Coordinate meetings, appointments, and travel arrangements for staff members as needed

Administrative Support:
- Provide administrative support to management and staff, including scheduling meetings, preparing documents, and organizing files
- Assist with the preparation of reports, presentations, and other materials for internal and external stakeholders
- Maintain accurate records and databases, ensuring data integrity and confidentiality

Communication and Coordination:
- Serve as a point of contact for internal and external stakeholders, including clients, vendors, and partners
- Facilitate communication between departments and team members, ensuring timely and effective information flow
- Coordinate logistics for company events, meetings, and conferences

Documentation and Compliance:
- Assist with the development and implementation of company policies, procedures, and guidelines
- Maintain compliance with regulatory requirements and industry standards
- Ensure proper documentation and record-keeping practices are followed

Project Support:
- Provide support to project teams by assisting with project coordination, documentation, and tracking of tasks and deadlines
- Collaborate with team members to ensure project deliverables are met on time and within budget
Posted 2 months ago
2 - 7 years
4 - 8 Lacs
Chennai
Work from Office
Responsibilities:
- Be instrumental in understanding the requirements and design of the product/software
- Develop software solutions by studying information needs, systems flow, data usage, and work processes
- Investigate problem areas, following the software development life cycle
- Facilitate root cause analysis of system issues and problem statements
- Identify ideas to improve system performance and impact availability
- Analyze client requirements and convert requirements into feasible designs
- Collaborate with functional teams or systems analysts who carry out the detailed investigation into software requirements
- Confer with project managers to obtain information on software capabilities
Posted 2 months ago
3 - 7 years
1 - 5 Lacs
Telangana
Work from Office
Location: Chennai and Hyderabad preferred, but the customer is willing to take resources from Hyderabad. Experience: 5 to 8 years (U3); the posting also lists 5-10 years. Location: Hyderabad / Chennai.

- Proven experience as a development data engineer or in a similar role, with an ETL background
- Experience with data integration/ETL best practices and data quality principles
- Play a crucial role in ensuring the quality and reliability of the data by designing, implementing, and executing comprehensive testing
- Going over the user stories, build the comprehensive code base and business rules for testing and validation of the data
- Knowledge of continuous integration and continuous deployment (CI/CD) pipelines
- Familiarity with Agile/Scrum development methodologies
- Excellent analytical and problem-solving skills
- Strong communication and collaboration skills
- Experience with big data technologies (Hadoop, Spark, Hive)
Posted 2 months ago
2 - 6 years
5 - 9 Lacs
Uttar Pradesh
Work from Office
- Proven experience as a development data engineer or in a similar role, with an ETL background
- Experience with data integration/ETL best practices and data quality principles
- Play a crucial role in ensuring the quality and reliability of the data by designing, implementing, and executing comprehensive testing
- Going over the user stories, build the comprehensive code base and business rules for testing and validation of the data
- Knowledge of continuous integration and continuous deployment (CI/CD) pipelines
- Familiarity with Agile/Scrum development methodologies
- Excellent analytical and problem-solving skills
- Strong communication and collaboration skills
- Experience with big data technologies (Hadoop, Spark, Hive)
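The "business rules for testing and validation of the data" mentioned above can be sketched as a small rule-driven validator. The field names and rules below are hypothetical, purely to illustrate turning business rules from user stories into executable, repeatable checks:

```python
# Hypothetical data-quality rules of the kind an ETL test suite codifies:
# each rule inspects one record and reports which check it failed.
RULES = [
    ("order_id is required", lambda r: r.get("order_id") is not None),
    ("amount is non-negative", lambda r: r.get("amount", 0) >= 0),
    ("currency is a known code", lambda r: r.get("currency") in {"INR", "USD", "EUR"}),
]

def validate(records):
    """Return a list of (row_index, failed_rule_name) pairs."""
    failures = []
    for i, record in enumerate(records):
        for name, check in RULES:
            if not check(record):
                failures.append((i, name))
    return failures

rows = [
    {"order_id": 1, "amount": 250.0, "currency": "INR"},     # clean row
    {"order_id": None, "amount": -5.0, "currency": "XYZ"},   # fails all three
]
print(validate(rows))
```

In a real pipeline the same rules would run inside a CI/CD stage against each data load, failing the build when records violate them.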
Posted 2 months ago
5 - 7 years
4 - 8 Lacs
Bengaluru
Work from Office
Job Title: Hadoop Admin

Responsibilities: A day in the life of an Infoscion: as part of the Infosys consulting team, your primary role would be to get to the heart of customer issues, diagnose problem areas, design innovative solutions, and facilitate deployment resulting in client delight. You will develop a proposal by owning parts of the proposal document and by giving inputs into solution design based on your areas of expertise. You will plan configuration activities, configure the product as per the design, conduct conference room pilots, and assist in resolving any queries related to requirements and solution design. You will conduct solution/product demonstrations and POC/Proof of Technology workshops, and prepare effort estimates that suit the customer's budgetary requirements and are in line with the organization's financial guidelines. You will actively lead small projects and contribute to unit-level and organizational initiatives with the objective of providing high-quality, value-adding solutions to customers. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!
Technical and Professional Requirements:
- Primary skills: Technology -> Big Data - Hadoop -> Hadoop Administration

Preferred Skills:
- Technology -> Big Data - Hadoop -> Hadoop Administration

Additional Responsibilities:
- Ability to develop value-creating strategies and models that enable clients to innovate, drive growth, and increase their business profitability
- Good knowledge of software configuration management systems
- Awareness of the latest technologies and industry trends
- Logical thinking and problem-solving skills, along with an ability to collaborate
- Understanding of financial processes for various types of projects and the various pricing models available
- Ability to assess current processes, identify improvement areas, and suggest technology solutions
- Knowledge of one or two industry domains
- Client interfacing skills
- Project and team management

Educational Requirements: Bachelor of Engineering
Service Line: Data & Analytics Unit

* Location of posting is subject to business requirements
Posted 2 months ago
6 - 11 years
8 - 14 Lacs
Hyderabad
Work from Office
Job Role:
- Strong Spark programming experience with Java
- Good knowledge of SQL query writing and shell scripting
- Experience working in Agile mode
- Analyze, design, develop, deploy, and operate high-performance, high-quality services that serve users in a cloud environment
- Good understanding of the client ecosystem and expectations
- In charge of code reviews, the integration process, test organization, and quality of delivery
- Take part in development
- Experienced in writing queries using SQL commands
- Experienced in deploying and operating code in a cloud environment
- Experienced in working without much supervision

Your Profile:
- Primary skills: Java, Spark, SQL
- Secondary skills (good to have): Hadoop or any cloud technology, Kafka, or BO

What you'll love about working here: Choosing Capgemini means having the opportunity to make a difference, whether for the world's leading businesses or for society. It means getting the support you need to shape your career in the way that works for you. It means that when the future doesn't look as bright as you'd like, you have the opportunity to make change, to rewrite it. When you join Capgemini, you don't just start a new job. You become part of something bigger: a diverse collective of free-thinkers, entrepreneurs, and experts, all working together to unleash human energy through technology, for an inclusive and sustainable future. At Capgemini, people are at the heart of everything we do! You can exponentially grow your career by being part of innovative projects and taking advantage of our extensive Learning & Development programs. With us, you will experience an inclusive, safe, healthy, and flexible work environment that brings out the best in you! You also get a chance to make positive social change and build a better world by taking an active role in our Corporate Social Responsibility and Sustainability initiatives. And while you make a difference, you will also have a lot of fun.
Posted 2 months ago
2 - 7 years
4 - 9 Lacs
Andhra Pradesh
Work from Office
JD: 7+ years of hands-on experience in Python, especially with Pandas and NumPy. Good hands-on experience in Spark (PySpark and Spark SQL). Hands-on experience with Databricks: Unity Catalog, Delta Lake, the Lakehouse Platform, and the Medallion Architecture; Azure Data Factory; ADLS. Experience working with the Parquet and JSON file formats. Knowledge of Snowflake.
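The Medallion Architecture named above stages data through bronze (raw), silver (cleaned and typed), and gold (aggregated) layers. As a toy sketch of that flow using standard-library Python over JSON records (on Databricks this would typically be PySpark over Delta tables; the record fields here are invented):

```python
import json
from collections import defaultdict

# Bronze: raw JSON records as they arrive (hypothetical sales events).
bronze = [
    '{"store": "BLR-1", "amount": "120.5"}',
    '{"store": "BLR-1", "amount": "79.5"}',
    '{"store": null, "amount": "10"}',  # malformed: missing store
]

# Silver: parse, type-cast, and drop records that fail basic checks.
silver = []
for raw in bronze:
    rec = json.loads(raw)
    if rec["store"] is None:
        continue  # quarantine/drop bad records at the silver stage
    rec["amount"] = float(rec["amount"])
    silver.append(rec)

# Gold: business-level aggregate (revenue per store).
gold = defaultdict(float)
for rec in silver:
    gold[rec["store"]] += rec["amount"]

print(dict(gold))  # {'BLR-1': 200.0}
```

Each layer only ever reads from the layer below it, which is what makes the staged pipeline reproducible and auditable.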
Posted 2 months ago
4 - 6 years
6 - 8 Lacs
Pune
Work from Office
Capgemini Invent: Capgemini Invent is the digital innovation, consulting, and transformation brand of the Capgemini Group, a global business line that combines market-leading expertise in strategy, technology, data science, and creative design to help CxOs envision and build what's next for their businesses.

Your role:
- Analyse and organize raw data
- Build data systems and pipelines
- Evaluate business needs and objectives
- Interpret trends and patterns
- Conduct complex data analysis and report on results
- Prepare data for prescriptive and predictive modelling
- Build algorithms and prototypes
- Combine raw information from different sources
- Explore ways to enhance data quality and reliability
- Identify opportunities for data acquisition
- Develop analytical tools and programs
- Collaborate with data scientists and architects on several projects
- Participate in code peer reviews to ensure our applications comply with best practices

Your Profile:
- Experience with big data tools: Hadoop, Spark, Kafka, Sqoop, Flume, Hive, etc.
- Experience with relational SQL and NoSQL databases, including Postgres, Cassandra, SQL Server, Oracle, and Snowflake
- Experience with data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc.
- Experience with cloud platforms: Azure, AWS, or GCP
- Experience with stream-processing systems: Storm, Spark Streaming, etc.
- Experience with object-oriented/object function scripting languages: Python, Java, C++, Scala, etc.
- Must have hands-on experience in DevOps and CI/CD deployments
- Should know basic and advanced SQL and be able to write complex queries
- Strong experience in data warehousing and dimensional modelling
- Should be a very good team player, able to work in a geographically dispersed team

What you will love about working here: We recognize the significance of flexible work arrangements to provide support. Be it remote work or flexible work hours, you will get an environment to maintain a healthy work-life balance. At the heart of our mission is your career growth. Our array of career growth programs and diverse professions are crafted to support you in exploring a world of opportunities. Equip yourself with valuable certifications in the latest technologies such as Generative AI.

About Capgemini: Capgemini is a global business and technology transformation partner, helping organizations accelerate their dual transition to a digital and sustainable world, while creating tangible impact for enterprises and society. It is a responsible and diverse group of 340,000 team members in more than 50 countries. With its strong over-55-year heritage, Capgemini is trusted by its clients to unlock the value of technology to address the entire breadth of their business needs. It delivers end-to-end services and solutions, leveraging strengths from strategy and design to engineering, all fueled by its market-leading capabilities in AI, cloud, and data, combined with its deep industry expertise and partner ecosystem. The Group reported 2023 global revenues of €22.5 billion.
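Workflow managers like Azkaban, Luigi, and Airflow (listed in the profile above) all model a pipeline as a directed acyclic graph of dependent tasks. A minimal standard-library sketch of that scheduling idea follows; the task names are invented, and a real Airflow DAG would use Airflow's own `DAG`/operator API rather than this:

```python
from graphlib import TopologicalSorter

# A tiny pipeline expressed as task -> set of upstream dependencies,
# mirroring how workflow managers model pipelines as DAGs.
dag = {
    "extract": set(),
    "clean": {"extract"},
    "aggregate": {"clean"},
    "load_warehouse": {"aggregate"},
    "refresh_dashboard": {"load_warehouse"},
}

# static_order() yields an execution order that respects every dependency;
# a real scheduler would additionally run independent tasks in parallel,
# retry failures, and backfill historical runs.
order = list(TopologicalSorter(dag).static_order())
print(order)
# ['extract', 'clean', 'aggregate', 'load_warehouse', 'refresh_dashboard']
```

The value of declaring the DAG up front is that the scheduler, not the task code, owns ordering, retries, and parallelism.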
Posted 2 months ago
5 - 10 years
9 - 13 Lacs
Bengaluru
Work from Office
We are looking for Java developers with the following skills for the Bangalore location: strong Java developers (able to read and debug code) and scripting (Python or Perl programming) experts. Good-to-have skills: big data pipelines, Spark, Hadoop, HBase. Candidates should have strong debugging skills and a minimum of 5+ years of experience.

Location: Bangalore
Experience: 5-10 years
Notice Period: 0-60 days
Posted 3 months ago
3 - 5 years
5 - 7 Lacs
Pune
Work from Office
Job Title: Hadoop Developer

Responsibilities: A day in the life of an Infoscion: as part of the Infosys delivery team, your primary role would be to interface with the client for quality assurance, issue resolution, and ensuring high customer satisfaction. You will understand requirements, create and review designs, validate the architecture, and ensure high levels of service offerings to clients in the technology domain. You will participate in project estimation, provide inputs for solution delivery, conduct technical risk planning, and perform code reviews and unit test plan reviews. You will lead and guide your teams toward developing optimized, high-quality code deliverables, continual knowledge management, and adherence to the organizational guidelines and processes. You would be a key contributor to building efficient programs/systems. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!
Technical and Professional Requirements:
- Technology -> Big Data - Hadoop -> Hadoop

Preferred Skills:
- Technology -> Big Data - Hadoop -> Hadoop

Additional Responsibilities:
- Knowledge of more than one technology
- Basics of architecture and design fundamentals
- Knowledge of testing tools
- Knowledge of agile methodologies
- Understanding of project life cycle activities on development and maintenance projects
- Understanding of one or more estimation methodologies; knowledge of quality processes
- Basics of the business domain, to understand the business requirements
- Analytical abilities, strong technical skills, and good communication skills
- Good understanding of the technology and domain
- Ability to demonstrate a sound understanding of software quality assurance principles, SOLID design principles, and modelling methods
- Awareness of the latest technologies and trends
- Excellent problem-solving, analytical, and debugging skills

Educational Requirements: Bachelor of Engineering
Service Line: Data & Analytics Unit

* Location of posting is subject to business requirements
Posted 3 months ago
6 - 10 years
13 - 17 Lacs
Pune
Work from Office
About Persistent: We are a trusted Digital Engineering and Enterprise Modernization partner, combining deep technical expertise and industry experience to help our clients anticipate what's next. Our offerings and proven solutions create a unique competitive advantage for our clients by giving them the power to see beyond and rise above. We work with many industry-leading organizations across the world, including 12 of the 30 most innovative US companies, 80% of the largest banks in the US and India, and numerous innovators across the healthcare ecosystem. Our growth trajectory continues, as we reported $1,231M annual revenue (16% Y-o-Y). Along with our growth, we've onboarded over 4,900 new employees in the past year, bringing our total employee count to over 23,500 people located in 19 countries across the globe. Persistent Ltd. is dedicated to fostering diversity and inclusion in the workplace. We invite applications from all qualified individuals, including those with disabilities, and regardless of gender or gender preference. We welcome diverse candidates from all backgrounds. For more details, please visit www.persistent.com.

About The Position: We are looking for a Big Data Lead who will be responsible for the management of data sets that are too big for traditional database systems to handle. You will create, design, and implement data processing jobs in order to transform the data into a more usable format. You will also ensure that the data is secure and complies with industry standards to protect the company's information.

What You'll Do:
- Manage the customer's priorities across projects and requests
- Assess customer needs using a structured requirements process (gathering, analyzing, documenting, and managing changes) to prioritize immediate business needs, advising on options, risks, and cost
- Design and implement software products (Big Data related), including data models and visualizations
- Participate actively in the teams you work in
- Deliver good solutions against tight timescales
- Be proactive, suggest new approaches, and develop your capabilities
- Share what you are good at while learning from others to improve the team overall
- Demonstrate a working understanding of a range of technical skills, attitudes, and behaviors
- Deliver great solutions, focused on driving value back into the business

Expertise You'll Bring:
- 6 years' experience designing and developing enterprise application solutions for distributed systems
- Understanding of Big Data Hadoop ecosystem components (Sqoop, Hive, Pig, Flume)
- Additional experience working with Hadoop, HDFS, and cluster management; Hive, Pig, and MapReduce; and Hadoop ecosystem frameworks: HBase, Talend, NoSQL databases
- Apache Spark or other streaming Big Data processing preferred; Java or Big Data technologies a plus

Benefits:
- Competitive salary and benefits package
- Culture focused on talent development, with quarterly promotion cycles and company-sponsored higher education and certifications
- Opportunity to work with cutting-edge technologies
- Employee engagement initiatives such as project parties, flexible work hours, and Long Service awards
- Annual health check-ups
- Insurance coverage: group term life, personal accident, and Mediclaim hospitalization for self, spouse, two children, and parents

Inclusive Environment:
- We offer hybrid work options and flexible working hours to accommodate various needs and preferences
- Our office is equipped with accessible facilities, including adjustable workstations, ergonomic chairs, and assistive technologies to support employees with physical disabilities

Let's unleash your full potential. See Beyond, Rise Above.
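The MapReduce model referenced in the expertise list above splits work into a map phase, a shuffle that groups intermediate results by key, and a reduce phase. A toy single-process word count sketching those phases (real Hadoop distributes each phase across a cluster):

```python
from collections import defaultdict

def map_phase(lines):
    # Emit (word, 1) pairs, like a Hadoop mapper.
    for line in lines:
        for word in line.lower().split():
            yield word, 1

def shuffle(pairs):
    # Group values by key, like the framework's shuffle/sort step.
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(grouped):
    # Sum each key's values, like a Hadoop reducer.
    return {word: sum(counts) for word, counts in grouped.items()}

lines = ["big data big clusters", "data pipelines"]
counts = reduce_phase(shuffle(map_phase(lines)))
print(counts)  # {'big': 2, 'data': 2, 'clusters': 1, 'pipelines': 1}
```

Because mappers and reducers are pure functions over key-value pairs, the framework can partition both phases across machines without changing the program's logic.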
Posted 3 months ago
6 - 10 years
13 - 17 Lacs
Bengaluru
Work from Office
About Persistent: We are a trusted Digital Engineering and Enterprise Modernization partner, combining deep technical expertise and industry experience to help our clients anticipate what's next. Our offerings and proven solutions create a unique competitive advantage for our clients by giving them the power to see beyond and rise above. We work with many industry-leading organizations across the world, including 12 of the 30 most innovative US companies, 80% of the largest banks in the US and India, and numerous innovators across the healthcare ecosystem. Our growth trajectory continues, as we reported $1,231M annual revenue (16% Y-o-Y). Along with our growth, we've onboarded over 4,900 new employees in the past year, bringing our total employee count to over 23,500 people located in 19 countries across the globe. Persistent Ltd. is dedicated to fostering diversity and inclusion in the workplace. We invite applications from all qualified individuals, including those with disabilities, and regardless of gender or gender preference. We welcome diverse candidates from all backgrounds. For more details, please visit www.persistent.com.

About The Position: We are looking for a Big Data Lead who will be responsible for the management of data sets that are too big for traditional database systems to handle. You will create, design, and implement data processing jobs in order to transform the data into a more usable format. You will also ensure that the data is secure and complies with industry standards to protect the company's information.

What You'll Do:
- Manage the customer's priorities across projects and requests
- Assess customer needs using a structured requirements process (gathering, analyzing, documenting, and managing changes) to prioritize immediate business needs, advising on options, risks, and cost
- Design and implement software products (Big Data related), including data models and visualizations
- Participate actively in the teams you work in
- Deliver good solutions against tight timescales
- Be proactive, suggest new approaches, and develop your capabilities
- Share what you are good at while learning from others to improve the team overall
- Demonstrate a working understanding of a range of technical skills, attitudes, and behaviors
- Deliver great solutions, focused on driving value back into the business

Expertise You'll Bring:
- 6 years' experience designing and developing enterprise application solutions for distributed systems
- Understanding of Big Data Hadoop ecosystem components (Sqoop, Hive, Pig, Flume)
- Additional experience working with Hadoop, HDFS, and cluster management; Hive, Pig, and MapReduce; and Hadoop ecosystem frameworks: HBase, Talend, NoSQL databases
- Apache Spark or other streaming Big Data processing preferred; Java or Big Data technologies a plus

Benefits:
- Competitive salary and benefits package
- Culture focused on talent development, with quarterly promotion cycles and company-sponsored higher education and certifications
- Opportunity to work with cutting-edge technologies
- Employee engagement initiatives such as project parties, flexible work hours, and Long Service awards
- Annual health check-ups
- Insurance coverage: group term life, personal accident, and Mediclaim hospitalization for self, spouse, two children, and parents

Inclusive Environment:
- We offer hybrid work options and flexible working hours to accommodate various needs and preferences
- Our office is equipped with accessible facilities, including adjustable workstations, ergonomic chairs, and assistive technologies to support employees with physical disabilities

Let's unleash your full potential. See Beyond, Rise Above.
Posted 3 months ago
6 - 10 years
13 - 17 Lacs
Hyderabad
Work from Office
About Persistent We are a trusted Digital Engineering and Enterprise Modernization partner, combining deep technical expertise and industry experience to help our clients anticipate what’s next. Our offerings and proven solutions create a unique competitive advantage for our clients by giving them the power to see beyond and rise above. We work with many industry-leading organizations across the world including 12 of the 30 most innovative US companies, 80% of the largest banks in the US and India, and numerous innovators across the healthcare ecosystem. Our growth trajectory continues, as we reported $1,231M annual revenue (16% Y-o-Y). Along with our growth, we’ve onboarded over 4900 new employees in the past year, bringing our total employee count to over 23,500+ people located in 19 countries across the globe. Persistent Ltd. is dedicated to fostering diversity and inclusion in the workplace. We invite applications from all qualified individuals, including those with disabilities, and regardless of gender or gender preference. We welcome diverse candidates from all backgrounds. For more details please login to www.persistent.com About The Position We are looking for a Big Data Lead who will be responsible for the management of data sets that are too big for traditional database systems to handle. You will create, design, and implement data processing jobs in order to transform the data into a more usable format. You will also ensure that the data is secure and complies with industry standards to protect the company?s information. 
What You'll Do
• Manage the customer's priorities across projects and requests
• Assess customer needs using a structured requirements process (gathering, analyzing, documenting, and managing changes) to prioritize immediate business needs, and advise on options, risks, and cost
• Design and implement software products (Big Data related), including data models and visualizations
• Participate actively in the teams you work in
• Deliver good solutions against tight timescales
• Be proactive, suggest new approaches, and develop your capabilities
• Share what you are good at while learning from others to improve the team overall
• Demonstrate a solid grounding in a range of technical skills, attitudes, and behaviors
• Deliver great solutions, focused on driving value back into the business
Expertise You'll Bring
• 6 years' experience in designing and developing enterprise application solutions for distributed systems
• Understanding of Big Data Hadoop ecosystem components (Sqoop, Hive, Pig, Flume)
• Additional experience with Hadoop, HDFS, and cluster management; Hive, Pig, and MapReduce; and Hadoop ecosystem frameworks such as HBase, Talend, and NoSQL databases
• Apache Spark or other streaming Big Data processing preferred; Java or other Big Data technologies a plus
Benefits
• Competitive salary and benefits package
• Culture focused on talent development, with quarterly promotion cycles and company-sponsored higher education and certifications
• Opportunity to work with cutting-edge technologies
• Employee engagement initiatives such as project parties, flexible work hours, and Long Service awards
• Annual health check-ups
• Insurance coverage: group term life, personal accident, and Mediclaim hospitalization for self, spouse, two children, and parents
Inclusive Environment
• We offer hybrid work options and flexible working hours to accommodate various needs and preferences.
• Our office is equipped with accessible facilities, including adjustable workstations, ergonomic chairs, and assistive technologies to support employees with physical disabilities.
Let's unleash your full potential. See Beyond, Rise Above
Posted 3 months ago
4 - 6 years
6 - 8 Lacs
Mumbai
Work from Office
Capgemini Invent
Capgemini Invent is the digital innovation, consulting, and transformation brand of the Capgemini Group, a global business line that combines market-leading expertise in strategy, technology, data science, and creative design to help CxOs envision and build what's next for their businesses.
Your Role
• Analyse and organize raw data.
• Build data systems and pipelines.
• Evaluate business needs and objectives.
• Interpret trends and patterns.
• Conduct complex data analysis and report on results.
• Prepare data for prescriptive and predictive modelling.
• Build algorithms and prototypes.
• Combine raw information from different sources.
• Explore ways to enhance data quality and reliability.
• Identify opportunities for data acquisition.
• Develop analytical tools and programs.
• Collaborate with data scientists and architects on several projects.
• Participate in peer code reviews to ensure our applications comply with best practices.
Your Profile
• Experience with any Big Data tools: Hadoop, Spark, Kafka, Sqoop, Flume, Hive, etc.
• Experience with relational SQL and NoSQL databases, including Postgres, Cassandra, SQL Server, Oracle, and Snowflake.
• Experience with data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc.
• Experience with cloud platforms: Azure, AWS, or GCP.
• Experience with stream-processing systems: Storm, Spark Streaming, etc.
• Experience with object-oriented/object function scripting languages: Python, Java, C++, Scala, etc.
• Must have hands-on experience in DevOps and CI/CD deployments.
• Should know basic and advanced SQL and be able to write complex queries.
• Strong experience in data warehousing and dimensional modelling.
• Should be a very good team player, able to work in a geographically dispersed team.
What you will love about working here
We recognize the significance of flexible work arrangements to provide support. Be it remote work or flexible work hours, you will get an environment that lets you maintain a healthy work-life balance.
At the heart of our mission is your career growth. Our array of career growth programs and diverse professions are crafted to support you in exploring a world of opportunities. Equip yourself with valuable certifications in the latest technologies, such as Generative AI.
About Capgemini
Capgemini is a global business and technology transformation partner, helping organizations accelerate their dual transition to a digital and sustainable world while creating tangible impact for enterprises and society. It is a responsible and diverse group of 340,000 team members in more than 50 countries. With its strong heritage of over 55 years, Capgemini is trusted by its clients to unlock the value of technology to address the entire breadth of their business needs. It delivers end-to-end services and solutions, leveraging strengths from strategy and design to engineering, all fueled by its market-leading capabilities in AI, cloud, and data, combined with its deep industry expertise and partner ecosystem. The Group reported 2023 global revenues of €22.5 billion.
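The role above calls for building pipelines that combine raw information from different sources while improving data quality. As a minimal illustrative sketch (not Capgemini code — the source names, the `customer_id` key, and the last-write-wins rule are all assumptions), a merge-and-deduplicate step in plain Python might look like:

```python
from datetime import datetime

def combine_sources(crm_rows, web_rows):
    """Merge raw records from two hypothetical sources, deduplicating on a shared
    key and keeping the most recently updated copy of each record."""
    merged = {}
    for row in crm_rows + web_rows:
        key = row["customer_id"]
        current = merged.get(key)
        if current is None or row["updated_at"] > current["updated_at"]:
            merged[key] = row
    # Return in a deterministic order for downstream steps.
    return sorted(merged.values(), key=lambda r: r["customer_id"])

crm = [{"customer_id": 1, "email": "a@x.com", "updated_at": datetime(2024, 1, 5)}]
web = [
    {"customer_id": 1, "email": "a@new.com", "updated_at": datetime(2024, 2, 1)},
    {"customer_id": 2, "email": "b@x.com", "updated_at": datetime(2024, 1, 9)},
]
rows = combine_sources(crm, web)
```

In a production pipeline the same logic would typically be expressed as a join or window in Spark or a workflow tool such as Airflow, but the dedup-by-key idea is the same.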
Posted 3 months ago
10 - 15 years
12 - 17 Lacs
Hyderabad
Work from Office
• Minimum 10 years' experience in design, architecture, or development in Analytics and Data Warehousing
• Experience in solution design, solution governance, and implementing end-to-end Big Data solutions using Hadoop ecosystems (Hive, HDFS, Pig, HBase, Flume, Kafka, Sqoop, YARN, Impala)
• Ability to produce semantic, conceptual, logical, and physical data models using data modelling techniques such as Data Vault, Dimensional Modelling, 3NF, etc.
• Ability to design data warehousing and enterprise analytics solutions using Teradata or relevant data platforms
• Demonstrable expertise in design patterns (FSLDM, IBM IFW DW) and data modelling frameworks, including dimensional, star, and non-dimensional schemas
• Commendable experience in consistently driving cost-effective and technologically feasible solutions, while steering solution decisions across the group to meet both operational and strategic goals, is essential
• Ability to positively influence the adoption of new products, solutions, and processes, aligning with the existing information architecture design, would be desirable
• Analytics & Data/BI architecture appreciation and broad experience across all technology disciplines, including project management, IT strategy development, and business process, information, and application architecture
• Extensive experience with Teradata data warehouses and Big Data platforms, both on-premises and in the cloud
• Extensive experience in large enterprise environments handling large volumes of data with high Service Level Agreements across various business functions/units
• Experience leading discussions and presentations, and driving decisions across groups of stakeholders
Posted 3 months ago
3 - 6 years
3 - 6 Lacs
Hyderabad
Work from Office
Hadoop Admin (1 position)
• Hadoop administration
• Automation (Ansible, shell scripting, or Python scripting)
• DevOps skills (should be able to code in at least one language, preferably Python)
Location: preferably Bangalore; otherwise Chennai, Pune, or Hyderabad
Working type: Remote
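The Hadoop admin role above pairs administration with Python scripting. As a small hedged sketch of what such automation might look like, the snippet below parses the datanode counts from `hdfs dfsadmin -report`-style output; in a real script the text would come from running the command (e.g. via `subprocess`), while here a hardcoded sample stands in:

```python
import re

# Sample output in the style of `hdfs dfsadmin -report` (abridged).
SAMPLE_REPORT = """\
Configured Capacity: 1099511627776 (1 TB)
DFS Remaining: 824633720832 (768 GB)
Live datanodes (3):
Dead datanodes (1):
"""

def parse_dfsadmin_report(text):
    """Extract live/dead datanode counts from dfsadmin report text."""
    live = re.search(r"Live datanodes \((\d+)\)", text)
    dead = re.search(r"Dead datanodes \((\d+)\)", text)
    return {
        "live": int(live.group(1)) if live else 0,
        "dead": int(dead.group(1)) if dead else 0,
    }

status = parse_dfsadmin_report(SAMPLE_REPORT)
```

A monitoring job could run this on a schedule (or as an Ansible task) and alert when the dead-node count is non-zero.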
Posted 3 months ago
4 - 8 years
6 - 10 Lacs
Hyderabad
Work from Office
JR REQ - Big Data Engineer - 4 to 8 years - HYD - Karuppiah Mg - TCS C2H - 900000
Posted 3 months ago
6 - 11 years
0 - 3 Lacs
Bengaluru
Work from Office
SUMMARY
This is a remote position.
Job Description: EMR Admin
We are seeking an experienced EMR Admin with expertise in Big Data services such as Hive, Metastore, HBase, and Hue. The ideal candidate should also possess knowledge of Terraform and Jenkins. Familiarity with Kerberos and Ansible would be an added advantage, although not mandatory. Candidates with Hadoop admin skills, proficiency in Terraform and Jenkins, and the ability to handle EMR Admin responsibilities are encouraged to apply.
Location: Remote
Experience: 6+ years
Must-have: at least 4 years in EMR administration
Requirements
• Proven experience in EMR administration
• Proficiency in Big Data services including Hive, Metastore, HBase, and Hue
• Knowledge of Terraform and Jenkins
• Familiarity with Kerberos and Ansible tools (preferred)
• Experience in Hadoop administration (preferred)
Posted 3 months ago
3 - 8 years
5 - 9 Lacs
Pune
Work from Office
Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Apache Hadoop
Good-to-have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years of full-time education
Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements using Apache Hadoop. Your typical day will involve working with the Hadoop ecosystem, developing and testing applications, and troubleshooting issues.
Roles & Responsibilities:
• Design, develop, and test applications using Apache Hadoop and related technologies.
• Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions.
• Troubleshoot and debug issues in the Hadoop ecosystem, including HDFS, MapReduce, Hive, and Pig.
• Ensure the performance, scalability, and reliability of applications by optimizing code and configurations.
Professional & Technical Skills:
• Must-have: experience with Apache Hadoop.
• Strong understanding of the Hadoop ecosystem, including HDFS, MapReduce, Hive, and Pig.
• Experience with the Java or Scala programming languages.
• Familiarity with SQL and NoSQL databases.
• Experience with data ingestion, processing, and analysis using Hadoop tools such as Sqoop, Flume, and Spark.
Additional Information:
• The candidate should have a minimum of 3 years of experience in Apache Hadoop.
• The ideal candidate will possess a strong educational background in computer science or a related field, along with a proven track record of delivering impactful data-driven solutions.
• This position is based at our Pune office.
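Since the role above centers on the Hadoop ecosystem and MapReduce, here is a minimal in-memory word-count sketch of the map and reduce phases in Python (the classic MapReduce example; in a real Hadoop Streaming job the mapper and reducer would read from stdin and emit tab-separated pairs, which this toy version skips):

```python
from collections import defaultdict

def map_phase(lines):
    """Mapper: emit a (word, 1) pair for every word in every input line."""
    for line in lines:
        for word in line.lower().split():
            yield word, 1

def reduce_phase(pairs):
    """Reducer: sum the counts per key. Hadoop sorts mapper output by key
    before reducing; accumulating into a dict gives the same totals here."""
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

lines = ["big data big pipelines", "data lakes"]
counts = reduce_phase(map_phase(lines))
```

The same logic scales out on a cluster because the map phase is embarrassingly parallel and the shuffle groups each key onto one reducer.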
Posted 3 months ago
3 - 5 years
3 - 8 Lacs
Noida
Work from Office
We are seeking a highly skilled and motivated Data Engineer to join our dynamic team. As a Data Engineer, you will collaborate closely with our Data Scientists to develop and deploy machine learning models. Proficiency in the skills listed below will be crucial in building and maintaining pipelines for training and inference datasets.
Responsibilities:
• Work in tandem with Data Scientists to design, develop, and implement machine learning pipelines.
• Utilize PySpark for data processing, transformation, and preparation for model training.
• Leverage AWS EMR and S3 for scalable and efficient data storage and processing.
• Implement and manage ETL workflows using StreamSets for data ingestion and transformation.
• Design and construct pipelines to deliver high-quality training and inference datasets.
• Collaborate with cross-functional teams to ensure smooth deployment and real-time/near-real-time inferencing capabilities.
• Optimize and fine-tune pipelines for performance, scalability, and reliability.
• Ensure IAM policies and permissions are appropriately configured for secure data access and management.
• Implement Spark architecture and optimize Spark jobs for scalable data processing.
Total Experience Expected: 4-6 years
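To illustrate the pipeline-construction responsibilities above, here is a hedged plain-Python stand-in for a training-dataset pipeline: ordered stages threaded over rows, where each stage is a pure function (in the actual role this would be PySpark DataFrame transformations on EMR; the stage names, fields, and bucketing rule below are invented for the example):

```python
def clean(rows):
    """Drop rows with missing labels - a typical early transform stage."""
    return [r for r in rows if r.get("label") is not None]

def add_features(rows):
    """Derive a feature column (here, the digit count of the amount as a
    crude magnitude bucket - a made-up feature for illustration)."""
    return [{**r, "amount_bucket": len(str(int(r["amount"])))} for r in rows]

def run_pipeline(rows, stages):
    """Thread the dataset through an ordered list of stages."""
    for stage in stages:
        rows = stage(rows)
    return rows

raw = [
    {"amount": 120.0, "label": 1},
    {"amount": 7.5, "label": None},   # dropped by clean()
    {"amount": 9800.0, "label": 0},
]
training = run_pipeline(raw, [clean, add_features])
```

Keeping each stage a pure function makes the pipeline easy to unit-test and to port stage-by-stage onto Spark later.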
Posted 3 months ago
3 - 8 years
5 - 10 Lacs
Noida
Work from Office
About The Role
Role Purpose: The purpose of this role is to interpret data and turn it into information (reports, dashboards, interactive visualizations, etc.) which can offer ways to improve a business, thus affecting business decisions.
Do
1. Manage the technical scope of the project in line with the requirements at all stages
a. Gather information from various sources (data warehouses, databases, data integration and modelling) and interpret patterns and trends
b. Develop record management processes and policies
c. Build and maintain relationships at all levels within the client base and understand their requirements
d. Provide sales data, proposals, data insights, and account reviews to the client base
e. Identify areas to increase efficiency and automation of processes
f. Set up and maintain automated data processes
g. Identify, evaluate, and implement external services and tools to support data validation and cleansing
h. Produce and track key performance indicators
2. Analyze the data sets and provide adequate information
a. Liaise with internal and external clients to fully understand data content
b. Design and carry out surveys, and analyze survey data as per the customer requirement
c. Analyze and interpret complex data sets relating to the customer's business and prepare reports for internal and external audiences using business analytics reporting tools
d. Create data dashboards, graphs, and visualizations to showcase business performance, and provide sector and competitor benchmarking
e. Mine and analyze large datasets, draw valid inferences, and present them successfully to management using a reporting tool
f. Develop predictive models and share insights with the clients as per their requirements
Deliver
No. | Performance Parameter | Measure
1. | Analyses data sets and provides relevant information to the client | No. of automations done, on-time delivery, CSAT score, zero customer escalations, data accuracy
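The role above revolves around producing and tracking key performance indicators. As a toy hedged sketch (the record fields and KPI definitions below are assumptions, not the employer's actual metrics), computing an on-time-delivery rate and a data-accuracy rate from delivery records could look like:

```python
def kpi_summary(deliveries):
    """Compute two illustrative KPIs from a list of delivery records:
    percentage delivered on time, and percentage with fully accurate data."""
    total = len(deliveries)
    on_time = sum(1 for d in deliveries if d["delivered_on_time"])
    accurate = sum(1 for d in deliveries if d["records_ok"] == d["records_total"])
    return {
        "on_time_pct": round(100.0 * on_time / total, 1),
        "accuracy_pct": round(100.0 * accurate / total, 1),
    }

sample = [
    {"delivered_on_time": True,  "records_ok": 100, "records_total": 100},
    {"delivered_on_time": True,  "records_ok": 98,  "records_total": 100},
    {"delivered_on_time": False, "records_ok": 100, "records_total": 100},
]
kpis = kpi_summary(sample)
```

In practice the same aggregation would run inside a BI or reporting tool, but the definitions of the metrics need to be pinned down just as explicitly.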
Posted 3 months ago