8 - 13 years
25 - 27 Lacs
Mumbai, Delhi / NCR, Bengaluru
Work from Office
NP: 15 Days. Minimum Qualifications / Job Requirements: Bachelor's degree in CS. 8 years of hands-on experience in designing and developing distributed data pipelines. 5 years of hands-on experience in Azure data service technologies. 5 years of hands-on experience in Python, SQL, object-oriented programming, ETL and unit testing. Experience with data integration with APIs, web services, and queues. Experience with Azure DevOps and CI/CD, as well as agile tools and processes including JIRA and Confluence. Excellent communication skills. Location: Chennai, Hyderabad, Kolkata, Pune, Ahmedabad, Delhi, Mumbai, Bengaluru, Remote
Posted 1 month ago
5 - 8 years
0 Lacs
Chennai, Tamil Nadu, India
Hybrid
Introduction A career in IBM Consulting is rooted in long-term relationships and close collaboration with clients across the globe. You'll work with visionaries across multiple industries to improve the hybrid cloud and AI journey for the most innovative and valuable companies in the world. Your ability to accelerate impact and make meaningful change for your clients is enabled by our strategic partner ecosystem and our robust technology platforms across the IBM portfolio, including Software and Red Hat. Curiosity and a constant quest for knowledge serve as the foundation to success in IBM Consulting. In your role, you'll be encouraged to challenge the norm, investigate ideas outside of your role, and come up with creative solutions resulting in groundbreaking impact for a wide network of clients. Our culture of evolution and empathy centers on long-term career growth and development opportunities in an environment that embraces your unique skills and experience. Your Role And Responsibilities As an Associate Software Developer at IBM, you'll work with clients to co-create solutions to major real-world challenges by using best practice technologies, tools, techniques, and products to translate system requirements into the design and development of customized systems. Preferred Education Master's Degree Required Technical And Professional Expertise Core Java, Spring Boot, Java 2/EE, Microservices; Hadoop ecosystem (HBase, Hive, MapReduce, HDFS, Pig, Sqoop, etc.); Spark. Good to have: Python. Preferred Technical And Professional Experience None
Posted 1 month ago
1 - 4 years
7 - 11 Lacs
Bengaluru
Work from Office
---- What the Candidate Will Do ---- Design, develop, and maintain robust and scalable software solutions. Find opportunities for quality improvements in the search stack and lead the entire development lifecycle end-to-end, from architecture design and coding to testing and deployment. ---- Basic Qualifications ---- Bachelor's degree in Computer Science. 3+ years of professional experience in software development with a track record of increasing responsibility and impact. Experience with the Go and Python programming languages. Demonstrated experience developing sophisticated backend systems and longer-term ownership of critical backend services and infrastructure. Bias to action and a proven track record of getting things done. ---- Preferred Qualifications ---- Master's degree in Computer Science. Big Data: proficiency building data pipelines. Experience in using PySpark at scale with large data sets. Experience with t
Posted 1 month ago
4 - 9 years
3 - 7 Lacs
Hyderabad
Work from Office
Data Engineer Summary Apply Now Full-Time 4+ years Responsibilities Design, develop, and maintain data pipelines and ETL processes. Build and optimize data architectures for analytics and reporting. Collaborate with data scientists and analysts to support data-driven initiatives. Implement data security and governance best practices. Monitor and troubleshoot data infrastructure and ensure high availability. Skills Proficiency in data engineering tools (Hadoop, Spark, Kafka, etc.). Strong SQL and programming skills (Python, Java, etc.). Experience with cloud platforms (AWS, Azure, GCP). Knowledge of data modeling, warehousing, and ETL processes. Strong problem-solving and analytical abilities.
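The extract-transform-load responsibility described in this role can be sketched in miniature. This is a hedged, illustrative example only (the table and field names are hypothetical, not taken from the posting), using Python's built-in sqlite3 as a stand-in for a warehouse:

```python
import sqlite3

def extract():
    # Stand-in for pulling rows from an API, queue, or source database.
    return [
        {"id": 1, "amount": "120.50", "region": "south"},
        {"id": 2, "amount": "80.00", "region": "north"},
    ]

def transform(rows):
    # Normalize types and values before loading.
    return [(r["id"], float(r["amount"]), r["region"].upper()) for r in rows]

def load(rows, conn):
    # Load into the target store; sqlite3 stands in for a real warehouse.
    conn.execute(
        "CREATE TABLE IF NOT EXISTS sales (id INTEGER, amount REAL, region TEXT)"
    )
    conn.executemany("INSERT INTO sales VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract()), conn)
total = conn.execute("SELECT SUM(amount) FROM sales").fetchone()[0]
print(total)  # 200.5
```

Keeping extract, transform, and load as separate functions, as job descriptions like this imply, makes each stage independently testable and swappable.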
Posted 1 month ago
13 - 20 years
45 - 50 Lacs
Pune
Work from Office
About The Role : Job Title Solution Architect, VP Location Pune, India Role Description As a solution architect supporting the individual business-aligned, strategic or regulatory book of work, you will be working closely with the projects' business stakeholders, business analysts, data solution architects, engineers, the lead solution architect, the principal architect and the Chief Architect to help projects break business requirements into implementable building blocks and design the solution's target architecture. It is the solution architect's responsibility to align the design against the general (enterprise) architecture principles, apply agreed best practices and patterns, and help the engineering teams deliver against the architecture in an event-driven, service-oriented environment. What we'll offer you As part of our flexible scheme, here are just some of the benefits that you'll enjoy: Best in class leave policy. Gender neutral parental leaves. 100% reimbursement under childcare assistance benefit (gender neutral). Sponsorship for industry-relevant certifications and education. Employee Assistance Program for you and your family members. Comprehensive hospitalization insurance for you and your dependents. Accident and term life insurance. Complimentary health screening for those 35 yrs. and above. Your key responsibilities In projects, work with SMEs and stakeholders deriving the individual components of the solution. Design the target architecture for a new solution or when adding new capabilities to an existing solution. Assure proper documentation of the High-Level Design and Low-Level Design of the delivered solution. Quality-assure the delivery against the agreed and approved architecture, i.e. provide delivery guidance and governance. Prepare the High-Level Design for review and approval by design authorities for projects to proceed into implementation. Support creation of the Low-Level Design as it is being delivered to support final go-live. Your skills and experience Very proficient at designing / architecting solutions in an event-driven environment leveraging service-oriented principles. Proficient at Java and the delivery of Spring-based services. Proficient at building systems in a decoupled, event-driven environment leveraging messaging / streaming, e.g. Kafka. Very good experience designing and implementing data streaming and data ingest pipelines for heavy throughput while providing different guarantees (at-least-once, exactly-once). Very good understanding of non-streaming ETL and ELT approaches for data ingest. Solid understanding of containerized, distributed systems and building auto-scalable, stateless services in a cluster (concepts of quorum, consensus). Solid understanding of standard RDBMS and proficiency at data engineering level in T-SQL and/or PL/SQL and/or PL/pgSQL, preferably all. Good understanding of how RDBMSs generally work; specific tuning experience on SQL Server, Oracle or PostgreSQL is also welcome. Understanding of modeling / implementing different data modelling approaches as well as the respective pros and cons (e.g. Normalized, Denormalized, Star, Snowflake, Data Vault 2.0, ...). Strong working experience on GCP architecture is a benefit (Apigee, BigQuery, Pub/Sub, Cloud SQL, Dataflow, ...); an appropriate GCP architecture-level certification even more so. Experience leveraging other languages is more than welcome (C#, Python). How we'll support you Training and development to help you excel in your career. Coaching and support from experts in your team. A culture of continuous learning to aid progression. A range of flexible benefits that you can tailor to suit your needs. About us and our teams Please visit our company website for further information: https://www.db.com/company/company.htm
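The delivery-guarantee distinction this role calls out (at-least-once vs. exactly-once) can be illustrated without a real broker. Below is a hedged, minimal sketch in plain Python, not actual Kafka code: under at-least-once delivery a message may arrive more than once, so the consumer deduplicates by message id to achieve effectively-exactly-once processing (all names are illustrative):

```python
def consume(messages, seen_ids, totals):
    # Idempotent consumer: a redelivered message id is skipped, so
    # processing each message at least once still updates state exactly once.
    for msg_id, account, amount in messages:
        if msg_id in seen_ids:  # duplicate redelivery: skip
            continue
        seen_ids.add(msg_id)
        totals[account] = totals.get(account, 0) + amount

# Message id 2 is redelivered, as at-least-once delivery permits.
stream = [(1, "acct-a", 100), (2, "acct-b", 50), (2, "acct-b", 50)]
seen, totals = set(), {}
consume(stream, seen, totals)
print(totals)  # {'acct-a': 100, 'acct-b': 50}
```

In a real pipeline the `seen_ids` set would live in durable storage (or the dedup would come from transactional offsets in the broker), but the principle is the same.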
Posted 1 month ago
10 - 15 years
0 - 0 Lacs
Chennai
Work from Office
About the Role As a Senior Data Engineer you’ll be a core part of our engineering team. You will bring your valuable experience and knowledge, improving the technical quality of our data-focused products. This is a key role in helping us become more mature, deliver innovative new products and unlock further business growth. This role will be part of a newly formed team that will collaborate alongside data team members based in Ireland, USA and India. Following the successful delivery of some fantastic products in 2024, we have embarked upon a data-driven strategy in 2025. We have a huge amount of data and are keen to accelerate unlocking its value to delight our customers and colleagues. You will be tasked with delivering new data pipelines, actionable insights in automated ways and enabling innovative new product features. Reporting to our Team Lead, you will be collaborating with the engineering and business teams. You’ll work across all our brands, helping to shape their future direction. Working as part of a team, you will help shape the technical design of our platforms and solve complicated problems in elegant ways that are robust, scalable, and secure. We don’t get everything right first time, but you will help us reflect, adjust and be better next time around. We are looking for people who are inquisitive, confident exploring unfamiliar problems, and have a passion for learning. We don’t have all the answers and don’t expect you to know everything either. Our team culture is open, inclusive, and collaborative – we tackle goals together. Seeking the best solution to a problem, we actively welcome ideas and opinions from everyone in the team. Our Technologies We are continuously evolving our products and exploring new opportunities. We are focused on selecting the right technologies to solve the problem at hand. We know the technologies we’ll be using in 3 years’ time will probably be quite different to what we’re using today. 
You’ll be a key contributor to evolving our tech stack over time. Our data pipelines are currently based upon Google BigQuery, FiveTran and DBT Cloud. These involve advanced SQL alongside Python in a variety of areas. We don’t need you to be an expert with these technologies, but it will help if you’re strong with something similar. Your Skills and Experience This is an important role for us as we scale up the team and we are looking for someone who has existing experience at this level. You will have worked with data driven platforms that involve some kind of transaction, such as eCommerce, trading platforms or advertising lead generation. Your broad experience and knowledge of data engineering methods mean you’re able to build high quality products regardless of the language used – solutions that avoid common pitfalls impacting the platform’s technical performance. You can apply automated approaches for tracking and measuring quality throughout the whole lifecycle, through to the production environments. You are comfortable working with complex and varied problems. As a strong communicator, you work well with product owners and business stakeholders. You’re able to influence and persuade others by listening to their views, explaining your own thoughts, and working to achieve agreement. We have many automotive industry experts within our team already and they are eager to teach you everything you need to know for this role. Any existing industry knowledge is a bonus but is not necessary. This is a full-time role based in our India office on a semi-flexible basis. Our engineering team is globally distributed but we’d like you to be accessible to the office for ad-hoc meetings and workshops.
Posted 1 month ago
4 - 8 years
12 - 22 Lacs
Hyderabad, Chennai, Bengaluru
Hybrid
Warm Greetings from SP Staffing!! Role: Big Data Developer Experience Required: 4 to 8 yrs Work Location: Bangalore/Chennai/Pune/Delhi/Hyderabad/Kochi Required Skills: Spark and Scala Interested candidates can send resumes to nandhini.s@spstaffing.in
Posted 1 month ago
5 - 10 years
9 - 13 Lacs
Chennai
Work from Office
Overview GCP services (Pub/Sub, BigQuery, Airflow, Dataproc, Cloud Composer, GCS) + Teradata Responsibilities GCP services (Pub/Sub, BigQuery, Airflow, Dataproc, Cloud Composer, GCS) + Teradata Requirements GCP services (Pub/Sub, BigQuery, Airflow, Dataproc, Cloud Composer, GCS) + Teradata
Posted 1 month ago
1 - 6 years
5 - 9 Lacs
Hyderabad
Work from Office
To be added. Stay up to date on everything Blackbaud: follow us on LinkedIn, X, Instagram, Facebook and YouTube. Blackbaud is a digital-first company which embraces a flexible remote or hybrid work culture. Blackbaud supports hiring and career development for all roles from the location you are in today! Blackbaud is proud to be an equal opportunity employer and is committed to maintaining an inclusive work environment. All qualified applicants will receive consideration for employment without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, physical or mental disability, age, or veteran status or any other basis protected by federal, state, or local law.
Posted 1 month ago
2 - 6 years
7 - 11 Lacs
Hyderabad
Work from Office
As a Software Developer you'll participate in many aspects of the software development lifecycle, such as design, code implementation, testing, and support. You will create software that enables your clients' hybrid-cloud and AI journeys. Your primary responsibilities include: Comprehensive Feature Development and Issue Resolution: working on end-to-end feature development and solving challenges faced in the implementation. Stakeholder Collaboration and Issue Resolution: collaborating with key stakeholders, internal and external, to understand problems and issues with the product and features, and solving them as per the defined SLAs. Continuous Learning and Technology Integration: being eager to learn new technologies and implementing them in feature development. Required education Bachelor's Degree Preferred education Master's Degree Required technical and professional expertise Cloud data engineers with GCP PDE certification and working experience with GCP. Building end-to-end data pipelines in GCP using Pub/Sub, BigQuery, Dataflow, Cloud Workflows/Cloud Scheduler, Cloud Run, Dataproc, and Cloud Functions. Experience in logging and monitoring of GCP services, and experience in Terraform and infrastructure automation. Expertise in the Python coding language. Develops data engineering solutions on the Google Cloud ecosystem, and supports and maintains data engineering solutions on the Google Cloud ecosystem. Preferred technical and professional experience Stay updated with the latest trends and advancements in cloud technologies, frameworks, and tools. Conduct code reviews and provide constructive feedback to maintain code quality and ensure adherence to best practices. Troubleshoot and debug issues, and deploy applications to the cloud platform.
Posted 1 month ago
5 - 7 years
0 - 0 Lacs
Thiruvananthapuram
Work from Office
Job: Data Engineer Experience: 5+ years Mandatory Skills: Python, PySpark, Linux Shell Scripting Location: Trivandrum Required Skills & Experience: Experience with large-scale distributed data processing systems. Expertise in data modelling, testing, quality, access, and storage. Proficiency in Python and SQL, and experience with Databricks and DBT. Experience implementing cloud data technologies (GCP, Azure, or AWS). Knowledge of improving the data development lifecycle and shortening lead times. Agile delivery experience. Required Skills: Python, PySpark, Linux Shell Scripting
Posted 1 month ago
4 - 9 years
12 - 16 Lacs
Gurugram
Work from Office
About The Role Role: Data Engineering Manager Do: ESSENTIAL DUTIES/RESPONSIBILITIES: - Oversee help desk support to ensure that end user problems are resolved in a timely and effective manner, enabling users to access needed information and utilize technology resources effectively. - Improve productivity at the Service Desk by focusing on reducing incidents; use self-heal and self-help techniques to reduce call flow at the SD. - Communicate effectively with customers and stakeholders to assess support needs, assist in the identification of technology needs, and respond to customer service concerns. - Train and guide support specialists to effectively utilize the help desk problem-management process (the identification, prioritization, escalation and resolution of end user help requests) to ensure quick and accurate responses to all end users, while emphasizing a customer-focused attitude. - Establish and monitor service level targets/benchmarks and measure performance against those benchmarks. Establish and monitor compliance levels across the region: patch, AV and security standards. - Track and analyze support calls and information requests to identify areas of need and create strategies to enhance end-user capacity and reduce end-user reliance on support personnel. - Develop and maintain comprehensive documentation, including operations guidelines and procedures, inventory checklists, deployment guides, budget information, training guides and support materials. - Measure and report on unit performance via metrics and indicators of service level activity and customer satisfaction. Provide regular helpdesk performance and utilization reports to leadership. - Manage the inventory, support and maintenance of the region/location's end user technology assets, including, but not limited to, desktop and notebook computers, mobile devices (smart phones, tablets, etc.), printers and software. 
- Facilitate equipment, services, and software purchases and implementation, and manage inventory and licensing reconciliations. Conduct research and make recommendations on hardware and software products, services, protocols, and standards. SECONDARY DUTIES/RESPONSIBILITIES: - Recommend changes or enhancements in available information technology or equipment as prompted by feedback via the user support function. - Engage in ongoing research of emerging trends and new technologies which may benefit the corporation's goal of strategically implementing technology to enhance business performance, and specifically support the support services function. - Participate in planning, policy and decision-making discussions involving information management projects. - Provide occasional technical support and best practice advice for offsite Corporation events. - Research and implement special projects and other duties as assigned. NATURE OF WORK CONTACTS - Works closely with staff and management from other units and divisions. - Regular interactions with the GNOC, such as network administrators and server administrators. - Periodic correspondence and interaction with vendors and management staff. Reinvent your world. We are building a modern Wipro. We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention. Of yourself, your career, and your skills. We want to see the constant evolution of our business and our industry. It has always been in our DNA - as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention. Come to Wipro. Realize your ambitions. Applications from people with disabilities are explicitly welcome.
Posted 1 month ago
2 - 5 years
14 - 17 Lacs
Hyderabad
Work from Office
As a Big Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in data engineering activities like creating pipelines/workflows for source to target and implementing solutions that tackle the client's needs. Your primary responsibilities include: Design, build, optimize and support new and existing data models and ETL processes based on our clients' business requirements. Build, deploy and manage data infrastructure that can adequately handle the needs of a rapidly growing data-driven organization. Coordinate data access and security to enable data scientists and analysts to easily access data whenever they need to. Required education Bachelor's Degree Preferred education Master's Degree Required technical and professional expertise Must have 5+ years of experience in Big Data: Hadoop, Spark, Scala, Python, HBase, Hive. Good to have: AWS (S3, Athena, DynamoDB, Lambda), Jenkins, Git. Developed Python and PySpark programs for data analysis. Good working experience with Python to develop a custom framework for generating rules (like a rules engine). Developed Python code to gather data from HBase and designed the solution to implement using PySpark. Apache Spark DataFrames/RDDs were used to apply business transformations, and Hive context objects were utilized to perform read/write operations. Preferred technical and professional experience Understanding of DevOps. Experience in building scalable end-to-end data ingestion and processing solutions. Experience with object-oriented and/or functional programming languages, such as Python, Java and Scala.
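The listing mentions a Python "custom framework for generating rules (just like a rules engine)". As a hedged illustration of the general pattern only, not the actual framework from this role, a decorator-based rule registry in plain Python might look like this (all rule names and record fields are hypothetical):

```python
RULES = []

def rule(func):
    # Register a transformation rule; rules run in registration order.
    RULES.append(func)
    return func

@rule
def strip_names(record):
    record["name"] = record["name"].strip()
    return record

@rule
def flag_high_value(record):
    record["high_value"] = record["amount"] > 100
    return record

def apply_rules(records):
    # Apply every registered rule to every record.
    for record in records:
        for r in RULES:
            record = r(record)
    return records

data = [{"name": " alice ", "amount": 150}, {"name": "bob", "amount": 60}]
out = apply_rules(data)
```

In a Spark setting the same registry could be applied inside a `map` over an RDD or as a chain of DataFrame transformations; the registry keeps business rules declarative and individually testable.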
Posted 1 month ago
4 - 9 years
4 - 7 Lacs
Pune
Work from Office
Your Role Pyspark Data Engineer As a PySpark developer, you must have 2+ years of experience in PySpark. Strong programming experience; Python, PySpark, Scala is preferred. Experience in designing and implementing CI/CD, build management, and development strategy. Experience with SQL and SQL analytical functions; experience participating in key business, architectural and technical decisions. Scope to get trained on AWS cloud technology. Your Profile Pyspark SQL Data Engineer What you will love about working here Choosing Capgemini means having the opportunity to make a difference, whether for the world's leading businesses or for society. It means getting the support you need to shape your career in the way that works for you. It means when the future doesn't look as bright as you'd like, you have the opportunity to make change, to rewrite it. When you join Capgemini, you don't just start a new job. You become part of something bigger. A diverse collective of free-thinkers, entrepreneurs and experts, all working together to unleash human energy through technology, for an inclusive and sustainable future. At Capgemini, people are at the heart of everything we do! You can exponentially grow your career by being part of innovative projects and taking advantage of our extensive Learning & Development programs. With us, you will experience an inclusive, safe, healthy, and flexible work environment to bring out the best in you! You also get a chance to make positive social change and build a better world by taking an active role in our Corporate Social Responsibility and Sustainability initiatives. And whilst you make a difference, you will also have a lot of fun. About Capgemini Capgemini is a global business and technology transformation partner, helping organizations to accelerate their dual transition to a digital and sustainable world, while creating tangible impact for enterprises and society. It is a responsible and diverse group of 340,000 team members in more than 50 countries. 
With its strong heritage of over 55 years, Capgemini is trusted by its clients to unlock the value of technology to address the entire breadth of their business needs. It delivers end-to-end services and solutions leveraging strengths from strategy and design to engineering, all fueled by its market-leading capabilities in AI, generative AI, cloud and data, combined with its deep industry expertise and partner ecosystem.
Posted 1 month ago
15 - 20 years
3 - 7 Lacs
Bengaluru
Work from Office
Project Role : Data Management Practitioner Project Role Description : Maintain the quality and compliance of an organization's data assets. Design and implement data strategies, ensuring data integrity and enforcing governance policies. Establish protocols to handle data, safeguard sensitive information, and optimize data usage within the organization. Design and advise on data quality rules and set up effective data compliance policies. Must have skills : Teradata Vantage Good to have skills : Data Architecture Principles, Teradata BI, Amazon Web Services (AWS) Minimum 15 year(s) of experience is required Educational Qualification : 15 years full time education Summary : As a Data Management Practitioner, you will be responsible for maintaining the quality and compliance of an organization's data assets. You will design and implement data strategies, ensure data integrity, enforce governance policies, and optimize data usage within the organization. Roles & Responsibilities: Expected to be an SME with deep knowledge and experience. Should have influencing and advisory skills. Responsible for team decisions. Engage with multiple teams and contribute on key decisions. Expected to provide solutions to problems that apply across multiple teams. Design and implement data quality rules. Advise on data compliance policies. Develop protocols to handle and safeguard sensitive data. Professional & Technical Skills: Must To Have Skills: Proficiency in Teradata Vantage. Good To Have Skills: Experience with Data Architecture Principles, Amazon Web Services (AWS), Teradata BI. Strong understanding of data management principles. Experience in implementing data governance policies. Knowledge of data integrity and compliance standards. Additional Information: The candidate should have a minimum of 15 years of experience in Teradata Vantage. This position is based at our Bengaluru office. A 15 years full-time education is required. Qualification 15 years full time education
Posted 1 month ago
5 - 10 years
5 - 9 Lacs
Pune
Work from Office
Project Role : Data Governance Practitioner Project Role Description : Establish and enforce data governance policies to ensure the accuracy, integrity, and security of organizational data. Collaborate with key stakeholders to define data standards, facilitate effective data collection, storage, access, and usage; and drive data stewardship initiatives for comprehensive and effective data governance. Must have skills : Databricks Unified Data Analytics Platform Good to have skills : NA Minimum 5 year(s) of experience is required Educational Qualification : 15 years full time education Summary :As a Data Governance Practitioner, you will establish and enforce data governance policies to ensure the accuracy, integrity, and security of organizational data. Collaborate with key stakeholders to define data standards, facilitate effective data collection, storage, access, and usage; and drive data stewardship initiatives for comprehensive and effective data governance. Roles & Responsibilities: Expected to be an SME Collaborate and manage the team to perform Responsible for team decisions Engage with multiple teams and contribute on key decisions Provide solutions to problems for their immediate team and across multiple teams Lead data governance initiatives within the organization Develop and implement data governance policies and procedures Ensure compliance with data governance regulations and standards Professional & Technical Skills: Must To Have Skills: Proficiency in Databricks Unified Data Analytics Platform Strong understanding of data governance principles Experience in implementing data quality and data stewardship programs Knowledge of data privacy regulations and compliance requirements Experience in data management and data security practices Additional Information: The candidate should have a minimum of 5 years of experience in Databricks Unified Data Analytics Platform. This position is based at our Pune office. A 15 years full time education is required. 
Qualification 15 years full time education
Posted 1 month ago
12 - 17 years
10 - 14 Lacs
Bengaluru
Work from Office
Project Role : Application Lead Project Role Description : Lead the effort to design, build and configure applications, acting as the primary point of contact. Must have skills : AWS Glue Good to have skills : Data Engineering, AWS BigData, PySpark Minimum 12 year(s) of experience is required Educational Qualification : 15 years full time education Summary :As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your day will involve overseeing the application development process and ensuring successful project delivery. Roles & Responsibilities: Expected to be an SME Collaborate and manage the team to perform Responsible for team decisions Engage with multiple teams and contribute on key decisions Expected to provide solutions to problems that apply across multiple teams Lead the application development process Coordinate with stakeholders to gather requirements Ensure timely project delivery Professional & Technical Skills: Must To Have Skills: Proficiency in AWS Glue, Data Engineering, PySpark, AWS BigData Strong understanding of cloud computing principles Experience in designing and implementing data pipelines Knowledge of ETL processes and data transformation Familiarity with data warehousing concepts Additional Information: The candidate should have a minimum of 12 years of experience in AWS Glue This position is based at our Bengaluru office A 15 years full-time education is required Qualification 15 years full time education
Posted 1 month ago
5 - 10 years
10 - 14 Lacs
Hyderabad
Work from Office
Project Role : Application Lead Project Role Description : Lead the effort to design, build and configure applications, acting as the primary point of contact. Must have skills : PySpark Good to have skills : NA Minimum 5 year(s) of experience is required Educational Qualification : 15 years full time education Summary :As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your day will involve overseeing the application development process and ensuring successful project delivery. Roles & Responsibilities: Expected to be an SME Collaborate and manage the team to perform Responsible for team decisions Engage with multiple teams and contribute on key decisions Provide solutions to problems for their immediate team and across multiple teams Lead the application development process Ensure successful project delivery Mentor and guide team members Professional & Technical Skills: Must To Have Skills: Proficiency in PySpark Strong understanding of big data processing Experience with data pipelines and ETL processes Hands-on experience in building scalable applications Knowledge of cloud platforms and services Additional Information: The candidate should have a minimum of 5 years of experience in PySpark This position is based at our Hyderabad office A 15 years full-time education is required Qualification 15 years full time education
Posted 1 month ago
India has seen a rise in demand for professionals skilled in Sqoop, a tool designed for efficiently transferring bulk data between Apache Hadoop and structured datastores such as relational databases. Job seekers with expertise in Sqoop can explore various opportunities in the Indian job market.
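As a hedged illustration of the kind of transfer Sqoop performs, a typical import from a relational database into HDFS looks like the following. The connection string, credentials, table name, and paths are placeholders, and running it requires a configured Hadoop/Sqoop environment:

```shell
# Import the (hypothetical) "orders" table from MySQL into HDFS,
# splitting the work across 4 parallel map tasks on the primary key.
sqoop import \
  --connect jdbc:mysql://db.example.com:3306/sales \
  --username etl_user -P \
  --table orders \
  --target-dir /user/etl/orders \
  --num-mappers 4 \
  --split-by order_id
```

The `--split-by` column is how Sqoop partitions the bulk transfer across mappers, which is the core of the "efficient bulk transfer" capability interviewers tend to probe on.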
The average salary range for Sqoop professionals in India varies based on experience levels: - Entry-level: Rs. 3-5 lakhs per annum - Mid-level: Rs. 6-10 lakhs per annum - Experienced: Rs. 12-20 lakhs per annum
Typically, a career in Sqoop progresses as follows: 1. Junior Developer 2. Sqoop Developer 3. Senior Developer 4. Tech Lead
In addition to expertise in Sqoop, professionals in this field are often expected to have knowledge of: - Apache Hadoop - SQL - Data warehousing concepts - ETL tools
As you explore job opportunities in the field of Sqoop in India, make sure to prepare thoroughly and showcase your skills confidently during interviews. Stay updated with the latest trends and advancements in Sqoop to enhance your career prospects. Good luck with your job search!