
3720 Hadoop Jobs - Page 36

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

10.0 years

4 - 7 Lacs

Hyderābād

On-site


Job description

Some careers shine brighter than others. If you’re looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further.

HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions.

We are currently seeking an experienced professional to join our team in the role of Lead Consultant Specialist.

In this role, you will oversee the solution design and implementation of change while ensuring production remains resilient and performant. This includes interaction with the bank’s architects, other systems and technical teams, end users and stakeholders. The person is expected to oversee and guide the day-to-day activities of the technical team, with the help of their more experienced colleagues, and ensure the team follows good practice. In addition, this person will suggest and plan the best technical solutions and undertake problem solving, balancing pragmatism against long-term best practice. The role therefore includes opportunities for hands-on development and analysis, not just team management, and covers a mix of change and run responsibilities. The current team size is around 45 people located in the UK, India, China, Poland and Mexico, but mostly India. Candidates should be flexible in working hours and ready to work in shifts and on call.
Requirements

To be successful in this role, you should meet the following requirements:
- Background in hands-on technical development, with at least 10 years of industry experience in data engineering or an equivalent engineering discipline, having managed a team of developers
- Strong emotional intelligence, able to work professionally under pressure, with the gravitas to represent the platform in senior meetings
- Strong communication skills, with the ability to convey technical detail in non-technical language
- A practitioner and proponent of Agile and DevOps
- Proficiency in Hadoop, Spark, Scala, Python, or another programming language associated with data engineering
- Expertise building and deploying production-level batch data processing systems maintained by application support teams
- Experience with a variety of modern development tooling (e.g. Git, Gradle, Nexus) and technologies supporting automation and DevOps (e.g. Jenkins, Docker)
- Experience working in an Agile environment
- Strong technical communication ability, with demonstrable experience of working in rapidly changing client environments
- Knowledge of the testing libraries of common programming languages (such as ScalaTest or equivalent); knows the difference between test types (unit test, integration test) and can cite specific examples of tests they have written themselves
- Good understanding of CDP 7.1, HDFS filesystems, Unix, Unix shell scripting and Elasticsearch
- Experience understanding and analysing complex business requirements and carrying out system design accordingly
- Quantexa Data Engineering Certification (preferred)
- Experience with managed Kubernetes services such as AWS EKS or Azure AKS (preferred)
- Experience with Angular (preferred)
- Microservices (OCP, Kubernetes) (preferred)

You’ll achieve more when you join HSBC. www.hsbc.com/careers

HSBC is committed to building a culture where all employees are valued and respected and opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working and opportunities to grow within an inclusive and diverse environment. Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website.

Issued by – HSDI
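The posting’s distinction between unit and integration tests can be illustrated with a minimal sketch. The transformation and test below are hypothetical (not HSBC code), using Python’s stdlib unittest as a stand-in for the ScalaTest-style libraries mentioned above:

```python
import unittest

def dedupe_latest(records):
    """Keep the latest record per account_id (hypothetical batch transformation).

    records: iterable of dicts with 'account_id' and 'as_of' keys.
    Returns one record per account: the one with the greatest 'as_of'.
    """
    latest = {}
    for rec in records:
        key = rec["account_id"]
        if key not in latest or rec["as_of"] > latest[key]["as_of"]:
            latest[key] = rec
    return list(latest.values())

class DedupeLatestTest(unittest.TestCase):
    """Unit test: exercises the pure function in isolation, with no cluster
    or I/O. An integration test, by contrast, would run the same logic
    against a real Spark session and actual HDFS paths."""

    def test_keeps_latest_per_account(self):
        rows = [
            {"account_id": "A", "as_of": "2024-01-01", "balance": 10},
            {"account_id": "A", "as_of": "2024-02-01", "balance": 20},
            {"account_id": "B", "as_of": "2024-01-15", "balance": 5},
        ]
        result = {r["account_id"]: r["balance"] for r in dedupe_latest(rows)}
        self.assertEqual(result, {"A": 20, "B": 5})
```

Keeping the business logic in a pure function like this is what makes it unit-testable before it is wired into a Spark job.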

Posted 1 week ago

Apply

0 years

5 - 6 Lacs

Hyderābād

On-site


Category: Software Development/Engineering
Main location: India, Andhra Pradesh, Hyderabad
Position ID: J0125-0901
Employment Type: Full Time

Position Description:
Founded in 1976, CGI is among the world's largest independent IT and business consulting services firms. With 94,000 consultants and professionals globally, CGI delivers an end-to-end portfolio of capabilities, from strategic IT and business consulting to systems integration, managed IT and business process services, and intellectual property solutions. CGI works with clients through a local relationship model complemented by a global delivery network that helps clients digitally transform their organizations and accelerate results. CGI Fiscal 2024 reported revenue is CA$14.68 billion, and CGI shares are listed on the TSX (GIB.A) and the NYSE (GIB). Learn more at cgi.com.

Position - Senior Software Engineer
Experience - 4 - 7 Yrs
Category - Software Development/Engineering
Shift - 1 to 10 PM
Location - BNG/HYD/CHN
Position Id - J0125-0901
Work Type - Hybrid
Employment Type - Full time
Education - Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.

Your future duties and responsibilities:
We are looking for a talented Data Engineer to join our team. In this role, you will develop, optimize, and maintain scalable applications, and be responsible for building efficient, testable, and reusable code. Your work will involve collaborating with cross-functional teams to deliver high-quality software that meets our clients' needs.
- Write reusable, testable, and efficient code.
- Implement security and data protection solutions.
- Develop and maintain robust and scalable backend systems and APIs using Python.
- Integrate user-facing elements developed by front-end developers with server-side logic.
- Work with various databases (SQL, NoSQL) to ensure efficient data storage and retrieval.

Required qualifications to be successful in this role:
- Programming languages: Python, PySpark
- Big data tech: Databricks, Spark, Hadoop, Hive
- Cloud: AWS
- Databases: RDBMS and NoSQL
- Shell scripting
- Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field

CGI is an equal opportunity employer. In addition, CGI is committed to providing accommodation for people with disabilities in accordance with provincial legislation. Please let us know if you require reasonable accommodation due to a disability during any aspect of the recruitment process and we will work with you to address your needs.

Skills: English, Python, Teradata, Hadoop, Hive

What you can expect from us:
Together, as owners, let’s turn meaningful insights into action. Life at CGI is rooted in ownership, teamwork, respect and belonging. Here, you’ll reach your full potential because you are invited to be an owner from day 1 as we work together to bring our Dream to life. That’s why we call ourselves CGI Partners rather than employees. We benefit from our collective success and actively shape our company’s strategy and direction. Your work creates value. You’ll develop innovative solutions and build relationships with teammates and clients while accessing global capabilities to scale your ideas, embrace new opportunities, and benefit from expansive industry and technology expertise. You’ll shape your career by joining a company built to grow and last. You’ll be supported by leaders who care about your health and well-being and provide you with opportunities to deepen your skills and broaden your horizons.

Come join our team, one of the largest IT and business consulting services firms in the world.

Posted 1 week ago

Apply

4.0 years

0 Lacs

Hyderābād

On-site


Job Summary:
We are looking for an experienced Data Engineer with 4+ years of proven expertise in building scalable data pipelines, integrating complex datasets, and working with cloud-based and big data technologies. The ideal candidate should have hands-on experience with data modeling, ETL processes, and real-time data streaming.

Key Responsibilities:
- Design, develop, and maintain scalable and efficient data pipelines and ETL workflows.
- Work with large datasets from various sources, ensuring data quality and consistency.
- Collaborate with Data Scientists, Analysts, and Software Engineers to support data needs.
- Optimize data systems for performance, scalability, and reliability.
- Implement data governance and security best practices.
- Troubleshoot data issues and identify improvements in data processes.
- Automate data integration and reporting tasks.

Required Skills & Qualifications:
- Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
- 4+ years of experience in data engineering or similar roles.
- Strong programming skills in Python, SQL, and shell scripting.
- Experience with ETL tools (e.g., Apache Airflow, Talend, AWS Glue).
- Proficiency in data modeling, data warehousing, and database design.
- Hands-on experience with cloud platforms (AWS, GCP, or Azure) and services like S3, Redshift, BigQuery, and Snowflake.
- Experience with big data technologies such as Spark, Hadoop, and Kafka.
- Strong understanding of data structures, algorithms, and system design.
- Familiarity with CI/CD tools, version control (Git), and Agile methodologies.

Preferred Skills:
- Experience with real-time data streaming (Kafka, Spark Streaming).
- Knowledge of Docker, Kubernetes, and infrastructure-as-code tools like Terraform.
- Exposure to machine learning pipelines or data science workflows is a plus.

Interested candidates can send their resume.
Job Type: Full-time
Schedule: Day shift
Work Location: In person
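The core responsibility above (designing and maintaining ETL workflows) can be sketched as a minimal extract-transform-load pipeline. The data source, field names, and SQLite target below are hypothetical stand-ins; in practice an orchestrator such as Airflow would schedule each step against real systems:

```python
import csv
import io
import sqlite3

def extract(raw_csv: str) -> list[dict]:
    # Extract: parse raw CSV rows into dicts (the source format is illustrative).
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(rows: list[dict]) -> list[tuple]:
    # Transform: enforce data quality (drop rows missing an id, cast amounts).
    clean = []
    for row in rows:
        if not row.get("order_id"):
            continue
        clean.append((row["order_id"], float(row["amount"])))
    return clean

def load(conn: sqlite3.Connection, rows: list[tuple]) -> int:
    # Load: idempotent upsert into the target table; returns the row count.
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders (order_id TEXT PRIMARY KEY, amount REAL)"
    )
    conn.executemany("INSERT OR REPLACE INTO orders VALUES (?, ?)", rows)
    return conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0]

raw = "order_id,amount\nA1,10.5\n,3.0\nA2,7.25\n"
conn = sqlite3.connect(":memory:")
loaded = load(conn, transform(extract(raw)))
print(loaded)  # → 2 (the row with the missing order_id is dropped)
```

The idempotent load step (INSERT OR REPLACE on a primary key) is what lets the pipeline be safely re-run after a failure, a common reliability requirement in these roles.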

Posted 1 week ago

Apply

1.0 - 2.6 years

6 - 9 Lacs

Hyderābād

On-site


Analyst – ETL Testing - Deloitte Support Services India Private Limited

USI DT Canada MF is an integral part of the Information Technology Services group. The principal focus of this organization is the development and maintenance of technology solutions that e-enable the delivery of Function and Marketplace Services and Management Information Systems. The Solutions Delivery group develops and maintains solutions built on varied technologies like Siebel, PeopleSoft, Microsoft technologies, SAP, Hadoop, ETL, BI and Lotus Notes. Solutions Delivery Canada has various groups which provide best-of-breed solutions to clients by following a streamlined system development methodology. Solutions Delivery comprises groups like Usability, Application Architecture, Development, Quality Assurance and Performance.

Role Specific Responsibilities / Work you’ll do
- Responsible for planning, developing, and coordinating testing activities, including Test Plan creation, Test Case creation, debugging, execution and test analysis.
- Responsible for the execution of test scenarios in support of the test team.
- Familiarize themselves with the business functionality and technology used for assigned applications (under test).
- Utilize ETL QA testing tools, methodologies, and processes.
- Work closely with the on-site team towards successful test phases.
- Encourage collaborative efforts and camaraderie with other Release Stream team areas.
- Ensure the quality and low bug rates of code released into production.
- Responsible for successful execution, alerting team leads and managers of obstacles, issues, and risks.
- Able to discuss the status of all open issues facing the test team and describe actions taken to mitigate such issues.
- Responsible for coordinating/engaging build movements to the QA environment as directed by the Team Lead.

The team
EDC Canada is the Canada CIO’s IT department, which manages an end-to-end portfolio of Canada business applications and technology infrastructure that supports business processes common to the Deloitte Canada member firm.

Cutting Edge Technologies: At USI DT Canada MF, you will be part of an exciting journey that will keep you ahead of the curve. Be it our innovative delivery model for agile or our Communities of Practice, we are constantly investing in leading-edge technologies to give our practitioners a world-class experience. We have programs and projects spanning a multitude of technologies and stay abreast of evolving technologies and emerging industry-leading practices such as agile.

Application Development and Solutions Delivery: Starting from Architecture and User Experience, we design, develop, transform, re-platform, or custom-build systems in complex business scenarios. We manage a portfolio of enterprise-scale applications and solutions used by practitioners in Canada. Offerings include Custom Development, Packaged Application Development, Application Architecture and Testing Advisory Services. Technologies include Business Analytics, Business Intelligence, Cloud Development, Mobile, .NET, SharePoint, SAP HANA, and Manual, Automated, and Performance testing.

Location: Hyderabad
Work shift timings: 11 AM to 8 PM

Qualifications
Essential
- A Computer Science university degree and/or equivalent work experience
- A strong commitment to professional client service excellence
- Excellent interpersonal relations and demonstrated ability to work with others effectively in teams
- Good verbal and written communication skills
- Excellent analytical skills

Top 3 keywords: SQL databases, Test Strategy, API testing

Technical Skills and Qualifications
- 1-2.6 years' experience in ETL testing.
- Demonstrates an understanding of and working experience with SQL databases and ETL testing.
- Should be able to write queries to validate table mappings and structures.
- Should be able to perform schema validations.
- Good understanding of SCD types.
- Strong knowledge of database methodology.
- In-depth understanding of Data Warehousing/Business Intelligence concepts.
- Working experience in testing BI reports.
- Should be able to write queries to validate data quality during migration projects.
- Demonstrates an understanding of any of the peripheral technologies utilized in SDC, including PeopleSoft, SAP and Aderant.
- Demonstrates a working understanding of tools like UFT and TFS.
- Experience with Microsoft tools is highly desired.
- Understands enterprise-wide networks and software implementations.
- Must have previous experience in creating complex SQL queries for data validation.
- Must have testing experience in Enterprise Data Warehouse (EDW).
- Good to have reports testing experience.
- Good to have working knowledge of Azure, DB2, HANA and SQL databases.
- Demonstrates a working understanding of planning, developing, and coordinating testing activities, including Test Plan creation, Test Case creation, debugging, execution and test analysis.
- Demonstrates an understanding of estimation techniques and QA planning.
- Demonstrates analytical skills in assessing user, functional and technical requirements.
- Demonstrates a working understanding of functional testing techniques and strategies.
- Demonstrates a working understanding of test analysis and design.
- Demonstrates a working understanding of analyzing test results and creating appropriate test metrics.
- Demonstrates a working understanding of the defect management process.
- Demonstrates an ability to provide accurate project estimates and timelines.
- Demonstrates an ability to deliver on project commitments.
- Produces work that consistently meets quality standards.
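The query-based validations described above (row-count reconciliation between source and warehouse, checking for dropped rows) can be sketched against an in-memory database. The table and column names below are hypothetical:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Stand-ins for a source system and the warehouse table loaded from it.
conn.executescript("""
    CREATE TABLE src_customers (id INTEGER, name TEXT);
    CREATE TABLE dw_customers  (id INTEGER, name TEXT);
    INSERT INTO src_customers VALUES (1, 'Asha'), (2, 'Ravi'), (3, 'Meena');
    INSERT INTO dw_customers  VALUES (1, 'Asha'), (2, 'Ravi'), (3, 'Meena');
""")

# Check 1: row counts must reconcile between source and target.
src_count = conn.execute("SELECT COUNT(*) FROM src_customers").fetchone()[0]
dw_count = conn.execute("SELECT COUNT(*) FROM dw_customers").fetchone()[0]

# Check 2: no source row missing from the warehouse (anti-join on the key).
missing = conn.execute("""
    SELECT COUNT(*) FROM src_customers s
    LEFT JOIN dw_customers d ON s.id = d.id
    WHERE d.id IS NULL
""").fetchone()[0]

print(src_count == dw_count and missing == 0)  # → True
```

In a real EDW test phase the same count and anti-join patterns run against the actual source extracts and warehouse tables, and any nonzero `missing` count is logged as a defect.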
- Demonstrates the ability to operate as the primary resource on a project.

Recruiting tips
From developing a stand-out resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters.

Benefits
At Deloitte, we know that great people make a great organization. We value our people and offer employees a broad range of benefits. Learn more about what working at Deloitte can mean for you.

Our people and culture
Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work.

Our purpose
Deloitte’s purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities.

Professional development
From entry-level employees to senior leaders, we believe there’s always room to learn. We offer opportunities to build new skills, take on leadership opportunities and connect and grow through mentorship. From on-the-job learning experiences to formal development programs, our professionals have a variety of opportunities to continue to grow throughout their career.

Requisition code: 304282

Posted 1 week ago

Apply

5.0 years

0 Lacs

Mumbai Metropolitan Region

On-site


Job Description:

About us
At Bank of America, we are guided by a common purpose to help make financial lives better through the power of every connection. Responsible Growth is how we run our company and how we deliver for our clients, teammates, communities and shareholders every day. One of the keys to driving Responsible Growth is being a great place to work for our teammates around the world. We’re devoted to being a diverse and inclusive workplace for everyone. We hire individuals with a broad range of backgrounds and experiences and invest heavily in our teammates and their families by offering competitive benefits to support their physical, emotional, and financial well-being. Bank of America believes both in the importance of working together and offering flexibility to our employees. We use a multi-faceted approach to flexibility, depending on the various roles in our organization. Working at Bank of America will give you a great career with opportunities to learn, grow and make an impact, along with the power to make a difference. Join us!

Global Business Services
Global Business Services delivers Technology and Operations capabilities to Lines of Business and Staff Support Functions of Bank of America through a centrally managed, globally integrated delivery model and globally resilient operations. Global Business Services is recognized for flawless execution, sound risk management, operational resiliency, operational excellence and innovation. In India, we are present in five locations and operate as BA Continuum India Private Limited (BACI), a non-banking subsidiary of Bank of America Corporation and the operating company for India operations of Global Business Services.

Process Overview
GF (Global Finance) Global Financial Control India (GFCI) is part of the CFO Global Delivery strategy to provide offshore delivery to Line of Business (LOB) and Enterprise Finance functions. The capabilities hosted include General Accounting & Reconciliations, Legal Entity Controllership, Corporate Sustainability Controllership, Corporate Controllership, Management Reporting & Analysis, Finance Systems Support, Operational Risk and Controls, Regulatory Reporting and Strategic Initiatives. The Financed Emissions Accounting & Reporting team, part of the Global Financial Control-Corporate Sustainability Controller organization within the CFO Group, plays a critical role in supporting the calculation of asset-level balance sheet financed emissions, which are integral to the Bank’s goal of achieving net-zero greenhouse gas emissions by 2050.

Job Description
The role is responsible for building the data sourcing process, data research and analytics using available tools, supporting model input data monitoring, and developing the data and reporting frameworks needed to support our approaches to net-zero progress alignment, target setting, client engagement and reputational risk review, empowering banking teams to assist clients on net-zero financing strategies and specific commercial opportunities. The role will support and partner with business stakeholders in the Enterprise Climate Program Office, Technology, Climate and Credit Risk, the Global Environment Group, Lines of Business, Legal Entity Controllers and Model Risk Management. Additionally, the role will support data governance, lineage and controls by building, improving and executing data processes. The candidate must be able to communicate across technology partners, the climate office and the business lines to execute viable analytical solutions, with a focus on end-user experience and usability, and must be strong in identifying and explaining data quality issues to help achieve successful and validated data for model execution.

This individual should feel at ease creating complex SQL queries, extracting large raw datasets from various sources, merging and transforming raw data into usable data and analytic structures, and benchmarking results against known values. They must be comfortable automating repeatable processes, generating data insights that are easy for end users to interpret, conducting quantitative analysis, and effectively communicating and disseminating findings to stakeholders. They should also understand greenhouse gas accounting frameworks and financed emissions calculations as applied to different sectors and asset classes. The candidate will have experience representing ERA with critical climate stakeholders across the firm, and should demonstrate capacity for strategic leadership, exercising significant independent judgment and discretion and working towards strategic goals with limited oversight.

Essential Functions:
- Net-zero transition planning and execution: Partners with GEG, the Program Office and Lines of Business in developing and executing the enterprise-wide net-zero transition plan and operational roadmap, with a focus on analysis and reporting capabilities, data procurement, and liaising with consultants, external data providers, and Climate Risk and Technology functions.
- Data development & operations: Research data requirements, produce executive-level and detailed data summaries, validate the accuracy, completeness, reasonableness and timeliness of datasets, and develop desktop procedures for BAU operations. Perform data reviews and test technology implementations for financed emissions deliverables. Execute BAU processes such as new data cycle creation, data controls and data quality processes. Produce data summary materials and walk through them with the leadership team.
- Data analytics & strategy: Analyze the data and explain how granular data movements across history affect the new results. Find trends of data improvement or areas for improvement. Develop automated data analysis results and answer common questions to justify changes in data. Support ad hoc analytics of bank-wide and client net-zero commitment implementation, with an initial focus on automation of financed emissions analysis, reporting against PCAF standards, and net-zero transition preparedness analytics and engagement to enhance strategy for meeting emissions goals for target sectors.

Requirements

Education
Bachelor’s degree in data management or analytics, engineering, sustainability, finance or another related field, OR Master’s degree in data science, earth/climate sciences, engineering, sustainability, natural resource management, environmental economics, finance or another related field.

Certifications: None required.

Experience Range
Minimum 5+ years in statistical and/or data management, analytics and visualization. Two (2) or more years of experience in Climate, Financed Emissions or financial reporting preferred.

Foundational Skills
- Deep expertise in SQL, Excel, automation & optimization, and project management
- Knowledge of data architecture concepts, data models and ETL processes
- Deep understanding of how data processes work and ability to solve dynamically evolving and complex data challenges as part of day-to-day activities
- Experience extracting and combining data from multiple sources and aggregating data to support model development
- Strong documentation and presentation skills to explain data analysis in a visual and procedural way suited to the audience
- Excellent interpersonal, management, and teamwork skills
- Highly motivated self-starter with excellent time management skills and the ability to effectively manage multiple priorities and timelines
- Ability to effectively communicate and resolve conflicts, both orally and in writing, with internal and external clients
- Ability to think critically to solve problems with rational solutions
- Ability to react and make decisions quickly under pressure with good judgment

Desired Skills
- Advanced knowledge of Finance
- Advanced knowledge of Climate Risk
- Demonstrated ability to motivate others in a high-stress environment to achieve goals
- Ability to adapt to a dynamic and evolving work environment
- Ability to quickly identify risks and determine reasonable solutions
- Experience in multiple database environments such as Oracle, Hadoop, and Teradata
- Knowledge of Alteryx, Tableau and R (knowledge of NLP, data scraping and generative AI welcome)

Work Timings
12:30 PM to 9:30 PM (9-hour shift; may require stretch during close periods)

Job Location
Mumbai
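The financed-emissions calculation this role supports follows, at its core, the PCAF attribution approach: the bank's share of a borrower's emissions is its outstanding financing divided by the company's value (e.g. EVIC for listed companies), multiplied by the company's emissions. A minimal sketch with illustrative numbers only (not Bank of America data):

```python
def financed_emissions(outstanding_amount: float,
                       company_value: float,
                       company_emissions_tco2e: float) -> float:
    """PCAF-style attribution: (outstanding / company value) * emissions.

    company_value is e.g. EVIC for listed companies; all figures illustrative.
    """
    attribution_factor = outstanding_amount / company_value
    return attribution_factor * company_emissions_tco2e

# Aggregate across a toy loan book (hypothetical positions).
portfolio = [
    # (outstanding, company value, company emissions in tCO2e)
    (50_000_000, 1_000_000_000, 200_000),  # 5% share of a 200 kt emitter
    (20_000_000,   400_000_000,  80_000),  # 5% share of an 80 kt emitter
]
total = sum(financed_emissions(*pos) for pos in portfolio)
print(total)  # → 14000.0 tCO2e
```

In practice the hard part is the data engineering around this formula: sourcing outstanding amounts, company valuations and reported emissions per asset class, and validating their completeness before the model runs, which is exactly what the role description above emphasizes.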

Posted 1 week ago

Apply

0 years

0 Lacs

Chandigarh, India

On-site


Company Profile
Oceaneering is a global provider of engineered services and products, primarily to the offshore energy industry. We develop products and services for use throughout the lifecycle of an offshore oilfield, from drilling to decommissioning. We operate the world's premier fleet of work-class ROVs. Additionally, we are a leader in offshore oilfield maintenance services, umbilicals, subsea hardware, and tooling. We also use applied technology expertise to serve the defense, entertainment, material handling, aerospace, science, and renewable energy industries.

Since 2003, Oceaneering’s India Center has been an integral part of operations for Oceaneering’s robust product and service offerings across the globe. This center caters to diverse business needs, from oil and gas field infrastructure and subsea robotics to automated material handling & logistics. Our multidisciplinary team offers a wide spectrum of solutions, encompassing Subsea Engineering, Robotics, Automation, Control Systems, Software Development, Asset Integrity Management, Inspection, ROV Operations, Field Network Management, Graphics Design & Animation, and more. In addition to these technical functions, Oceaneering India Center plays host to several crucial business functions, including Finance, Supply Chain Management (SCM), Information Technology (IT), Human Resources (HR), and Health, Safety & Environment (HSE). Our world-class infrastructure in India includes modern offices, industry-leading tools and software, equipped labs, and beautiful campuses aligned with the future way of work. Oceaneering in India, as well as globally, has a great work culture that is flexible, transparent, and collaborative, with great team synergy. At Oceaneering India Center, we take pride in “Solving the Unsolvable” by leveraging the diverse expertise within our team. Join us in shaping the future of technology and engineering solutions on a global scale.

Position Summary
The Principal Data Scientist will develop Machine Learning and/or Deep Learning based integrated solutions that address customer needs such as topside and subsea inspection. They will also be responsible for developing machine learning algorithms for automation and data analytics programs for Oceaneering’s next-generation systems. The position requires the Principal Data Scientist to work with various Oceaneering business units across global time zones, but also offers the flexibility of a hybrid work-office environment.

Essential Duties And Responsibilities
- Lead and supervise a team of moderately experienced engineers on product/prototype design and development assignments or applications.
- Work both independently and collaboratively to develop custom data models and algorithms to apply to data sets that will be deployed in existing and new products.
- Mine and analyze data from company databases to drive optimization and improvement of product development, marketing techniques and business strategies.
- Assess the effectiveness and accuracy of new data sources and data-gathering techniques.
- Build data models and organize structured and unstructured data to interpret solutions. Prepare data for predictive and prescriptive modeling.
- Architect solutions through selection of appropriate technology and components.
- Determine the technical direction and strategy for solving complex, significant, or major issues.
- Plan and evaluate architectural design, identify technical risks, and identify ways to mitigate those risks.
- Prepare design proposals to reflect cost, schedule, and technical approaches. Recommend test controls, strategies, apparatus, and equipment.
- Develop, construct, test, and maintain architectures.
- Lead research activities for ongoing government and commercial projects and products. Collaborate on proposals, grants, and publications in algorithm development. Collect data as warranted to support the algorithm development efforts.
- Work directly with software engineers to implement algorithms in commercial software products. Work with third parties to utilize off-the-shelf industrial solutions.
- Develop algorithms in key research areas based on the client’s technical problem. This requires constant paper reading and staying ahead of the game by knowing what is and will be state of the art in this field.
- Work hands-on in cross-functional teams with a strong sense of self-direction.

Non-essential
- Develop an awareness of programming and design alternatives.
- Cultivate and disseminate knowledge of application development best practices.
- Gather statistics and prepare and write reports on the status of the programming process for discussion with management and/or team members.
- Direct research on emerging application development software products, languages, and standards in support of procurement and development efforts.
- Train, manage and provide guidance to junior staff.
- Perform all other duties as requested, directed or assigned.

Supervisory Responsibilities
This position does not have direct supervisory responsibilities.

Reporting Relationship
Engagement Head

Qualifications

REQUIRED
Bachelor’s degree in Electronics and Electrical Engineering (or a related field) with eight or more years of experience working on Machine Learning and Deep Learning based projects, OR a Master’s degree in Data Science (or a related field) with six or more years of experience working on Machine Learning and Deep Learning based projects.

DESIRED
- Strong knowledge of advanced statistical functions: histograms and distributions, regression studies, scenario analysis, etc.
- Proficiency in Object-Oriented Analysis, Design and Programming.
- Strong background in data engineering tools like Python/C#, R, Apache Spark, Scala, etc.
- Prior experience handling large amounts of data, including text, shapes, sounds, images and/or videos.
- Knowledge of SaaS platforms like Microsoft Fabric, Databricks, Snowflake, H2O, etc.
- Experience working on cloud platforms like Azure (ML), AWS (SageMaker), or GCP (Vertex).
- Proficiency in querying SQL and NoSQL databases.
- Hands-on experience with various databases like MySQL/PostgreSQL/Oracle, MongoDB, InfluxDB, TimescaleDB, Neo4j, ArangoDB, Redis, Cassandra, etc.
- Prior experience with at least one probabilistic/statistical ambiguity resolution algorithm.
- Proficiency in Windows and Linux operating systems.
- Basic understanding of ML frameworks like PyTorch and TensorFlow.
- Basic understanding of messaging protocols like Kafka, MQTT or RabbitMQ.
- Prior experience with big data platforms like Hadoop, Apache Spark, or Hive is a plus.

Knowledge, Skills, Abilities, And Other Characteristics
- Ability to analyze situations accurately, utilizing a variety of analytical techniques, in order to make well-informed decisions.
- Ability to effectively prioritize and execute tasks in a high-pressure environment.
- Skill to gather, analyze and interpret data.
- Ability to determine and meet customer needs.
- Ensures that others involved in a project or effort are kept informed about developments and plans.
- Knowledge of communication styles and techniques.
- Ability to establish and maintain cooperative working relationships.
- Skill to prioritize workflow in a changing work environment.
- Knowledge of applicable data privacy practices and laws.
- Strong analytical and problem-solving skills.

Additional Information
This position is considered OFFICE WORK, which is characterized as follows: almost exclusively indoors during the day and occasionally at night; occasional exposure to airborne dust in the workplace; work surface is stable (flat). The physical demands described here are representative of those that must be met by an employee to successfully perform the essential functions of this job. Reasonable accommodations may be made to enable individuals with disabilities to perform the essential functions.
This position is considered LIGHT work.
- OCCASIONAL: Lift up to 20 pounds; climbing, stooping, kneeling, squatting, and reaching
- FREQUENT: Lift up to 10 pounds; standing
- CONSTANT: Repetitive movements of arms and hands; sit with back supported

Closing Statement
In addition, we make a priority of providing learning and development opportunities to enable employees to achieve their potential and take charge of their future. As well as developing employees in a specific role, we are committed to lifelong learning and ongoing education, including developing people skills and identifying future supervisors and managers. Every month, hundreds of employees are provided training, including HSE awareness, apprenticeships, entry and advanced level technical courses, management development seminars, and leadership and supervisory training.
We have a strong ethos of internal promotion. We can offer long-term employment and career advancement across countries and continents. Working at Oceaneering means that if you have the ability, drive, and ambition to take charge of your future, you will be supported to do so, and the possibilities are endless.
Equal Opportunity/Inclusion
Oceaneering's policy is to provide equal employment opportunity to all applicants.

Posted 1 week ago

Apply

0 years

0 Lacs

Trivandrum, Kerala, India

Remote


Description
Data Engineer
Responsibilities:
- Deliver end-to-end data and analytics capabilities, including data ingestion, data transformation, data science, and data visualization, in collaboration with Data and Analytics stakeholder groups
- Design and deploy databases and data pipelines to support analytics projects
- Develop scalable and fault-tolerant workflows
- Clearly document issues, solutions, findings, and recommendations to be shared internally and externally
- Learn and apply tools and technologies proficiently, including:
  - Languages: Python, PySpark, ANSI SQL, Python ML libraries
  - Frameworks/Platforms: Spark, Snowflake, Airflow, Hadoop, Kafka
  - Cloud Computing: AWS
  - Tools/Products: PyCharm, Jupyter, Tableau, Power BI
- Optimize performance for queries and dashboards
- Develop and deliver clear, compelling briefings to internal and external stakeholders on findings, recommendations, and solutions
- Analyze client data and systems to determine whether requirements can be met
- Test and validate data pipelines, transformations, datasets, reports, and dashboards built by the team
- Develop and communicate solution architectures and present solutions to both business and technical stakeholders
- Provide end-user support to other data engineers and analysts

Candidate Requirements
Expert experience in the following (should have / good to have):
- SQL, Python, PySpark, Python ML libraries; other programming languages (R, Scala, SAS, Java, etc.) are a plus
- Data and analytics technologies, including SQL/NoSQL/graph databases, ETL, and BI
- Knowledge of CI/CD and related tools such as GitLab, AWS CodeCommit, etc.
- AWS services, including EMR, Glue, Athena, Batch, Lambda, CloudWatch, DynamoDB, EC2, CloudFormation, IAM, and EDS
- Exposure to Snowflake and Airflow
- Solid scripting skills (e.g., bash/shell scripts, Python)
Proven work experience in the following:
- Data streaming technologies
- Big data technologies, including Hadoop, Spark, Hive, Teradata, etc.
- Linux command-line operations
- Networking knowledge (OSI network layers, TCP/IP, virtualization)
- Ability to lead the team, communicate with the business, and gather and interpret business requirements
- Experience with agile delivery methodologies using Jira or similar tools
- Experience working with remote teams
- AWS Solutions Architect / Developer / Data Analytics Specialty certifications; professional certification is a plus
- Bachelor's degree in Computer Science or a relevant field; Master's degree is a plus
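The pipeline responsibilities above (ingest, transform, validate) reduce to repeatable transform steps. A minimal plain-Python sketch of one such step, with illustrative field names (a production version would typically run in PySpark):

```python
from collections import defaultdict

def transform(records):
    """Clean raw records and aggregate revenue per region.

    A stand-in for one stage of an ETL pipeline: filter invalid
    rows, normalize fields, then reduce. Field names are illustrative.
    """
    totals = defaultdict(float)
    for rec in records:
        region = (rec.get("region") or "").strip().lower()
        amount = rec.get("amount")
        if not region or amount is None:
            continue  # drop rows that fail validation
        totals[region] += float(amount)
    return dict(totals)

raw = [
    {"region": " APAC ", "amount": 120.0},
    {"region": "apac", "amount": 80.0},
    {"region": "EMEA", "amount": 50.5},
    {"region": "", "amount": 10.0},      # invalid: missing region
    {"region": "EMEA", "amount": None},  # invalid: missing amount
]
print(transform(raw))  # {'apac': 200.0, 'emea': 50.5}
```

The same shape (validate, normalize, aggregate) carries over whether the runner is a cron script, Airflow task, or Spark job.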

Posted 1 week ago

Apply

6.0 years

0 Lacs

Ahmedabad, Gujarat, India

On-site


Company Overview:
Crest Data is a leading global provider of Data Analytics, Security, DevOps, Cloud Solutions, and software integration services, with a clientele that includes several Fortune 500 corporations and innovative Silicon Valley startups.
Company URL: http://www.crestdata.ai
Job Location - Ahmedabad
Key Responsibilities
- Utilize various open-source technologies and tools to orchestrate solutions.
- Work closely with development teams to implement and automate systems and architectures.
- Drive automation efforts across the organization or customer using Infrastructure as Code (IaC), Configuration Management, and Continuous Integration (CI) / Continuous Delivery (CD) tools such as Jenkins and Bamboo.
- Configure, manage, and maintain Linux systems.
- Develop secure cloud networking policies and architectures.
- Develop and maintain scalable architecture solutions using cloud providers such as AWS, Google Cloud, or Azure.
- Work alongside product architecture teams to make recommendations on DevOps practices.
- Monitor systems and develop monitoring procedures for multiple architectures.
- Collaborate with your peers as a subject matter expert on Cloud, Datacenter Technologies, Automation, Infrastructure as a Service (IaaS), IaC, SysOps, and DevOps.
- Write scripts and automation using Perl/Python/Groovy/Java/Bash/Shell.
- Configure and manage data sources like MySQL, MongoDB, Elasticsearch, Redis, Cassandra, Hadoop, etc.
- Understand how IT operations are managed.
- Manage source control, including SVN and Git.
Key Skills
- Bachelor's Degree or MS in Engineering or equivalent.
- 6+ years of experience with Jenkins, Docker, and Kubernetes.
- 6+ years of experience managing Linux-based infrastructure.
- 6+ years of hands-on experience in at least one scripting language.
- 6+ years of experience with IaC and Configuration Management techniques/technologies, preferably Chef, CloudFormation, Terraform, or Puppet.
- 4+ years of experience building secure, scalable architecture in at least one cloud technology such as AWS, Azure, or Google Cloud.
- Knowledge of shell scripting or Python is a plus.
- Sense of ownership and pride in your performance and its impact on the company's success.
- Critical thinking and problem-solving skills.
- Good time-management, interpersonal, and communication skills.

Posted 1 week ago

Apply

3.0 years

25 Lacs

Gurgaon

On-site


Position Title: Hadoop Data Engineer
Location: Hyderabad and Gurgaon
Position Type: Full-Time
Required Experience: 3+ Years
Job Overview:
We are looking for experienced Data Engineers proficient in Hadoop, Hive, Python, SQL, and PySpark/Spark to join our dynamic team. Candidates will be responsible for designing, developing, and maintaining scalable big data solutions.
Key Responsibilities:
- Develop and optimize data pipelines for large-scale data processing.
- Work with structured and unstructured datasets to derive actionable insights.
- Collaborate with cross-functional teams to enhance data-driven decision-making.
- Ensure the performance, scalability, and reliability of data architectures.
- Implement best practices for data security and governance.
Interview Process:
L1: Virtual interview.
L2: Face-to-face interview at the office.
L3: Final round (face-to-face or virtual).
Job Types: Full-time, Permanent
Pay: Up to ₹2,500,000.00 per year
Benefits: Health insurance, Provident Fund
Location Type: In-person
Schedule: Day shift, Monday to Friday, Morning shift
Application Question(s):
- How many years of experience do you have in Python?
- How many years of experience do you have in Hive?
Experience: Total work: 3 years (Preferred); Hadoop: 3 years (Preferred)
Work Location: In person
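Hive pipelines like the ones this role describes typically lean on directory partitioning so queries can prune data. A small sketch of the Hive-style `dt=` path convention in plain Python (the bucket and table names are made up for illustration):

```python
import datetime

def partition_path(base, table, event_ts):
    """Build a Hive-style partitioned path (dt=YYYY-MM-DD) for an event.

    Hive and Spark can prune partitions by directory name, so a query
    filtered on `dt` only scans the matching folders instead of the
    whole table.
    """
    dt = datetime.datetime.fromtimestamp(
        event_ts, tz=datetime.timezone.utc
    ).date()
    return f"{base}/{table}/dt={dt.isoformat()}"

# 2024-01-15T12:00:00Z as a Unix timestamp
print(partition_path("s3://lake", "clicks", 1705320000))
# s3://lake/clicks/dt=2024-01-15
```

Writers route each record to its partition directory; readers push the `dt` filter down to the directory listing.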

Posted 1 week ago

Apply

0 years

15 - 19 Lacs

Gurgaon

On-site


NEW OPPORTUNITY || IMMEDIATE TO 45-DAY JOINERS REQUIRED ||
Role: Hadoop Data Engineer
Location: Gurgaon / Hyderabad
Work Mode: Hybrid
Employment Type: Full-Time
Interview Mode: First video, then in person
Job Description
Job Overview:
We are looking for experienced Data Engineers proficient in Hadoop, Hive, Python, SQL, and PySpark/Spark to join our dynamic team. Candidates will be responsible for designing, developing, and maintaining scalable big data solutions.
Key Responsibilities:
- Develop and optimize data pipelines for large-scale data processing.
- Work with structured and unstructured datasets to derive actionable insights.
- Collaborate with cross-functional teams to enhance data-driven decision-making.
- Ensure the performance, scalability, and reliability of data architectures.
- Implement best practices for data security and governance.
Job Type: Full-time
Pay: ₹1,597,042.36 - ₹1,988,639.54 per year
Schedule: Day shift
Application Question(s):
- How many years of total experience do you have?
- How many years of relevant experience as a Hadoop Data Engineer do you have?
- Do you have good hands-on experience in all the skills - Hadoop, Hive, Python, SQL, PySpark/Spark?
- Can you join within 45 days?
- Are you comfortable with an F2F interview at Gurgaon/Hyderabad?
Work Location: In person

Posted 1 week ago

Apply

0 years

15 - 19 Lacs

Haryāna

On-site


Role: Hadoop Data Engineer
Location: Gurgaon, HR
Work Mode: Hybrid
Employment Type: Full-Time
Interview Mode: First video, then in person
Job Description
Job Overview:
We are looking for experienced Data Engineers proficient in Hadoop, Hive, Python, SQL, and PySpark/Spark to join our dynamic team. Candidates will be responsible for designing, developing, and maintaining scalable big data solutions.
Key Responsibilities:
- Develop and optimize data pipelines for large-scale data processing.
- Work with structured and unstructured datasets to derive actionable insights.
- Collaborate with cross-functional teams to enhance data-driven decision-making.
- Ensure the performance, scalability, and reliability of data architectures.
- Implement best practices for data security and governance.
Job Type: Full-time
Pay: ₹1,507,675.01 - ₹1,926,524.53 per year
Schedule: Day shift
Work Location: In person

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

Delhi

On-site


Delhi / Bangalore Engineering / Full Time / Hybrid
What is Findem:
Findem is the only talent data platform that combines 3D data with AI. It automates and consolidates top-of-funnel activities across your entire talent ecosystem, bringing together sourcing, CRM, and analytics in one place. Only 3D data connects people and company data over time - making an individual's entire career instantly accessible in a single click, removing the guesswork, and unlocking insights about the market and your competition no one else can. Powered by 3D data, Findem's automated workflows across the talent lifecycle are the ultimate competitive advantage. Enabling talent teams to deliver continuous pipelines of top, diverse candidates while creating better talent experiences, Findem transforms the way companies plan, hire, and manage talent. Learn more at www.findem.ai
Experience - 5-9 years
We are looking for an experienced Big Data Engineer who will be responsible for building, deploying, and managing various data pipelines, data lakes, and big data processing solutions using big data and ETL technologies.
Location - Delhi, India
Hybrid - 3 days onsite
Responsibilities
- Build data pipelines, big data processing solutions, and data lake infrastructure using various big data and ETL technologies
- Assemble and process large, complex data sets that meet functional and non-functional business requirements
- ETL from a wide variety of sources like MongoDB, S3, server-to-server, Kafka, etc., and processing using SQL and big data technologies
- Build analytical tools to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics
- Build interactive and ad-hoc query self-serve tools for analytics use cases
- Build data models and data schemas from a performance, scalability, and functional-requirement perspective
- Build processes supporting data transformation, metadata, dependency, and workflow management
- Research, experiment with, and prototype new tools/technologies and make them successful
Skill Requirements
- Must have: strong in Python/Scala
- Must have: experience in big data technologies like Spark, Hadoop, Athena/Presto, Redshift, Kafka, etc.
- Experience with various file formats like Parquet, JSON, Avro, ORC, etc.
- Experience with workflow management tools like Airflow
- Experience with batch processing, streaming, and message queues
- Any of the visualization tools like Redash, Tableau, Kibana, etc.
- Experience working with structured and unstructured data sets
- Strong problem-solving skills
Good to have
- Exposure to NoSQL like MongoDB
- Exposure to cloud platforms like AWS, GCP, etc.
- Exposure to microservices architecture
- Exposure to machine learning techniques
The role is full-time and comes with full benefits. We are globally headquartered in the San Francisco Bay Area with our India headquarters in Bengaluru.
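Workflow managers like Airflow, mentioned above, ultimately run tasks in dependency order over a DAG. A minimal sketch of that scheduling core using Python's stdlib (the task names are illustrative, not a real Airflow API):

```python
from graphlib import TopologicalSorter

# Each task maps to the set of tasks it depends on, Airflow-style.
deps = {
    "extract": set(),
    "transform": {"extract"},
    "load": {"transform"},
    "report": {"load", "transform"},
}

# static_order() yields tasks so every dependency runs first;
# this graph has exactly one valid order.
order = list(TopologicalSorter(deps).static_order())
print(order)  # ['extract', 'transform', 'load', 'report']
```

A real orchestrator adds scheduling, retries, and parallel execution of tasks whose dependencies are all satisfied, but the ordering logic is the same.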
Equal Opportunity As an equal opportunity employer, we do not discriminate on the basis of race, color, religion, national origin, age, sex (including pregnancy), physical or mental disability, medical condition, genetic information, gender identity or expression, sexual orientation, marital status, protected veteran status or any other legally-protected characteristic.

Posted 1 week ago

Apply

9.0 - 12.0 years

7 - 12 Lacs

Hyderabad

Work from Office


Role Description:
We are looking for a highly motivated, expert Senior Data Engineer who can own the design and development of complex data pipelines, solutions, and frameworks. The ideal candidate will design, develop, and optimize data pipelines, data integration frameworks, and metadata-driven architectures that enable seamless data access and analytics. This role requires deep expertise in big data processing, distributed computing, data modeling, and governance frameworks to support self-service analytics, AI-driven insights, and enterprise-wide data management.
Roles & Responsibilities:
- Design, develop, and maintain scalable ETL/ELT pipelines to support structured, semi-structured, and unstructured data processing across the Enterprise Data Fabric.
- Implement real-time and batch data processing solutions, integrating data from multiple sources into a unified, governed data fabric architecture.
- Optimize big data processing frameworks using Apache Spark, Hadoop, or similar distributed computing technologies to ensure high availability and cost efficiency.
- Work with metadata management and data lineage tracking tools to enable enterprise-wide data discovery and governance.
- Ensure data security, compliance, and role-based access control (RBAC) across data environments.
- Optimize query performance, indexing strategies, partitioning, and caching for large-scale data sets.
- Develop CI/CD pipelines for automated data pipeline deployments, version control, and monitoring.
- Implement data virtualization techniques to provide seamless access to data across multiple storage systems.
- Collaborate with cross-functional teams, including data architects, business analysts, and DevOps teams, to align data engineering strategies with enterprise goals.
- Stay up to date with emerging data technologies and best practices, ensuring continuous improvement of Enterprise Data Fabric architectures.

Must-Have Skills:
- Hands-on experience with data engineering technologies such as Databricks, PySpark, Spark SQL, Apache Spark, AWS, Python, SQL, and Scaled Agile methodologies.
- Proficiency in workflow orchestration and performance tuning of big data processing.
- Strong understanding of AWS services.
- Experience with Data Fabric, Data Mesh, or similar enterprise-wide data architectures.
- Ability to quickly learn, adapt, and apply new technologies.
- Strong problem-solving and analytical skills.
- Excellent communication and teamwork skills.
- Experience with the Scaled Agile Framework (SAFe), Agile delivery practices, and DevOps practices.
Good-to-Have Skills:
- Deep expertise in the Biotech & Pharma industries.
- Experience writing APIs to make data available to consumers.
- Experience with SQL/NoSQL databases and vector databases for large language models.
- Experience with data modeling and performance tuning for both OLAP and OLTP databases.
- Experience with software engineering best practices, including but not limited to version control (Git, Subversion, etc.), CI/CD (Jenkins, Maven, etc.), automated unit testing, and DevOps.
Education and Professional Certifications
- 9 to 12 years of Computer Science, IT, or related field experience.
- AWS Certified Data Engineer preferred.
- Databricks certification preferred.
- Scaled Agile SAFe certification preferred.
Soft Skills:
- Excellent analytical and troubleshooting skills.
- Strong verbal and written communication skills.
- Ability to work effectively with global, virtual teams.
- High degree of initiative and self-motivation.
- Ability to manage multiple priorities successfully.
- Team-oriented, with a focus on achieving team goals.
- Ability to learn quickly; organized and detail-oriented.
- Strong presentation and public speaking skills.
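One of the optimizations listed above, partition pruning, can be illustrated in a few lines of plain Python (the paths are hypothetical; a real query engine does this from table metadata rather than string matching):

```python
# Hive-style partition directories as they might appear in a data lake.
partitions = [
    "s3://lake/sales/dt=2024-01-01",
    "s3://lake/sales/dt=2024-01-02",
    "s3://lake/sales/dt=2024-01-03",
]

def prune_partitions(parts, wanted_dt):
    """Partition pruning: keep only the directories matching the query
    filter, so the engine scans a fraction of the table instead of
    every file."""
    return [p for p in parts if p.endswith(f"dt={wanted_dt}")]

print(prune_partitions(partitions, "2024-01-02"))
# ['s3://lake/sales/dt=2024-01-02']
```

The same idea underlies predicate pushdown in Spark and Hive: a filter on the partition column never touches the other partitions' data.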

Posted 1 week ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

Remote


When you join Verizon
You want more out of a career. A place to share your ideas freely - even if they're daring or different. Where the true you can learn, grow, and thrive. At Verizon, we power and empower how people live, work and play by connecting them to what brings them joy. We do what we love - driving innovation, creativity, and impact in the world. Our V Team is a community of people who anticipate, lead, and believe that listening is where learning begins. In crisis and in celebration, we come together - lifting our communities and building trust in how we show up, everywhere & always. Want in? Join the #VTeamLife.

Director - Data Engineering
As a Director, you will lead a team focused on scaling and managing Data Engineering solutions and teams across the Verizon Consumer and Verizon Business domains. In addition, you will set the standards and build reusable engineering frameworks and services to industrialize the way data gets acquired, processed, and consumed. In this setup, you will lead a team of people managers, senior data engineers, and architects. You will build best-in-class solutions and services on premise, in the cloud, and/or on the edge, with the ability to handle both real-time and batch workloads. You will also devise AI strategies to make data intake to consumption a seamless and efficient process. You will also play a key role in making ours a great place to work through a strong focus on employee engagement, enablement, and talent development.

- Building a diverse team with a strong leadership pipeline and an inclusive culture, and attracting and retaining top engineering talent.
- Contributing to world-class employee satisfaction through the implementation of relevant organizational initiatives that build on employee commitment.
- Mentoring and guiding senior data experts and people leaders.
- Defining and continuously measuring the objectives and key results to drive an outcome-oriented practice across the team.
- Enabling career development via access to training resources, on-the-job coaching, and mentoring.
- Creating the strategy for the problem space in line with the overall vision of the organization and through feedback from internal and external stakeholders.
- Implementing reusable and fit-for-future data optimization techniques and services through an automate-first mindset.
- Implementing modern data industrialization practices across Cloud, Big Data, and relational database platforms and services.
- Driving a "Data Product" first mindset and building solutions aligned with those principles.
- Applying an AI-first approach to building data that delivers intelligent insights.
- Serving as a technical thought leader and advisor to the organization, with an external presence in the industry.
- Establishing best practices and governance routines to ensure adherence to SLAs and FinOps.
- Staying informed on the latest advancements in the AI and data technology space, and finding ways to deliver value by applying and customizing these to specific problem statements in the enterprise.
- Fostering an innovation culture where our employees are engaged in pioneering work that can be shared externally in a number of ways, like patents, open source, and publications.
- Inspiring our team to create monetizable services that can be leveraged both within Verizon and externally with Verizon's partners.

What We're Looking For...
You are a dynamic leader with outstanding analytical capabilities and strong technology leadership, along with hands-on and leadership experience in data engineering and end-to-end product development and delivery. You constantly challenge the status quo and identify innovative ways to drive your team towards new ways of working that create efficiencies.

You'll Need To Have
- Bachelor's degree or four or more years of work experience.
- Six or more years of relevant work experience.
- Experience leading and building data and full-stack engineering teams, with specific focus on GCP, AWS, and Hadoop.
- Experience in data architecture.
- Experience applying AI to the lifecycle of data.
- Experience delivering business outcomes in large and complex business organizations.
- Experience communicating complex designs and outcomes to a non-technical audience.
- Willingness to travel up to approximately 15% of the time.

Even better if you have one or more of the following:
- Master's degree in Computer Science or an MBA.
- Strong knowledge of the Consumer and Business enterprise domain, with expertise in Telecom.
- Experience with Relational Database Management Systems (RDBMS).
- Experience implementing Gen AI and Agentic AI solutions.
- Experience in end-to-end product lifecycle management.
- Experience driving significant cost optimization on Google Cloud.
#AI&D

Where you'll be working
In this hybrid role, you'll have a defined work location that includes work from home and assigned office days set by your manager.
Scheduled Weekly Hours: 40
Equal Employment Opportunity
Verizon is an equal opportunity employer. We evaluate qualified applicants without regard to race, gender, disability or any other legally protected characteristics.

Posted 1 week ago

Apply

2.0 - 4.0 years

6 - 8 Lacs

Noida

On-site


- Expertise in AWS services like EC2, CloudFormation, S3, IAM, SNS, SQS, EMR, Athena, Glue, Lake Formation, etc.
- Expertise in Hadoop/EMR/Databricks with good debugging skills to resolve Hive- and Spark-related issues.
- Sound fundamentals of database concepts and experience with relational or non-relational database types such as SQL, key-value, graph, etc.
- Experience in infrastructure provisioning using CloudFormation, Terraform, Ansible, etc.
- Experience in programming languages such as Python/PySpark.
- Excellent written and verbal communication skills.
Key Responsibilities
- Work closely with the data lake engineers to provide technical guidance, consultation, and resolution of their queries.
- Assist in the development of simple and advanced analytics best practices, processes, technology and solution patterns, and automation (including CI/CD).
- Work closely with various stakeholders on the US team with a collaborative approach.
- Develop data pipelines in Python/PySpark to be executed in the AWS cloud.
- Set up analytics infrastructure in AWS using CloudFormation templates.
- Develop mini/micro-batch and streaming ingestion patterns using Kinesis/Kafka.
- Seamlessly upgrade applications to higher versions, such as Spark/EMR upgrades.
- Participate in code reviews of the developed modules and applications.
- Provide inputs for formulating best practices for ETL processes/jobs written in languages such as PySpark, and for BI processes.
- Work with column-oriented data storage formats such as Parquet, interactive query services such as Athena, and the event-driven computing cloud service Lambda.
- Perform R&D on the latest big data offerings in the market, perform comparative analysis, and provide recommendations to choose the best tool for the current and future needs of the enterprise.
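The mini/micro-batch ingestion pattern in the responsibilities above can be sketched in plain Python: group an unbounded event stream into fixed-size batches for downstream processing (the batch size and data are illustrative; Kinesis/Kafka consumers add offsets, checkpointing, and time-based triggers on top of this):

```python
from itertools import islice

def micro_batches(events, batch_size):
    """Group a stream of events into fixed-size micro-batches,
    the core idea behind micro-batch ingestion: process small
    chunks continuously instead of one huge batch."""
    it = iter(events)
    while True:
        batch = list(islice(it, batch_size))
        if not batch:
            return  # stream exhausted
        yield batch

stream = range(7)  # stand-in for an event source
print(list(micro_batches(stream, 3)))
# [[0, 1, 2], [3, 4, 5], [6]]
```

Each yielded batch would typically be transformed and written as one unit, giving per-batch atomicity and bounded memory use.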
Required Qualifications
- Bachelor's or Master's degree in Computer Science or a similar field
- 2-4 years of strong experience in big data development
- Expertise in AWS services like EC2, CloudFormation, S3, IAM, SNS, SQS, EMR, Athena, Glue, Lake Formation, etc.
- Expertise in Hadoop/EMR/Databricks with good debugging skills to resolve Hive- and Spark-related issues
- Sound fundamentals of database concepts and experience with relational or non-relational database types such as SQL, key-value, graph, etc.
- Experience in infrastructure provisioning using CloudFormation, Terraform, Ansible, etc.
- Experience in programming languages such as Python/PySpark
- Excellent written and verbal communication skills
Preferred Qualifications
- Cloud certification (AWS, Azure, or GCP)
About Our Company
Ameriprise India LLP has been providing client-based financial solutions to help clients plan and achieve their financial objectives for 125 years. We are a U.S.-based financial planning company headquartered in Minneapolis with a global presence. The firm's focus areas include Asset Management and Advice, Retirement Planning, and Insurance Protection. Be part of an inclusive, collaborative culture that rewards you for your contributions, and work with other talented individuals who share your passion for doing great work. You'll also have plenty of opportunities to make your mark at the office and a difference in your community. So if you're talented, driven, and want to work for a strong ethical company that cares, take the next step and create a career at Ameriprise India LLP.
Ameriprise India LLP is an equal opportunity employer. We consider all qualified applicants without regard to race, color, religion, sex, genetic information, age, sexual orientation, gender identity, disability, veteran status, marital status, family status or any other basis prohibited by law.
Full-Time/Part-Time: Full time
Timings: 2:00p-10:30p
India Business Unit: AWMPO AWMP&S President's Office
Job Family Group: Technology

Posted 1 week ago

Apply

3.0 years

0 Lacs

Calcutta

Remote


- 3+ years' ability to programmatically procure Azure/AWS resources
- Experience implementing the WebSocket protocol
- Expertise in Git/GitHub
- Hands-on experience in Ansible is an advantage
- Ability to write cloud automation scripts in Java or Go (hands-on experience)

Discover a career with a greater purpose at CBNITS
Build resilience and nimbleness through automation. Clearly define and evangelise your mission/vision to the organisation. Recognize and pay off technical debt. See your people, measure your data.
BE A PART OF THE SMARTEST TEAM
This is your chance to work in a team that is full of smart people with excellent tech knowledge.
GET RECOGNIZED FOR YOUR CONTRIBUTION
Even your smallest contribution will get recognised. We express real care that goes beyond the standard pay check and benefits package.
FLEXIBLE WORKING HOURS
Work from home and work flexible hours; we allow you to tailor your work to suit your life outside the office.
CAREER DEVELOPMENT AND OPPORTUNITIES
From arranging virtual workshops to e-learning, we make it easy for employees to improve their core skills.
WHO WE ARE
CBNITS LLC, an MNC company in Fremont, USA, is the place where you are inspired to explore your passions, and where your talent is nurtured and cultivated. We have one development centre in India (Kolkata), providing full IT solutions to our clients for the last 7 years. We mostly deal with projects like Big Data Hadoop, Dynamics 365, IoT, SAP, Machine Learning, Deep Learning, Blockchain, Flutter, React JS & React Native, DevOps & Cloud AWS, Golang, etc.

Posted 1 week ago

Apply

5.0 years

3 - 5 Lacs

Calcutta

Remote


- 5+ years of relevant work experience
- A Palo Alto Networks PCNSE certification is preferred
- Fluency in English
- BS in Computer Science, computer engineering, systems, or a related field
- Strong communication and problem-solving skills, with the ability to communicate clearly and calmly with technical personnel in high-stress situations
- Proven experience in RCA and problem resolution
- A superior grasp of network and server troubleshooting, monitoring tools, and escalation processes
- Ability to multi-task and work in a dynamic environment with constant change to address emerging security risks and challenges
- Positive, growth-oriented mindset
- Thrives in a matrixed, team environment anchored by our values of Collaboration, Disruption, Execution, Inclusion, and Integrity

Discover a career with a greater purpose at CBNITS
Build resilience and nimbleness through automation. Clearly define and evangelise your mission/vision to the organisation. Recognize and pay off technical debt. See your people, measure your data.
BE A PART OF THE SMARTEST TEAM
This is your chance to work in a team that is full of smart people with excellent tech knowledge.
GET RECOGNIZED FOR YOUR CONTRIBUTION
Even your smallest contribution will get recognised. We express real care that goes beyond the standard pay check and benefits package.
FLEXIBLE WORKING HOURS
Work from home and work flexible hours; we allow you to tailor your work to suit your life outside the office.
CAREER DEVELOPMENT AND OPPORTUNITIES
From arranging virtual workshops to e-learning, we make it easy for employees to improve their core skills.
WHO WE ARE
CBNITS LLC, an MNC company in Fremont, USA, is the place where you are inspired to explore your passions, and where your talent is nurtured and cultivated. We have one development centre in India (Kolkata), providing full IT solutions to our clients for the last 7 years. We mostly deal with projects like Big Data Hadoop, Dynamics 365, IoT, SAP, Machine Learning, Deep Learning, Blockchain, Flutter, React JS & React Native, DevOps & Cloud AWS, Golang, etc.

Posted 1 week ago

Apply

3.0 years

1 - 3 Lacs

Calcutta

Remote


- 3+ years using ETL data movement tools
- Experience implementing data pipelines using AWS Lambda, Fargate, etc.
- Extensive experience with AWS data services: S3, RDS, EMR, ECS, DocumentDB, Step Functions, Athena, etc.
- Data storage platforms: MongoDB and Postgres
- Extensive experience writing and tuning SQL queries
- Programming in Python is required; one or more of Java, Scala, or Go is an added advantage.
- Experience with big data technologies such as HDFS, Kafka, Spark, and Flink is a plus.

Job description:
- Hands-on experience implementing real-time and batch use cases on AWS Lambda, Fargate, etc. in Python, managing data flows that integrate information from various sources into a common pool through ETL pipelines.
- Strong development, problem-solving, and algorithm skills, with a specialization in cloud data development focused on scalability, quality, and performance.
- Responsible for data ingestion, solution design, use case development, and post-production support and enhancements.
- Expert Python programming skills.
- Sound knowledge of AWS platform technologies and architecture.
- Working experience in an Agile development environment.

Discover a career with a greater purpose at CBNITS: build resilience and nimbleness through automation; clearly define and evangelise your mission and vision to the organisation; recognize and pay off technical debt; see your people, measure your data.

BE A PART OF THE SMARTEST TEAM: This is your chance to work in a team full of smart people with excellent tech knowledge.
GET RECOGNIZED FOR YOUR CONTRIBUTION: Even your smallest contribution will be recognised. We offer real care that goes beyond the standard pay cheque and benefits package.
FLEXIBLE WORKING HOURS: Work from home and work flexible hours; we allow you to tailor your work to suit your life outside the office.
CAREER DEVELOPMENT AND OPPORTUNITIES: From virtual workshops to e-learning, we make it easy for employees to improve their core skills.

WHO WE ARE: CBNITS LLC, an MNC headquartered in Fremont, USA, is a place where you are inspired to explore your passions and where your talent is nurtured and cultivated. Our development centre in Kolkata, India, has been providing full IT solutions to our clients for the last 7 years. We mostly work on projects such as Big Data Hadoop, Dynamics 365, IoT, SAP, Machine Learning, Deep Learning, Blockchain, Flutter, React JS & React Native, DevOps & Cloud AWS, and Golang.
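The kind of Lambda-based ETL work this role describes can be sketched in a few lines of Python. The event shape and field names below are illustrative assumptions, not taken from any real feed:

```python
import json

def transform(record: dict) -> dict:
    """Normalize one raw event into a common-pool schema.
    The field names here are hypothetical, for illustration only."""
    return {
        "id": str(record["id"]),
        "amount_cents": int(round(float(record.get("amount", 0)) * 100)),
        "source": record.get("source", "unknown").lower(),
    }

def handler(event, context=None):
    """Lambda-style entry point: extract records from the incoming
    event, transform each one, and return the load-ready batch."""
    records = event.get("records", [])
    return {"batch": [transform(r) for r in records]}

if __name__ == "__main__":
    sample = {"records": [{"id": 7, "amount": "12.5", "source": "POS"}]}
    print(json.dumps(handler(sample)))
```

In a real deployment the load step would typically write the batch to S3 or Postgres rather than return it; keeping the transform a pure function makes it easy to unit-test outside AWS.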

Posted 1 week ago

Apply

2.0 years

2 - 4 Lacs

Calcutta

Remote


- 2+ years of backend software engineering experience developing highly reliable, scalable products and services.
- Strong knowledge of the Go programming language.
- Familiarity with RDBMS and at least one NoSQL database; GraphQL, MySQL, and Redis preferred.
- Actively participate in product design, team discussion, product development, and product documentation.
- Collaborate with other developers across different regions.
- Understand high availability, scalability, and concurrency management.
- Strong knowledge of data structures, design patterns, algorithms, etc.
- Very strong capacity for problem analysis and solving.
- Strong knowledge of software implementation best practices.

Posted 1 week ago

Apply

2.0 years

4 - 6 Lacs

Calcutta

Remote


- At least 2 years of experience as a Blockchain developer.
- Programming: Node.js and Solidity; ReactJS is an additional plus.
- Experience with Ethereum and cryptography in Blockchain.
- In-depth knowledge of best practices in Blockchain management and data protection.
- Strong software development background.
- Strong knowledge of Bitcoin-like blockchains.
- Experience working with open-source projects.
- Advanced analytical and problem-solving skills.
- Experience with DevOps practices and tools such as Git.

Posted 1 week ago

Apply

3.0 years

0 Lacs

Calcutta

Remote


- 3+ years of development experience on the Sitecore platform (Sitecore OrderCloud) and C#.
- Strong understanding of OrderCloud technologies and infrastructure, as well as experience designing and deploying Sitecore applications.
- Experience using Sitecore xDB to create personalized user experiences.
- Implement Sitecore best practices as and when required.
- Experience in system design, implementation, scalability, and interoperability of enterprise systems and web infrastructure.
- Sitecore OrderCloud certification preferred.
- Good working knowledge of CSS/HTML and JavaScript libraries.
- Excellent communication and interpersonal skills.
- Candidates from product-based companies preferred.

Posted 1 week ago

Apply

5.0 years

5 - 8 Lacs

Calcutta

Remote


- 5+ years of hands-on development experience.
- Good knowledge of Java 8+, AWS, Spring Boot, Hibernate, and JUnit (or any unit testing framework).
- Able to write SQL queries (preferably MySQL/Oracle).
- Conversant with design patterns and industry best practices in coding.
- Experience debugging production applications.
- Experience running and analyzing performance tests with JMeter or a similar tool preferred.
- AWS experience preferred.
- Application design experience preferred.

Posted 1 week ago

Apply

3.0 years

4 - 9 Lacs

Calcutta

Remote


- 3+ years of experience setting out comprehensive test plans and scenarios (why: to evaluate how well-rounded and experienced candidates are in their field).
- Groom test cases and acceptance criteria in JIRA (why: test cases will be groomed and tracked via JIRA, so familiarity with the platform is important, on top of being able to work independently given the work in store).
- Ability to articulate testing methodologies for various scenarios (why: to evaluate whether candidates approach a project in a clear, structured manner with best practices embedded).
- Sitecore testing.

Posted 1 week ago

Apply

3.0 years

5 - 8 Lacs

Calcutta

Remote


- 3+ years of experience; must be proficient with Angular 8 and above.
- Hands-on experience with HTML, CSS, and JavaScript (must).
- Knowledge of Angular migration, security implementation, authorization, interceptors, charts, and multilingual support.
- Experience with role-based access control (RBAC) is an advantage.

Posted 1 week ago

Apply

3.0 years

2 - 4 Lacs

Calcutta

Remote


- 3+ years of experience.
- Support various compliance activities conducted by the global team, ranging from third-party audits to customer self-assessments.
- Assist in identifying customers' organisation structure using internal systems and third-party data sources such as Hoovers and D&B.
- Understand the company's profile and its subsidiaries.
- Understand key development strategies adopted by companies, such as mergers, de-mergers, and acquisitions.
- Check company details using open sources such as Google, Wikipedia, and annual reports.
- Ability to produce weekly, monthly, quarterly, and yearly reports.
- Work closely with internal teams to share requested data within the agreed timeframe.
- B.E., B.S., or MBA, preferably in Computer Science, Management of Information Systems, Operations, Finance, or Accounting.
- Experience in data analytics, market analysis, or reporting.
- Proven ability to deliver results against a specific set of goals.
- Ability to communicate cross-functionally.
- Unquestionable ethics, integrity, and business judgment.
- Analyse and fetch data from Oracle 11i, R12, and Odyssey.

Posted 1 week ago

Apply

Exploring Hadoop Jobs in India

The demand for Hadoop professionals in India has been on the rise in recent years, with many companies leveraging big data technologies to drive business decisions. As a job seeker exploring opportunities in the Hadoop field, it is important to understand the job market, salary expectations, career progression, related skills, and common interview questions.

Top Hiring Locations in India

  1. Bangalore
  2. Mumbai
  3. Pune
  4. Hyderabad
  5. Chennai

These cities are known for their thriving IT industry and have a high demand for Hadoop professionals.

Average Salary Range

The average salary range for Hadoop professionals in India varies based on experience levels. Entry-level Hadoop developers can expect to earn between INR 4-6 lakhs per annum, while experienced professionals with specialized skills can earn upwards of INR 15 lakhs per annum.

Career Path

In the Hadoop field, a typical career path may include roles such as Junior Developer, Senior Developer, Tech Lead, and eventually progressing to roles like Data Architect or Big Data Engineer.

Related Skills

In addition to Hadoop expertise, professionals in this field are often expected to have knowledge of related technologies such as Apache Spark, HBase, Hive, and Pig. Strong programming skills in languages like Java, Python, or Scala are also beneficial.
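For hands-on practice with these skills, Hadoop Streaming lets you write MapReduce jobs in any language that reads stdin and writes stdout. The classic word count, sketched below in plain Python, shows the idea; the function names are our own, and only the mapper/reducer contract comes from Hadoop:

```python
import sys
from itertools import groupby

def mapper(lines):
    """Map phase: emit a (word, 1) pair for every word in the input."""
    for line in lines:
        for word in line.strip().lower().split():
            yield word, 1

def reducer(pairs):
    """Reduce phase: sum the counts for each word. Hadoop delivers
    the pairs sorted by key, which the groupby below relies on."""
    for word, group in groupby(sorted(pairs), key=lambda kv: kv[0]):
        yield word, sum(count for _, count in group)

if __name__ == "__main__":
    # Run locally: pipe a text file into this script.
    for word, n in sorted(reducer(mapper(sys.stdin))):
        print(f"{word}\t{n}")
```

In a real Streaming job, the mapper and reducer would ship as two separate scripts and the framework would handle the sort and the key grouping between them.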

Interview Questions

  • What is Hadoop and how does it work? (basic)
  • Explain the difference between HDFS and MapReduce. (medium)
  • How do you handle data skew in Hadoop? (medium)
  • What is YARN in Hadoop? (basic)
  • Describe the concept of NameNode and DataNode in HDFS. (medium)
  • What are the different types of join operations in Hive? (medium)
  • Explain the role of the ResourceManager in YARN. (medium)
  • What is the significance of the shuffle phase in MapReduce? (medium)
  • How does speculative execution work in Hadoop? (advanced)
  • What is the purpose of the Secondary NameNode in HDFS? (medium)
  • How do you optimize a MapReduce job in Hadoop? (medium)
  • Explain the concept of data locality in Hadoop. (basic)
  • What are the differences between Hadoop 1 and Hadoop 2? (medium)
  • How do you troubleshoot performance issues in a Hadoop cluster? (advanced)
  • Describe the advantages of using HBase over traditional RDBMS. (medium)
  • What is the role of the JobTracker in Hadoop? (medium)
  • How do you handle unstructured data in Hadoop? (medium)
  • Explain the concept of partitioning in Hive. (medium)
  • What is Apache ZooKeeper and how is it used in Hadoop? (advanced)
  • Describe the process of data serialization and deserialization in Hadoop. (medium)
  • How do you secure a Hadoop cluster? (advanced)
  • What is the CAP theorem and how does it relate to distributed systems like Hadoop? (advanced)
  • How do you monitor the health of a Hadoop cluster? (medium)
  • Explain the differences between Hadoop and traditional relational databases. (medium)
  • How do you handle data ingestion in Hadoop? (medium)
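Several of the questions above (the shuffle phase, partitioning, speculative execution) come down to how the framework routes mapper output to reducers. A toy Python model of the shuffle, using a deterministic stand-in for Hadoop's HashPartitioner, can make the idea concrete; this is a sketch for interview preparation, not how Hadoop is implemented internally:

```python
from collections import defaultdict

def partitioner(key, num_reducers):
    """Deterministic stand-in for Hadoop's HashPartitioner: maps a
    key to one of num_reducers partitions."""
    return sum(ord(c) for c in str(key)) % num_reducers

def shuffle(mapped, num_reducers=2):
    """Toy model of the shuffle phase: route every (key, value) pair
    emitted by the mappers to a reducer partition, then sort each
    partition by key, as the framework does before reduce() runs."""
    partitions = defaultdict(list)
    for key, value in mapped:
        partitions[partitioner(key, num_reducers)].append((key, value))
    return {r: sorted(pairs) for r, pairs in partitions.items()}
```

The invariant worth stating in an interview: every pair with the same key lands in the same partition, so a single reducer sees all values for that key in sorted order.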

Closing Remark

As you navigate the Hadoop job market in India, remember to stay updated on the latest trends and technologies in the field. By honing your skills and preparing diligently for interviews, you can position yourself as a strong candidate for lucrative opportunities in the big data industry. Good luck on your job search!
