2453 Hive Jobs - Page 32

JobPe aggregates listings for easy access to applications; you apply directly on the original job portal.

1.0 - 3.0 years

3 - 7 Lacs

Chennai

Hybrid

Source: Naukri

Strong experience in Python. Good experience in Databricks. Experience working on the AWS/Azure cloud platforms. Experience working with REST APIs and services, and with messaging and event technologies. Experience with ETL or data pipeline tools. Experience with streaming platforms such as Kafka. Demonstrated experience working with large and complex data sets. Ability to document data pipeline architecture and design. Experience in Airflow is nice to have. Ability to build complex Delta Lake solutions.
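For context, a minimal PySpark sketch of the kind of pipeline this listing describes: streaming JSON events from Kafka into a Delta Lake table. It assumes the spark-sql-kafka and delta-spark packages are available (bundled on Databricks); the broker, topic, schema, and paths are illustrative placeholders.

```python
# Minimal sketch: stream JSON events from Kafka into a Delta Lake table.
# Broker, topic, schema, and paths are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import DoubleType, StringType, StructField, StructType

spark = SparkSession.builder.appName("events-to-delta").getOrCreate()

schema = StructType([
    StructField("event_id", StringType()),
    StructField("amount", DoubleType()),
])

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "events")
    .load()
    .select(from_json(col("value").cast("string"), schema).alias("e"))
    .select("e.*")
)

(
    events.writeStream.format("delta")
    .option("checkpointLocation", "/mnt/checkpoints/events")
    .outputMode("append")
    .start("/mnt/delta/events")
)
```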

Posted 1 week ago

Apply

3.0 - 8.0 years

7 - 11 Lacs

Mumbai

Work from Office

Source: Naukri

3+ years of hands-on experience with the Collibra tool. Knowledge of Collibra DGC version 5.7 and onward. Experience in Spring Boot development. Experience in Groovy and Flowable for BPMN workflow development. Experience with both business and technical metadata. Experience with platform activities such as job server setup and upgrade. Working as SME in data governance, metadata management, and data catalog solutions, specifically on Collibra Data Governance. Client interfacing and consulting skills required. Experience in data governance across a wide variety of data types (structured, semi-structured, and unstructured) and a wide variety of data sources (HDFS, S3, Kafka, Cassandra, Hive, HBase, Elasticsearch). Partner with Data Stewards on requirements, integrations, and processes; participate in meetings and working sessions. Partner with Data Management and integration leads to improve data management technologies and processes. Working experience with the Collibra operating model, workflow BPMN development, and integrating various applications or systems with Collibra. Experience in setting up people's roles, responsibilities and controls, data ownership, workflows, and common processes. Integrate Collibra with other enterprise tools: data quality tools, data catalog tools, and master data management solutions. Develop and configure all Collibra customized workflows. Develop APIs (REST, SOAP) to expose metadata functionality to end users. Location: Pan India.
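As a rough illustration of the API integration work mentioned above, the sketch below queries assets over Collibra's REST API with Python's requests. The /rest/2.0/assets endpoint and result fields follow Collibra's public REST API documentation, but versions differ; treat the URL, auth, and fields as assumptions to verify against your DGC instance.

```python
# Hedged sketch: pull assets from the Collibra DGC REST API with requests.
# Endpoint, auth scheme, and result fields should be verified per DGC version.
import requests

BASE_URL = "https://collibra.example.com"  # placeholder instance URL
session = requests.Session()
session.auth = ("svc_user", "secret")  # or a bearer token, per your setup

resp = session.get(
    f"{BASE_URL}/rest/2.0/assets",
    params={"name": "Customer", "limit": 10},
    timeout=30,
)
resp.raise_for_status()
for asset in resp.json().get("results", []):
    print(asset["id"], asset["name"])
```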

Posted 1 week ago

Apply

4.0 - 9.0 years

2 - 6 Lacs

Bengaluru

Work from Office

Source: Naukri

Roles and Responsibilities: 4+ years of experience as a data developer using Python. Knowledge of Spark/PySpark is preferable but not mandatory. Azure cloud experience preferred; alternate cloud experience is fine. Preferred experience with the Azure platform, including Azure Data Lake, Databricks, and Data Factory. Working knowledge of different file formats such as JSON, Parquet, and CSV. Familiarity with data encryption and data masking. Database experience in SQL Server is preferable; experience with NoSQL databases like MongoDB is preferred. Team player, reliable, self-motivated, and self-disciplined.
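To make the file-format and masking requirements concrete, here is a small PySpark sketch that reads the formats named above and applies a simple hash-based masking step. Paths and column names are illustrative; real masking policies are usually more involved.

```python
# Sketch: read common file formats with PySpark and mask a sensitive column.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, sha2

spark = SparkSession.builder.appName("formats-and-masking").getOrCreate()

csv_df = spark.read.option("header", True).csv("/data/in.csv")
json_df = spark.read.json("/data/in.json")
parquet_df = spark.read.parquet("/data/in.parquet")

# Mask a sensitive column by replacing it with a SHA-256 digest.
masked = csv_df.withColumn("email_hash", sha2(col("email"), 256)).drop("email")
masked.write.mode("overwrite").parquet("/data/out.parquet")
```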

Posted 1 week ago

Apply

1.0 - 3.0 years

2 - 5 Lacs

Chennai

Work from Office

Source: Naukri

Mandatory Skills: AWS, Python, SQL, Spark, Airflow, Snowflake. Responsibilities: Create and manage cloud resources in AWS. Ingest data from different sources that expose data through different technologies, such as RDBMS, REST HTTP APIs, flat files, streams, and time-series data from various proprietary systems. Implement data ingestion and processing with the help of big data technologies. Process and transform data using various technologies such as Spark and cloud services. You will need to understand your part of the business logic and implement it using the language supported by the base data platform. Develop automated data quality checks to make sure the right data enters the platform and to verify the results of calculations. Develop infrastructure to collect, transform, combine and publish/distribute customer data. Define process improvement opportunities to optimize data collection, insights and displays. Ensure data and results are accessible, scalable, efficient, accurate, complete and flexible. Identify and interpret trends and patterns in complex data sets. Construct a framework utilizing data visualization tools and techniques to present consolidated analytical and actionable results to relevant stakeholders. Key participant in regular Scrum ceremonies with the agile teams. Proficient at developing queries, writing reports and presenting findings. Mentor junior members and bring best industry practices.
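As a sketch of the orchestration side of this role, the snippet below wires a two-task Airflow DAG that runs ingestion ahead of a data quality gate. The DAG id and task bodies are placeholders, not a prescribed design.

```python
# Sketch: a two-task Airflow DAG, ingestion followed by a quality gate.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def ingest():
    # placeholder: pull from RDBMS / REST APIs / files into staging (e.g. S3)
    print("ingest step")

def quality_check():
    # placeholder: verify row counts and schema before loading downstream
    print("quality check step")

with DAG(
    dag_id="ingest_with_quality_gate",
    start_date=datetime(2025, 1, 1),
    schedule_interval="@daily",  # 'schedule' on Airflow >= 2.4
    catchup=False,
) as dag:
    t_ingest = PythonOperator(task_id="ingest", python_callable=ingest)
    t_check = PythonOperator(task_id="quality_check", python_callable=quality_check)
    t_ingest >> t_check
```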

Posted 1 week ago

Apply

5.0 - 10.0 years

5 - 9 Lacs

Mumbai

Work from Office

Source: Naukri

Roles & Responsibilities: Must have 5+ years of hands-on experience in Azure cloud development (ADF + Databricks) - mandatory. Strong in Azure SQL; knowledge of Synapse/Analytics is good to have. Experience working on Agile projects and familiarity with Scrum/SAFe ceremonies. Good written and verbal communication skills; can work directly with the customer. Ready to work in the 2nd shift; flexible. Defines, designs, develops, and tests software components/applications using Microsoft Azure: Databricks, ADF, ADL, Hive, Python, Spark SQL, PySpark. Expertise in Azure Databricks, ADF, ADL, Hive, Python, Spark, PySpark. Strong T-SQL skills with experience in Azure SQL DW. Experience handling structured and unstructured datasets. Experience in data modeling and advanced SQL techniques. Experience implementing Azure Data Factory pipelines using the latest technologies and techniques. Good exposure to application development. The candidate should work independently with minimal supervision.
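As an illustration of the Spark SQL work this role lists, here is a minimal sketch of a Databricks-style transform an ADF-triggered job might run; the database, table, and column names are invented for the example.

```python
# Sketch: a Spark SQL transform with a window aggregate, saved back as a table.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("adf-databricks-job").getOrCreate()

result = spark.sql("""
    SELECT customer_id,
           order_date,
           SUM(amount) OVER (
               PARTITION BY customer_id ORDER BY order_date
           ) AS running_total
    FROM sales.orders
""")
result.write.mode("overwrite").saveAsTable("sales.orders_running_totals")
```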

Posted 1 week ago

Apply

2.0 - 5.0 years

2 - 5 Lacs

Chennai

Work from Office

Source: Naukri

Must Have: Strong work experience as a Denodo Administrator. Experience in T-SQL programming. Experience working in Denodo v6 and Denodo v8. Strong knowledge of the ITIL process. Excellent analytical, debugging, communication & reporting skills. Good to Have: Knowledge of clustering & load balancing, and platform upgrades. Experience in Control-M, ServiceNow, Azure DevOps. Domain: Banking. Roles & Responsibility: Role: Denodo Administrator. Responsible for monitoring, cache & DB refresh, DB connection changes, access provisioning, and proactive prevention of disk space issues. Responsible for automation of memory & SQL alerts. Excellent communication skills; should work directly with the customer. Need to work in shifts to provide support.

Posted 1 week ago

Apply

2.0 years

0 Lacs

Bengaluru East, Karnataka, India

Remote

Source: LinkedIn

Visa is a world leader in payments and technology, with over 259 billion payments transactions flowing safely between consumers, merchants, financial institutions, and government entities in more than 200 countries and territories each year. Our mission is to connect the world through the most innovative, convenient, reliable, and secure payments network, enabling individuals, businesses, and economies to thrive while driven by a common purpose – to uplift everyone, everywhere by being the best way to pay and be paid. Make an impact with a purpose-driven industry leader. Join us today and experience Life at Visa. Job Description The Client Services BI & Analytics team strives to create an open, trusting data culture where the cost of curiosity – the number of steps, amount of time, and complexity of effort needed to use operational data to derive insights – is as low as possible. We govern Client Services’ operational data and metrics, create easily usable dashboards and data sources, and analyze data to share insights. We are a part of the Client Services Global Business Operations function and work with all levels of stakeholders, from executive leaders sharing insights with the C-Suite to customer-facing colleagues who rely on our assets to incorporate data into their daily responsibilities. This specialist role makes data available from new sources, builds robust data models, creates and optimizes data enrichment pipelines, and provides engineering support to specific projects. You will partner with our Data Visualizers and Solution Designers to ensure that data needed by the business is available and accurate and to develop certified data sets. This technical lead and architect role is a force multiplier to our Visualizers, Analysts, and other data users across Client Services. Responsibilities Design, develop, and maintain scalable data pipelines and systems. Monitor and troubleshoot data pipeline issues to ensure seamless data flow. Establish data processes and automation based on business and technology requirements, leveraging Visa’s supported data platforms and tools Deliver small to large data engineering and Machine learning projects either individually or as part of a project team Setup ML Ops pipelines to Productionalize ML models and setting up Gen AI pipelines Collaborate with cross-functional teams to understand data requirements and ensure data quality, with a focus on implementing data validation and data quality checks at various stages of the pipeline Provide expertise in data warehousing, ETL, and data modeling to support data-driven decision making, with a strong understanding of best practices in data pipeline design and performance optimization Extract and manipulate large datasets using standard tools such as Hadoop (Hive), Spark, Python (pandas, NumPy), Presto, and SQL Develop data solutions using Agile principles Provide ongoing production support Communicate complex concepts in a clear and effective manner Stay up to date with the latest data engineering trends and technologies to ensure the company's data infrastructure is always state-of-the-art, with an understanding of best practices in cloud-based data engineering This is a remote position. A remote position does not require job duties be performed within proximity of a Visa office location. Remote positions may be required to be present at a Visa office with scheduled notice. Qualifications Basic Qualifications -2 or more years of work experience with a Bachelor’s Degree or an Advanced Degree (e.g. 
Masters, MBA, JD, MD, or PhD) Preferred Qualifications -3 or more years of work experience with a Bachelor’s Degree or more than 2 years of work experience with an Advanced Degree (e.g. Masters, MBA, JD, MD) -3+ years of work experience with a bachelor’s degree in a STEM field -Strong experience with SQL, Python, Hadoop, Spark, Hive, Airflow, and MPP databases -5+ years of analytics experience with a focus on data engineering and AI -Experience with both traditional data warehousing tools and techniques (such as SSIS, ODI, and on-prem SQL Server, Oracle) as well as modern technologies (such as Hadoop, Denodo, Spark, Airflow, and Python), and a solid understanding of best practices in data engineering -Advanced knowledge of SQL (e.g., understands subqueries, self-joining tables, stored procedures, can read an execution plan, SQL tuning, etc.) -Solid understanding of best practices in data warehousing, ETL, data modeling, and data architecture -Experience with NoSQL databases (e.g., MongoDB, Cassandra) -Experience with cloud-based data warehousing and data pipeline management (AWS, GCP, Azure) -Experience in Python, Spark, and exposure to scheduling tools like Tuber/Airflow is preferred -Able to create data dictionaries, set up and monitor data validation alerts, and execute periodic jobs to maintain data pipelines for completed projects -Experience with visualization software (e.g., Tableau, QlikView, Power BI) is a plus -A team player and collaborator, able to work well with a diverse group of individuals in a matrixed environment Additional Information Visa is an EEO Employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability or protected veteran status. Visa will also consider for employment qualified applicants with criminal histories in a manner consistent with EEOC guidelines and applicable local law.
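As an illustration of the data validation alerts this role calls for, here is a small hedged pandas sketch that fails loudly when a daily extract looks wrong. The file path, column names, and thresholds are all assumptions.

```python
# Hedged sketch: basic pandas data validation on a daily extract.
import pandas as pd

df = pd.read_parquet("daily_extract.parquet")  # placeholder path

issues = []
if df["transaction_id"].duplicated().any():
    issues.append("duplicate transaction_ids")
if df["amount"].isna().mean() > 0.01:  # more than 1% null amounts
    issues.append("null rate on amount exceeds 1%")

if issues:
    raise ValueError("data validation failed: " + "; ".join(issues))
print("validation passed:", len(df), "rows")
```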

Posted 1 week ago

Apply

1.0 - 3.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Source: LinkedIn

About PhonePe Group: PhonePe is India’s leading digital payments company with 50 crore (500 Million) registered users and 3.7 crore (37 Million) merchants covering over 99% of the postal codes across India. On the back of its leadership in digital payments, PhonePe has expanded into financial services (Insurance, Mutual Funds, Stock Broking, and Lending) as well as adjacent tech-enabled businesses such as Pincode for hyperlocal shopping and Indus App Store which is India's first localized App Store. The PhonePe Group is a portfolio of businesses aligned with the company's vision to offer every Indian an equal opportunity to accelerate their progress by unlocking the flow of money and access to services. Culture At PhonePe, we take extra care to make sure you give your best at work, Everyday! And creating the right environment for you is just one of the things we do. We empower people and trust them to do the right thing. Here, you own your work from start to finish, right from day one. Being enthusiastic about tech is a big part of being at PhonePe. If you like building technology that impacts millions, ideating with some of the best minds in the country and executing on your dreams with purpose and speed, join us! PhonePe is seeking passionate BI Engineers with 1-3 years of experience, ideally in Qlik Sense, to drive data availability and insights at scale. If you're driven by data and constantly seek better ways, join our innovative team! What would you get to do in this role? Work with large-scale datasets and solve real-world data modeling challenges to ensure scalability, flexibility, and efficiency in reporting and analytics. Develop interactive Qlik dashboards for various stakeholders to support data-driven decision-making. Help build and optimize data models that support robust reporting and analytics capabilities, while ensuring seamless integration with the organization’s data architecture. Collaborate with stakeholders to understand data requirements and ensure the right data is provided at the right time. Use modern open-source tools and technologies in the data processing stack, with opportunities to experiment and implement automation to improve data workflows. Contribute to the design and development of scalable data warehousing pipelines to process and aggregate raw data into actionable insights. Learn and grow in a dynamic environment, gaining expertise in BI and data visualization best practices. What do you need to have to apply for this position? 1-3 years of BI experience in relevant roles, preferably in a product-based firm. Proficient with Qlik Sense development, dashboard design and performance optimization. Proficient in creating and managing Qlik Sense reports, charts, and visualizations. Data warehousing, modeling & data flow understanding is desired. Strong knowledge in SQL - Hive experience will be preferred. Translate complex business requirements into interactive dashboards and reports. Good in collaboration and execution rigour. 
PhonePe Full Time Employee Benefits (Not applicable for Intern or Contract Roles) Insurance Benefits - Medical Insurance, Critical Illness Insurance, Accidental Insurance, Life Insurance Wellness Program - Employee Assistance Program, Onsite Medical Center, Emergency Support System Parental Support - Maternity Benefit, Paternity Benefit Program, Adoption Assistance Program, Day-care Support Program Mobility Benefits - Relocation benefits, Transfer Support Policy, Travel Policy Retirement Benefits - Employee PF Contribution, Flexible PF Contribution, Gratuity, NPS, Leave Encashment Other Benefits - Higher Education Assistance, Car Lease, Salary Advance Policy Working at PhonePe is a rewarding experience! Great people, a work environment that thrives on creativity, the opportunity to take on roles beyond a defined job description are just some of the reasons you should work with us. Read more about PhonePe on our blog.

Posted 1 week ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

Source: LinkedIn

Our Purpose Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we’re helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart and accessible. Our technology and innovation, partnerships and networks combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest potential. Title And Summary Senior Specialist, Product Management-1 Who is Mastercard? Mastercard is a global technology company in the payments industry. Our mission is to connect and power an inclusive, digital economy that benefits everyone, everywhere by making transactions safe, simple, smart, and accessible. Using secure data and networks, partnerships and passion, our innovations and solutions help individuals, financial institutions, governments, and businesses realize their greatest potential. Our decency quotient, or DQ, drives our culture and everything we do inside and outside of our company. With connections across more than 210 countries and territories, we are building a sustainable world that unlocks priceless possibilities for all. Overview The Senior Specialist of Product Management will report to the Vice President of Account Level Management (ALM) in Global Consumer Products & Processing responsible for client management and management of analytical solutions related to ALM. This Senior Specialist will partner with our internal stakeholders in Regional Teams and our Product Management Team to manage on-going strategic relationships with key clients through our Global ALM suite of solutions. This candidate should have the ability to collaborate across a diverse group of internal stakeholders & regional partners, effectively manage multiple priorities and demands, and possess a deep understanding of transaction processing & the credit card industry. Role This Senior Specialist will lead development and execution of analytical solutions across multiple customers. The role will require strong partnership skills as this Senior Specialist will be partnering with our regional lead in the US to ensure accurate execution of customer contract terms and partnering with customers to set-up testing and validation for the solutions leveraged. Quarterly monitoring and reporting on solution validity will be required as a measure of success. The Senior Specialist in this role will manage the relationship with the client on solution deployment and any impacts, while also identifying opportunities to scale the solutions improving customer penetration in partnership with the ALM Product Lead. This role will require the ability to collaborate across a diverse group of internal global stakeholders & regional partners, effectively manage multiple priorities and demands, and possess a deep understanding of transaction processing & the payments card industry as it continues to evolve into a digital footprint. The role will require availability during other key regional time zones. This candidate should be intellectually curious, energetic, a self-starter and able to operate with a sense of urgency. In addition, the role requires an individual who can demonstrate discipline in prioritizing efforts, and the ability to be comfortable managing through ambiguity. 
Candidate needs to have strong communication skills with an ability to refine and adjust communication to gain support & sponsorship from Executive Management, and experience in driving execution and alignment with Regional Teams, who may not share the same sense of prioritization or urgency. All About You A Bachelor’s Degree in business, finance, marketing, product management, or related field, or equivalent work experience (required) Knowledge / Experience Master’s Degree or equivalent work experience (preferred) Knowledge of Mastercard product and services suite (desirable) Proficient in Python or R, Hive, Tableau, MSBI applications, and VBA Experience with statistical modelling and predictive analytical techniques (preferred) Experience in overseeing multiple projects and initiatives concurrently Understanding of competitive offerings and industry trends Experience in working collaboratively in a cross-functional role operating with a sense of urgency to drive results Ability to influence and motivate others to achieve objectives Ability to think big and bold, innovate with intention, and deliver scalable solutions Ability to digest complex ideas and organize them into executable tasks Strong work ethic and a master of time management, organization, detail orientation, task initiation, planning and prioritization Self-starter and motivated to work independently with a proven track record of delivering success while operating within a team environment Corporate Security Responsibility All activities involving access to Mastercard assets, information, and networks come with an inherent risk to the organization and, therefore, it is expected that every person working for, or on behalf of, Mastercard is responsible for information security and must: abide by Mastercard’s security policies and practices; ensure the confidentiality and integrity of the information being accessed; report any suspected information security violation or breach; and complete all periodic mandatory security trainings in accordance with Mastercard’s guidelines. R-243116

Posted 1 week ago

Apply

5.0 years

5 - 9 Lacs

Hyderābād

On-site

Source: Glassdoor

- 5+ years of data scientist experience - Experience with data scripting languages (e.g. SQL, Python, R, etc.) or statistical/mathematical software (e.g. R, SAS, or Matlab) - Experience with statistical models, e.g. multinomial logistic regression - Experience in data applications using large-scale distributed systems (e.g., EMR, Spark, Elasticsearch, Hadoop, Pig, and Hive) - Experience working collaboratively with data engineers and business intelligence engineers - Demonstrated expertise in a wide range of ML techniques The AOP (Analytics Operations and Programs) team is responsible for creating core analytics, insight generation and science capabilities for ROW Ops. We develop scalable analytics applications, AI/ML products and research models to optimize operational processes. You will work with Product Managers, Data Engineers, Data Scientists, Research Scientists, Applied Scientists and Business Intelligence Engineers, using rigorous quantitative approaches to ensure high-quality data/science products for our customers around the world. We are looking for a Sr. Data Scientist to join our growing Science Team. As a Data Scientist, you will use a range of science methodologies to solve challenging business problems where the solution is unclear. You will be responsible for building ML models to solve complex business problems and testing them in a production environment. The scope of the role includes defining the charter for the project and proposing solutions which align with the org's priorities and production constraints while still creating impact. You will achieve this by leveraging strong leadership and communication skills, data science skills, and by acquiring domain knowledge pertaining to the delivery operations systems. You will provide ML thought leadership to technical and business leaders, and possess the ability to think strategically about business, product, and technical challenges. You will also be expected to contribute to the science community by participating in science reviews and publishing in internal or external ML conferences. Our team solves a broad range of problems that can be scaled across ROW (Rest of the World, including countries like India, Australia, Singapore, MENA and LATAM). Here is a glimpse of the problems this team deals with on a regular basis: • Using live package and truck signals to adjust truck capacities in real-time • HOTW models for Last Mile Channel Allocation • Using LLMs to automate analytical processes and insight generation • Ops research to optimize middle-mile truck routes • Working with global partner science teams to affect Reinforcement Learning based pricing models and estimating Shipments Per Route for $MM savings • Deep Learning models to synthesize attributes of addresses • Abuse detection models to reduce network losses Key job responsibilities 1. Use machine learning and analytical techniques to create scalable solutions for business problems; analyze and extract relevant information from large amounts of Amazon’s historical business data to help automate and optimize key processes 2. Design, develop, evaluate and deploy innovative and highly scalable ML/OR models 3. Work closely with other science and engineering teams to drive real-time model implementations 4. Work closely with Ops/Product partners to identify problems and propose machine learning solutions 5. Establish scalable, efficient, automated processes for large-scale data analyses, model development, model validation and model maintenance 6. Work proactively with engineering teams and product managers to evangelize new algorithms and drive the implementation of large-scale complex ML models in production 7. Lead projects and mentor other scientists and engineers in the use of ML techniques. Experience as a leader and mentor on a data science team. Master's degree in a quantitative field such as statistics, mathematics, data science, business analytics, economics, finance, engineering, or computer science. Expertise in Reinforcement Learning and Gen AI is preferred. Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.
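Since the qualifications call out statistical models such as multinomial logistic regression, here is a minimal scikit-learn sketch fit on synthetic three-class data; the features, classes, and split are purely illustrative.

```python
# Sketch: multinomial logistic regression with scikit-learn on synthetic data.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 5))      # synthetic features
y = rng.integers(0, 3, size=1000)   # three classes

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)  # multinomial by default
print("test accuracy:", accuracy_score(y_te, clf.predict(X_te)))
```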

Posted 1 week ago

Apply

0 years

3 - 5 Lacs

Pune

On-site

Source: Glassdoor

About Us We empower enterprises globally through intelligent, creative, and insightful services for data integration, data analytics and data visualization. Hoonartek is a leader in enterprise transformation, data engineering and an acknowledged world-class Ab Initio delivery partner. Using centuries of cumulative experience, research and leadership, we help our clients eliminate the complexities & risk of legacy modernization and safely deliver big data hubs, operational data integration, business intelligence, risk & compliance solutions and traditional data warehouses & marts. At Hoonartek, we work to ensure that our customers, partners and employees all benefit from our unstinting commitment to delivery, quality and value. Hoonartek is increasingly the choice for customers seeking a trusted partner of vision, value and integrity. How We Work? Define, Design and Deliver (D3) is our in-house delivery philosophy. It’s culled from agile and rapid methodologies and focused on ‘just enough design’. We embrace this philosophy in everything we do, leading to numerous client success stories and indeed to our own success. We embrace change, empowering and trusting our people and building long and valuable relationships with our employees, our customers and our partners. We work flexibly, even adopting traditional/waterfall methods where circumstances demand it. At Hoonartek, the focus is always on delivery and value. Job Description We are seeking a proactive and technically strong Site Reliability Engineer (SRE) to ensure the stability, performance, and scalability of our Data Engineering Platform. You will work on cutting-edge technologies including Cloudera Hadoop, Spark, Airflow, NiFi, and Kubernetes, ensuring high availability and driving automation to support massive-scale data workloads, especially in the telecom domain. Key Responsibilities • Ensure platform uptime and application health as per SLOs/KPIs • Monitor infrastructure and applications using ELK, Prometheus, Zabbix, etc. • Debug and resolve complex production issues, performing root cause analysis • Automate routine tasks and implement self-healing systems • Design and maintain dashboards, alerts, and operational playbooks • Participate in incident management, problem resolution, and RCA documentation • Own and update SOPs for repeatable processes • Collaborate with L3 and Product teams for deeper issue resolution • Support and guide the L1 operations team • Conduct periodic system maintenance and performance tuning • Respond to user data requests and ensure timely resolution • Address and mitigate security vulnerabilities and compliance issues Technical Skillset • Hands-on with Spark, Hive, Cloudera Hadoop, Kafka, Ranger • Strong Linux fundamentals and scripting (Python, Shell) • Experience with Apache NiFi, Airflow, Yarn, and Zookeeper • Proficient in monitoring and observability tools: ELK Stack, Prometheus, Loki • Working knowledge of Kubernetes, Docker, Jenkins CI/CD pipelines • Strong SQL skills (Oracle/Exadata preferred) • Familiarity with DataHub, DataMesh, and security best practices is a plus SHIFT - 24/7
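As a sketch of the monitoring-and-alerting loop described above, the snippet below polls the Prometheus HTTP query API (the real /api/v1/query endpoint) for a service's up metric and flags instances that are down. The server URL, job label, and alert action are assumptions.

```python
# Sketch: poll Prometheus for instances whose 'up' metric has dropped.
import requests

PROM = "http://prometheus.example.com:9090"  # placeholder

resp = requests.get(
    f"{PROM}/api/v1/query",
    params={"query": "up{job='nifi'}"},
    timeout=10,
)
resp.raise_for_status()
for series in resp.json()["data"]["result"]:
    instance = series["metric"].get("instance", "unknown")
    if float(series["value"][1]) < 1:
        print(f"ALERT: {instance} is down; trigger runbook / self-heal action")
```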

Posted 1 week ago

Apply

8.0 years

0 Lacs

Bengaluru

On-site

Source: Glassdoor

Imagine what you could do here. At Apple, we believe new insights have a way of becoming excellent products, services, and customer experiences very quickly. Bring passion and dedication to your job and there's no telling what you could accomplish. The people here at Apple don’t just build products - they build the kind of wonder that’s revolutionized entire industries. It’s the diversity of those people and their ideas that inspires the innovation that runs through everything we do, from amazing technology to industry-leading environmental efforts. Join Apple, and help us leave the world better than we found it. Apple's Manufacturing Systems and Infrastructure (MSI) team is responsible for capturing, consolidating and tracking all manufacturing data for Apple’s products and modules worldwide. Our tools enable teams to confidently use data to shape the next generation of product manufacturing at Apple. We seek a practitioner with experience building large-scale data platforms, analytic tools, and solutions. If you are passionate about making data easily accessible, trusted, and available across the entire business at scale, we'd love to hear from you. As a Software Engineering Manager, you are an integral part of a data-centric team driving the development, implementation, and improvement of large-scale data infrastructure and processes. Our organization thrives on collaborative partnerships. Join and play a key role in developing and driving the adoption of Agentic AI, LLMs, Data Mesh and data-centric micro-services. Description As an Engineering Manager, you will lead a team of engineers responsible for the development and implementation of our cloud-based data infrastructure. You will work closely with cross-functional teams to understand data requirements, design scalable solutions, and ensure the integrity and availability of our data. The ideal candidate will have a deep understanding of cloud technologies, data engineering best practices, and a proven track record of successfully delivering complex data projects. Key Responsibilities include: - Hire, develop, and retain top engineering talent - Build and nurture self-sustained, high-performing teams - Provide mentorship and technical guidance to engineers, fostering continuous learning and development - Lead the design, development, and deployment of scalable cloud-based data infrastructure and applications - Drive end-to-end execution of complex data engineering projects - Partner with Data Scientists, ML Engineers, and business stakeholders to understand data needs and translate them into scalable engineering solutions - Align technical strategy with business goals through effective communication and collaboration - Implement and enforce best practices for data security, privacy, and compliance with regulatory standards - Optimize data storage, processing, and retrieval for improved performance and cost efficiency - Continuously evaluate and improve the system architecture and workflows - Stay current with emerging trends and technologies in cloud data engineering - Recommend and adopt tools, frameworks, and platforms that enhance productivity and reliability Minimum Qualifications Bachelor’s degree in Computer Science or a related field. Minimum 8 years of experience in software development with at least 2 years in a technical leadership or management role. Proven experience as a full-stack developer, with a focus on cloud platforms. Proficient in programming languages such as Python.
Strong hands-on expertise with Python frameworks (Django, Flask, or FastAPI; RESTful APIs), React.js, and modern JavaScript. Experience with authentication and authorization (OAuth, JWT). Strong understanding of cloud services, preferably AWS, and experience building cloud-native platforms using containerization technologies like Kubernetes, Docker, and Helm. Preferred Qualifications Knowledge of data warehouse solutions (BigQuery, Snowflake, Druid) and Big Data technologies such as Spark, Kafka, Hive, Iceberg, Trino, Flink. Experience with big data technologies (Hadoop, Spark, etc.). Experience with streaming data technologies (Kafka, Kinesis). Experience building data streaming solutions using Apache Spark / Apache Storm / Flink / Flume. Familiarity with machine learning pipelines is an added advantage. Proven ability to deliver complex, high-scale systems in a production environment. Strong people management and cross-functional collaboration skills.
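To make the named stack concrete, here is a minimal hedged FastAPI sketch with JWT verification via PyJWT. The secret, route, and claims are placeholders for illustration, not any specific production service.

```python
# Hedged sketch: a tiny FastAPI service that checks a bearer JWT per request.
import jwt  # PyJWT
from fastapi import FastAPI, Header, HTTPException

SECRET = "change-me"  # placeholder signing key
app = FastAPI()

def verify(authorization: str) -> dict:
    token = authorization.removeprefix("Bearer ").strip()
    try:
        return jwt.decode(token, SECRET, algorithms=["HS256"])
    except jwt.PyJWTError:
        raise HTTPException(status_code=401, detail="invalid token")

@app.get("/datasets/{name}")
def get_dataset(name: str, authorization: str = Header(...)):
    claims = verify(authorization)
    return {"dataset": name, "requested_by": claims.get("sub")}
```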

Posted 1 week ago

Apply

0 years

3 - 10 Lacs

Bengaluru

On-site

Source: Glassdoor

Employment Type Permanent Closing Date 13 June 2025 11:59pm Job Title IT Domain Specialist Job Summary As the IT Domain Specialist, your role is key in improving the stability and reliability of our cloud offerings and solutions to ensure continuity of service for our customers. You will be responsible for supporting the end-to-end development of key cloud platforms and solutions, which includes technical design, integration requirements, delivery and lifecycle management. You are a specialist across and/or within a technology domain and viewed as the go-to person in the business to provide technical support in the development and delivery of cloud infrastructure platforms and solutions. Job Description Who We Are Telstra is Australia’s leading telecommunications and technology company, spanning over a century with a footprint in 20+ countries. In India, we’re building a platform for innovative delivery and engagement that will strengthen our position as an industry leader. We’ve grown quickly since our inception in 2019, now with offices in Pune, Hyderabad and Bangalore. Focus of the Role The Event Data Engineer role is to plan, coordinate, and execute all activities related to requirements interpretation, design and implementation of business intelligence capability. This individual will apply proven industry and technology experience as well as communication skills, problem-solving skills, and knowledge of best practices to issues related to design, development, and deployment of mission-critical business systems, with a focus on quality application development and delivery. What We Offer Performance-related pay Access to thousands of learning programs so you can level-up Global presence across 22 countries; opportunities to work where we do business Up to 26 weeks maternity leave provided to the birth mother with benefits for all child births Employees are entitled to 12 paid holidays per calendar year Eligible employees are entitled to 12 days of paid sick / casual leave per calendar year Relocation support options across India, from junior to senior positions within the company Insurance benefits such as medical, accidental and life insurance What You’ll Do Experience in analysis, design, and development in the fields of business intelligence, databases and web-based applications. Experience in NiFi, Kafka, Spark, and Cloudera platform design and development. Experience in Alteryx workflow development and data visualization development using Tableau to create complex, intuitive dashboards. In-depth understanding of and experience in the Cloudera framework, including CDP (Cloudera Data Platform). Experience with Cloudera Manager to monitor the Hadoop cluster and critical services. Hadoop administration (Hive, Kafka, ZooKeeper, etc.). Experience in data management, including data integration, modeling, optimization and data quality. Strong knowledge of SQL and database management. Working experience with tools like Alteryx and KNIME is an added advantage. Implementing data security and access control compliant with Telstra security standards. Ability to review vendor designs and recommend solutions based on industry best practices. Understand overall business operations and develop innovative solutions to help improve productivity. Ability to understand and design provisioning solutions at Telstra and how data lakes fit in. Monitor the software configuration/development/testing process to assure quality deliverables.
Ensure standards of QA are being met. Review deliverables to verify that they meet client and contract expectations; implement and enforce high standards for quality deliverables. Analyse performance and capacity issues of the highest complexity with data applications. Assist leadership with development and management of new application capabilities to improve productivity. Provide training and educate other team members on core capabilities, helping them deliver high-quality solutions and deliverables/documentation. Self-motivated: design and develop to user requirements, then test and deploy changes into production. About You Experience in data flow development and data visualization development to create complex, intuitive dashboards. Experience with Hortonworks Data Flow (HDF), including NiFi and Kafka, and experience with Cloudera Edge. Big data and data lake experience. Cloudera Hadoop project implementation experience. Data analytics experience. Data analyst and data science exposure. Exposure to various data management architectures like data warehouse, data lake and data hub, and supporting processes like data integration and data modeling. Working experience with large, heterogeneous datasets in building and optimizing data pipelines, pipeline architectures and integrated datasets using data integration technologies. Experience in supporting operations and knowledge of standard operating procedures: OS patches, security scans, log onboarding, agent onboarding, log extraction, etc. Development, deployment and scaling of containerised applications with Docker preferred. A good understanding of enterprise application integration, including SOA, ESB, EAI and ETL environments, and an understanding of integration considerations such as process orchestration, customer data integration and master data management. A good understanding of the security processes, standards and issues involved in multi-tier, multi-tenant web applications. We're amongst the top 2% of companies globally in the CDP Global Climate Change Index 2023, being awarded an 'A' rating. If you want to work for a company that cares about sustainability, we want to hear from you. As part of your application with Telstra, you may receive communications from us on +61 440 135 548 (for job applications in Australia) and +1 (623) 400-7726 (for job applications in the Philippines and India). When you join our team, you become part of a welcoming and inclusive community where everyone is respected, valued and celebrated. We actively seek individuals from various backgrounds, ethnicities, genders and disabilities because we know that diversity not only strengthens our team but also enriches our work. We have zero tolerance for harassment of any kind, and we prioritise creating a workplace culture where everyone is safe and can thrive. As part of the hiring process, all identified candidates will undergo a background check, and the results will play a role in the final decision regarding your application. We work flexibly at Telstra. Talk to us about what flexibility means to you. When you apply, you can share your pronouns and / or any reasonable adjustments needed to take part equitably during the recruitment process. We are aware of current limitations with our website accessibility and are working towards improving this. Should you experience any issues accessing information or the application form, and require this in an alternate format, please contact our Talent Acquisition team on DisabilityandAccessibility@team.telstra.com.
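For a concrete feel of the NiFi/Kafka event pipelines mentioned, below is a small hedged consumer sketch using the kafka-python client; the broker, topic, and group names are illustrative.

```python
# Sketch: consume JSON events from a Kafka topic with kafka-python.
import json

from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "events",
    bootstrap_servers=["broker:9092"],
    group_id="bi-ingest",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    auto_offset_reset="earliest",
)

for msg in consumer:
    print(msg.topic, msg.partition, msg.offset, msg.value)
```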

Posted 1 week ago

Apply

2.0 years

6 - 7 Lacs

Bengaluru

On-site

Source: Glassdoor

Company Description Visa is a world leader in payments and technology, with over 259 billion payments transactions flowing safely between consumers, merchants, financial institutions, and government entities in more than 200 countries and territories each year. Our mission is to connect the world through the most innovative, convenient, reliable, and secure payments network, enabling individuals, businesses, and economies to thrive while driven by a common purpose – to uplift everyone, everywhere by being the best way to pay and be paid. Make an impact with a purpose-driven industry leader. Join us today and experience Life at Visa. Job Description Functional Summary The GTM Optimization and Business Health team has a simple mission: we turn massive amounts of data into robust tools and actionable insights that drive business value, ensure ecosystem integrity, and provide best in class experience to our money movement clients. Our team is working to build consolidated, strategic and scalable analytics and monitoring infrastructure for commercial and money movement products. Responsibilities The Process Optimization Analyst will create risk, rules, and performance monitoring dashboards and alerting tools and will use these to monitor transactions in near real time, investigate alerts and anomalous events, and partner with internal teams to investigate and manage incidents from end-to-end. Specific activities may include: Develop monitoring and alerting tools from real-time data feeds to monitor for performance drops, risk and fraud events, and rules violations Monitor near real time alerting tools and investigate and generate incidents for risk events and out of pattern activity Manage a caseload to ensure appropriate investigation and resolution of identified risk and performance events Drive to understand the root problems, define analytical objectives and formalize data requirements for various types of dashboards and analyses Design and launch robust and intuitive dashboards supporting best in class money movement client experience Create and present analytic deliverables to colleagues in the analytics team, other internal stakeholders with varying degrees of analytical and technical expertise Distill massive amounts of data across disparate data sources into efficient functional data repositories in a Big Data environment Independently perform analysis to derive insights and render robust, thoughtful results Partner with Visa Direct and money movement teams across multiple areas of the business to understand their data and reporting needs Compare client performance against industry best practices with a shrewd eye toward identifying performance and/or profitability improvement opportunity Develop presentations of complex data and content for clients in an accurate, understandable, and engaging manner This is a hybrid position. Expectation of days in office will be confirmed by your Hiring Manager. Qualifications Basic Qualifications: 3 or more years of relevant work experience with a Bachelor’s Degree or at least 2 years of work experience with an Advanced degree (e.g. Masters, MBA, JD, MD) or 0 years of work experience with a PhD Preferred Qualifications: 3 or more years of work experience with a Bachelor’s Degree or 2 or more years of relevant experience with an Advanced Degree (e.g. 
Masters, MBA, JD, MD) or up to 1 year of relevant experience with a PhD Experience monitoring real-time data and following incident management workflows Familiarity with Microsoft Dynamics or other ERP/CRM tools Proficiency in Tableau and experience with best-in-class data visualization Experience with Elasticsearch and Kibana dashboards and alerting High level of proficiency manipulating data from a variety of sources - Big data skills (Hadoop, Hive, Spark) and/or SQL skills required Strong verbal, written, and interpersonal skills Proficient in all MS Office applications with advanced Excel spreadsheet skills Functional knowledge of programming languages such as Python, Java, and/or Shell Scripting Strong strategic thinking, problem-solving, and decision-making abilities, with the ability to translate complex data into actionable insights Visa experience or knowledge of the payments industry Additional Information Visa is an EEO Employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability or protected veteran status. Visa will also consider for employment qualified applicants with criminal histories in a manner consistent with EEOC guidelines and applicable local law.
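As a sketch of the Elasticsearch/Kibana-style monitoring described above, the query below (elasticsearch-py 8.x client) pulls recent high-risk events. The endpoint, index pattern, and fields such as risk_score are assumptions.

```python
# Hedged sketch: fetch the last five minutes of high-risk events.
from elasticsearch import Elasticsearch

es = Elasticsearch("http://es.example.com:9200")  # placeholder endpoint

resp = es.search(
    index="transactions-*",
    query={"bool": {"filter": [
        {"range": {"@timestamp": {"gte": "now-5m"}}},
        {"range": {"risk_score": {"gte": 90}}},
    ]}},
    size=100,
)
for hit in resp["hits"]["hits"]:
    print(hit["_source"])
```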

Posted 1 week ago

Apply

4.0 years

10 - 17 Lacs

India

On-site

Source: Glassdoor

We are looking for an experienced Big Data Developer (immediate joiners only) with a strong background in PySpark, Python/Scala, Spark, SQL, and the Hadoop ecosystem. The ideal candidate should have over 4 years of experience and be ready to join immediately. This role requires hands-on expertise in big data technologies and the ability to design and implement robust data processing solutions. Key Responsibilities: Design, develop, and optimize large-scale data processing pipelines using PySpark. Work with various Apache tools and frameworks (like Hadoop, Hive, HDFS, etc.) to ingest, transform, and manage large datasets. Ensure high performance and reliability of ETL jobs in production. Collaborate with Data Scientists, Analysts, and other stakeholders to understand data needs and deliver robust data solutions. Implement data quality checks and data lineage tracking for transparency and auditability. Work on data ingestion, transformation, and integration from multiple structured and unstructured sources. Leverage Apache NiFi for automated and repeatable data flow management (if applicable). Write clean, efficient, and maintainable code in Python and Java. Contribute to architectural decisions, performance tuning, and scalability planning. Required Skills: 5–7 years of experience. Strong hands-on experience with PySpark for distributed data processing. Deep understanding of the Apache ecosystem (Hadoop, Hive, Spark, HDFS, etc.). Solid grasp of data warehousing, ETL principles, and data modeling. Experience working with large-scale datasets and performance optimization. Familiarity with SQL and NoSQL databases. Proficiency in Python and basic to intermediate knowledge of Java. Experience using version control tools like Git and CI/CD pipelines. Nice-to-Have Skills: Working experience with Apache NiFi for data flow orchestration. Experience building real-time streaming data pipelines. Knowledge of cloud platforms like AWS, Azure, or GCP. Familiarity with containerization tools like Docker or orchestration tools like Kubernetes. Soft Skills: Strong analytical and problem-solving skills. Excellent communication and collaboration abilities. Self-driven with the ability to work independently and as part of a team. Education: Bachelor’s or Master’s degree in Computer Science, Information Systems, or a related field. Job Type: Full-time Pay: ₹1,000,000.00 - ₹1,700,000.00 per year Benefits: Health insurance Schedule: Day shift Supplemental Pay: Performance bonus, yearly bonus Ability to commute/relocate: Basavanagudi, Bengaluru, Karnataka: Reliably commute or planning to relocate before starting work (Preferred) Application Question(s): Are you ready to join within 15 days? What is your current CTC? Experience: Python: 4 years (Preferred) PySpark: 4 years (Required) Data warehouse: 4 years (Required) Work Location: In person Application Deadline: 12/06/2025

Posted 1 week ago

Apply

5.0 years

2 - 3 Lacs

Bengaluru

On-site

Source: Glassdoor

Company Description Visa is a world leader in payments and technology, with over 259 billion payments transactions flowing safely between consumers, merchants, financial institutions, and government entities in more than 200 countries and territories each year. Our mission is to connect the world through the most innovative, convenient, reliable, and secure payments network, enabling individuals, businesses, and economies to thrive while driven by a common purpose – to uplift everyone, everywhere by being the best way to pay and be paid. Make an impact with a purpose-driven industry leader. Join us today and experience Life at Visa. Job Description To ensure that Visa’s payment technology is truly available to everyone, everywhere requires the success of our key bank or merchant partners and internal business units. The Global Data Science group supports these partners by using our extraordinarily rich data set that spans more than 3 billion cards globally and captures more than 100 billion transactions in a single year. Our focus lies on building creative solutions that have an immediate impact on the business of our highly analytical partners. We work in complementary teams comprising members from Data Science and various groups at Visa. To support our rapidly growing group we are looking for Data Scientists who are equally passionate about the opportunity to use Visa’s rich data to tackle meaningful business problems. You will join one of the Data Science focus areas (e.g., banks, merchants & retailers, digital products, marketing) with an opportunity for rotation within Data Science to gain broad exposure to Visa’s business. The role will be based in Bengaluru, India Essential Functions Be an out-of-the-box thinker who is passionate about brainstorming innovative ways to use our unique data to answer business problems Communicate with clients to understand the challenges they face and convince them with data Extract and understand data to form an opinion on how to best help our clients and derive relevant insights Develop visualizations to make your complex analyses accessible to a broad audience Find opportunities to craft products out of analyses that are suitable for multiple clients Work with stakeholders throughout the organization to identify opportunities for leveraging Visa data to drive business solutions. Mine and analyze data from company databases to drive optimization and improvement of product, marketing techniques and business strategies for Visa and its clients Assess the effectiveness and accuracy of new data sources and data gathering techniques. Develop custom data models and algorithms to apply to data sets. Use predictive modeling to increase and optimize customer experiences, revenue generation, data insights, advertising targeting and other business outcomes. Develop processes and tools to monitor and analyze model performance and data accuracy. This is a hybrid position. Expectation of days in office will be confirmed by your Hiring Manager. 
Qualifications
Basic Qualifications Bachelor’s or Master’s degree in Statistics, Operations Research, Applied Mathematics, Economics, Data Science, Business Analytics, Computer Science, or a related technical field 5+ years of work experience with a bachelor’s degree or 2+ years’ experience with an advanced degree (e.g., Master’s or MBA) Analyzing large data sets using programming languages such as Python, R, SQL and/or Spark Developing and refining machine learning models for predictive analytics, classification and regression tasks.
Preferred Qualifications 5+ years’ experience in data-based decision-making or quantitative analysis Knowledge of ETL pipelines in Spark, Python, HIVE that process transaction- and account-level data and standardize data fields across various data sources Generating and visualizing data-based insights in software such as Tableau Competence in Excel, PowerPoint Previous exposure to financial services, credit cards or merchant analytics is a plus
Additional Information Visa is an EEO Employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability or protected veteran status. Visa will also consider for employment qualified applicants with criminal histories in a manner consistent with EEOC guidelines and applicable local law.

Posted 1 week ago

Apply

5.0 - 7.0 years

1 - 2 Lacs

Chennai

On-site

GlassDoor logo

Country/Region: IN Requisition ID: 26152 Work Model: Position Type: Salary Range: Location: INDIA - CHENNAI - RNTBCI
Title: Lead Data Engineer - AWS
Description: Area(s) of responsibility
Empowered By Innovation
Birlasoft, a global leader at the forefront of Cloud, AI, and Digital technologies, seamlessly blends domain expertise with enterprise solutions. The company’s consultative and design-thinking approach empowers societies worldwide, enhancing the efficiency and productivity of businesses. As part of the multibillion-dollar diversified CKA Birla Group, Birlasoft, with its 12,000+ professionals, is committed to continuing the Group’s 170-year heritage of building sustainable communities.
Role: Lead Data Engineer - AWS
Location: Bangalore/Chennai
Experience: 5 - 7 Years
Job Profile: Provide estimates for requirements, and analyze and develop as per the requirement. Develop and maintain data pipelines and ETL (Extract, Transform, Load) processes to extract data efficiently and reliably from various sources, transform it into a usable format, and load it into the appropriate data repositories. Create and maintain logical and physical data models that align with the organization's data architecture and business needs, including defining data schemas, tables, relationships, and indexing strategies for optimal data retrieval and analysis. Collaborate with cross-functional teams and stakeholders to ensure data security, privacy, and compliance with regulations. Collaborate with downstream applications to understand their needs, and build and optimize data storage accordingly. Work closely with other stakeholders and the business to understand data requirements and translate them into technical solutions. Be familiar with Agile methodologies, with prior experience working with Agile teams using Scrum/Kanban. Lead technical discussions with customers to find the best possible solutions. Proactively identify and implement opportunities to automate tasks and develop reusable frameworks. Optimize data pipelines to improve performance and cost, while ensuring a high quality of data within the data lake. Monitor services and jobs for cost and performance, ensuring continual operation of data pipelines and fixing of defects.
Must Have: Hands-on expertise of 4-5 years in AWS services like S3, Lambda, Glue, Athena, RDS, Step Functions, SNS, SQS, API Gateway, security, access and role permissions, and logging and monitoring services. Good hands-on knowledge of Python, Spark, Hive, Unix, and the AWS CLI. Prior experience working with streaming solutions like Kafka. Prior experience implementing table storage formats like Delta Lake/Iceberg. Excellent knowledge of data modeling and designing ETL pipelines. Strong knowledge of databases such as MySQL and Oracle, and of writing complex queries. Strong experience working in a continuous integration and deployment process. PySpark, AWS, SQL, Kafka.
Nice to Have: Hands-on experience with Terraform, Git, Git Actions, CI/CD pipelines, Amazon Q, and AI.
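
As a hedged illustration of the streaming stack this posting names (Kafka in, S3 out), here is a minimal PySpark Structured Streaming sketch. The broker, topic, bucket, and trigger interval are assumptions, and the spark-sql-kafka connector package must be on the classpath.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("kafka_to_s3").getOrCreate()

# Read a Kafka topic as a stream; broker address and topic name are assumed
events = (spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "broker:9092")
          .option("subscribe", "orders")
          .load()
          .select(F.col("value").cast("string").alias("payload"),
                  F.col("timestamp")))

# Land micro-batches as Parquet on S3; bucket and checkpoint path are assumed
query = (events.writeStream
         .format("parquet")
         .option("path", "s3a://my-bucket/raw/orders/")
         .option("checkpointLocation", "s3a://my-bucket/chk/orders/")
         .trigger(processingTime="1 minute")
         .start())

query.awaitTermination()
```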

Posted 1 week ago

Apply

5.0 years

5 - 9 Lacs

Chennai

On-site

GlassDoor logo

Job ID: 18927 Location: Chennai, IN Area of interest: Technology Job type: Regular Employee Work style: Office Working Opening date: 15 May 2025
Job Summary
Responsible for building and maintaining high-performance data systems that enable deeper insights for all parts of our organization. Responsible for developing ETL/ELT pipelines for both batch and streaming data. Responsible for data flow for real-time and analytics use. Improving data pipeline performance by implementing the industry’s best practices and different techniques for data-parallel processing. Responsible for the documentation, design, development and testing of Hadoop reporting and analytical applications. Responsible for technical discussion and finalization of requirements by communicating effectively with stakeholders. Responsible for converting functional requirements into detailed technical designs. Responsible for adhering to SCRUM timelines and delivering accordingly. Responsible for preparing the Unit/SIT/UAT test cases and logging the results. Responsible for planning and tracking the implementation to closure. Ability to drive enterprise-wide initiatives for usage of external data. Envision an enterprise-wide Entitlements platform and align it with the Bank’s NextGen technology vision. Continually look for process improvements. Coordinate between various technical teams across systems for smooth project execution, starting from technical requirements discussion, overall architecture design, technical solution discussions, build, unit testing, regression testing, system integration testing, user acceptance testing, go-live, user verification testing and rollback [if required]. Prepare a technical plan with clear milestone dates for technical tasks, which will be input to the PM’s overall project plan. Coordinate on a need basis with technical teams across technology who are not directly involved in the project (for example: firewall/network teams, DataPower teams, EDMP, OAM, OIM, ITSC, GIS teams, etc.). Responsible for supporting the change management process. Responsible for working alongside PSS teams and ensuring proper KT sessions are provided to the support teams. Identify any risks within the project and get them recorded in Riskwise after discussion with the business and manager. Ensure the project delivery is seamless with zero to negligible defects.
Key Responsibilities
Hands-on experience with C++, .NET, SQL, jQuery, Web APIs & services, PostgreSQL & MS SQL Server, Azure DevOps & related tooling, GitHub, and ADO CI/CD pipelines. Should be transversal enough to handle Linux, PowerShell, Unix shell scripting, Kafka, Spark streaming, and Hadoop (Hive, Spark, Python, PySpark). Hands-on experience with workflow schedulers like NiFi/Control-M. Experience with data loading tools like Sqoop. Experience and understanding of object-oriented programming. Motivation to learn the innovative craft of programming, debugging, and deploying. Self-starter with excellent self-study skills and growth aspirations, capable of working without direction and able to deliver technical projects from scratch. Excellent written and verbal communication skills.
Flexible attitude, and the ability to perform under pressure. Ability to lead and influence the direction and strategy of the technology organization. Test-driven development, commitment to quality and a thorough approach to work. A good team player with the ability to meet tight deadlines in a fast-paced environment. Guide junior developers and share best practices. A cloud certification (any one of Azure/AWS/GCP) will be an added advantage. Must have knowledge and understanding of Agile principles. Must have a good understanding of the project life cycle. Must have sound problem analysis and resolution abilities. Good understanding of external & internal data management and the implications of cloud usage in the context of external data.
Strategy
Develop the strategic direction and roadmap for CRES TTO, aligning with the Business Strategy, ITO Strategy and investment priorities.
Business
Work hand in hand with Product Owners, Business Stakeholders, Squad Leads and CRES TTO partners, taking product programs from investment decisions into design, specifications, solutioning, development, implementation and hand-over to operations, securing support and collaboration from other SCB teams. Ensure delivery to business meets time, cost and high-quality constraints. Support respective businesses in growing return on investment, commercialisation of capabilities, bid teams, monitoring of usage, improving client experience, enhancing operations and addressing defects & continuous improvement of systems. Drive an ecosystem of innovation, enabling business through technology.
Governance
Promote an environment of compliance with internal control functions and the external regulatory framework.
People & Talent
Ability to work with other developers and assist junior team members. Identify training needs and take action to ensure company-wide compliance. Pursue continuing education on new solutions, technology, and skills. Problem solving with other team members in the project.
Risk Management
Interpreting briefs to create high-quality coding that functions according to specifications.
Key stakeholders
CRES Domain Clients Functions MT members, Operations and COO ITO engineering, build and run teams Architecture and Technology Support teams Supply Chain Management, Risk, Legal, Compliance and Audit teams External vendors
Regulatory & Business Conduct
Display exemplary conduct and live by the Group’s Values and Code of Conduct. Take personal responsibility for embedding the highest standards of ethics, including regulatory and business conduct, across Standard Chartered Bank. This includes understanding and ensuring compliance with, in letter and spirit, all applicable laws, regulations, guidelines and the Group Code of Conduct. Lead the team to achieve the outcomes set out in the Bank’s Conduct Principles: Fair Outcomes for Clients; Effective Financial Markets; Financial Crime Compliance; The Right Environment. Effectively and collaboratively identify, escalate, mitigate and resolve risk, conduct and compliance matters.
Serve as a Director of the Board. Exercise authorities delegated by the Board of Directors and act in accordance with the Articles of Association (or equivalent).
Other Responsibilities
Embed Here for good and the Group’s brand and values in the team. Perform other responsibilities assigned under Group, Country, Business or Functional policies and procedures. Multiple functions (double hats).
Skills and Experience
Technical Project Delivery (Agile & Classic) Vendor Management Stakeholder Management
Qualifications
5+ years in a lead development role. Should have managed a team of at least 5 members. Should have delivered multiple projects end to end. Experience in property technology products (e.g., Lenel, CBRE, Milestone, etc.). Strong analytical, numerical and problem-solving skills. Should be able to understand and communicate technical details of the project. Good communication skills, oral and written. Very good exposure to technical projects (e.g., server maintenance, system administration, or development/implementation experience). Effective interpersonal and relational skills, to be able to coach and develop the team to deliver their best. Certified Scrum Master.
About Standard Chartered
We're an international bank, nimble enough to act, big enough for impact. For more than 170 years, we've worked to make a positive difference for our clients, communities, and each other. We question the status quo, love a challenge and enjoy finding new opportunities to grow and do better than before. If you're looking for a career with purpose and you want to work for a bank making a difference, we want to hear from you. You can count on us to celebrate your unique talents and we can't wait to see the talents you can bring us. Our purpose, to drive commerce and prosperity through our unique diversity, together with our brand promise, to be here for good, are achieved by how we each live our valued behaviours. When you work with us, you'll see how we value difference and advocate inclusion.
Together we:
Do the right thing and are assertive, challenge one another, and live with integrity, while putting the client at the heart of what we do. Never settle, continuously striving to improve and innovate, keeping things simple and learning from doing well, and not so well. Are better together, we can be ourselves, be inclusive, see more good in others, and work collectively to build for the long term.
What we offer
In line with our Fair Pay Charter, we offer a competitive salary and benefits to support your mental, physical, financial and social wellbeing. Core bank funding for retirement savings, medical and life insurance, with flexible and voluntary benefits available in some locations. Time-off including annual leave, parental/maternity (20 weeks), sabbatical (12 months maximum) and volunteering leave (3 days), along with minimum global standards for annual and public holidays, which is combined to 30 days minimum. Flexible working options based around home and office locations, with flexible working patterns. Proactive wellbeing support through Unmind, a market-leading digital wellbeing platform, development courses for resilience and other human skills, global Employee Assistance Programme, sick leave, mental health first-aiders and all sorts of self-help toolkits. A continuous learning culture to support your growth, with opportunities to reskill and upskill and access to physical, virtual and digital learning.
Being part of an inclusive and values driven organisation, one that embraces and celebrates our unique diversity, across our teams, business functions and geographies - everyone feels respected and can realise their full potential. www.sc.com/careers

Posted 1 week ago

Apply

3.0 years

0 Lacs

Andhra Pradesh

On-site

GlassDoor logo

We are looking for a PySpark solutions developer and data engineer who can design and build solutions for one of our Fortune 500 client programs, aimed at building out data standardization and curation needs on a Hadoop cluster. This high-visibility, fast-paced key initiative will integrate data across internal and external sources, provide analytical insights, and integrate with the customer's critical systems.
Key Responsibilities
Ability to design, build and unit test applications on the Spark framework in Python. Build PySpark-based applications for both batch and streaming requirements, which will require in-depth knowledge of the majority of Hadoop and NoSQL databases as well. Develop and execute data pipeline testing processes and validate business rules and policies. Optimize performance of the built Spark applications in Hadoop using configurations around Spark Context, Spark-SQL, DataFrames, and Pair RDDs. Optimize performance for data access requirements by choosing the appropriate native Hadoop file formats (Avro, Parquet, ORC, etc.) and compression codecs respectively. Build integrated solutions leveraging Unix shell scripting, RDBMS, Hive, the HDFS file system, HDFS file types, and HDFS compression codecs. Build data tokenization libraries and integrate with Hive & Spark for column-level obfuscation. Experience in processing large amounts of structured and unstructured data, including integrating data from multiple sources. Create and maintain integration and regression testing frameworks on Jenkins integrated with Bitbucket and/or Git repositories. Participate in the agile development process, and document and communicate issues and bugs relative to data standards in scrum meetings. Work collaboratively with onsite and offshore teams. Develop & review technical documentation for artifacts delivered. Ability to solve complex data-driven scenarios and triage defects and production issues. Ability to learn-unlearn-relearn concepts with an open and analytical mindset. Participate in code releases and production deployments. Challenge and inspire team members to achieve business results in a fast-paced and quickly changing environment.
Preferred Qualifications
BE/B.Tech/B.Sc. in Computer Science/Statistics from an accredited college or university. Minimum 3 years of extensive experience in the design, build and deployment of PySpark-based applications. Expertise in handling complex, large-scale Big Data environments (preferably 20TB+). Minimum 3 years of experience in the following: HIVE, YARN, HDFS. Hands-on experience writing complex SQL queries and exporting and importing large amounts of data using utilities. Ability to build abstracted, modularized reusable code components. Prior experience with ETL tools, preferably Informatica PowerCenter, is advantageous. Able to quickly adapt and learn. Able to jump into an ambiguous situation and take the lead on resolution. Able to communicate and coordinate across various teams. Comfortable tackling new challenges and new ways of working. Ready to move from traditional methods and adapt to agile ones. Comfortable challenging your peers and leadership team. Can prove yourself quickly and decisively. Excellent communication skills and good customer centricity. Strong target and high solution orientation.
About Virtusa
Teamwork, quality of life, professional and personal development: values that Virtusa is proud to embody.
When you join us, you join a team of 27,000 people globally that cares about your growth — one that seeks to provide you with exciting projects, opportunities and work with state of the art technologies throughout your career with us. Great minds, great potential: it all comes together at Virtusa. We value collaboration and the team environment of our company, and seek to provide great minds with a dynamic place to nurture new ideas and foster excellence. Virtusa was founded on principles of equal opportunity for all, and so does not discriminate on the basis of race, religion, color, sex, gender identity, sexual orientation, age, non-disqualifying physical or mental disability, national origin, veteran status or any other basis covered by appropriate law. All employment is decided on the basis of qualifications, merit, and business need.
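
To make the file-format and compression-codec point from the posting above concrete, here is a small sketch; the input path and output layout are assumptions, and the right format depends on the read pattern (Parquet/ORC for columnar analytics scans, Avro for row-wise I/O).

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("format_demo").getOrCreate()

# Read an assumed staging dataset
df = spark.read.parquet("hdfs:///staging/transactions/")

# Write the same data in two columnar formats with explicit compression codecs
df.write.mode("overwrite").option("compression", "snappy").parquet("hdfs:///out/parquet/")
df.write.mode("overwrite").option("compression", "zlib").orc("hdfs:///out/orc/")
```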

Posted 1 week ago

Apply

5.0 years

0 Lacs

Bangalore Urban, Karnataka, India

On-site

Linkedin logo

Job Title: Data Analyst / Technical Business Analyst
Job Summary
We are looking for a skilled Data Analyst to support a large-scale data migration initiative within the banking and insurance domain. The role involves analyzing, validating, and transforming data from legacy systems to modern platforms, ensuring regulatory compliance, data integrity, and business continuity.
Key Responsibilities
Collaborate with business stakeholders, data architects, and IT teams to gather and understand data migration requirements. Analyze legacy banking and insurance systems (e.g., core banking, policy admin, claims, CRM) to identify data structures and dependencies. Work with large-scale datasets and understand big data architectures (e.g., Hadoop, Spark, Hive) to support scalable data migration and transformation. Perform data profiling, cleansing, and transformation using SQL and ETL tools, with the ability to understand and write complex SQL queries and interpret the logic implemented in ETL workflows. Develop and maintain data mapping documents and transformation logic specific to financial and insurance data (e.g., customer KYC, transactions, policies, claims). Validate migrated data against business rules, regulatory standards, and reconciliation reports. Support UAT by preparing test cases and validating migrated data with business users. Ensure data privacy and security compliance throughout the migration process. Document issues, risks, and resolutions related to data quality and migration.
Required Skills & Qualifications
Bachelor’s degree in Computer Science, Information Systems, Finance, or a related field. 5+ years of experience in data analysis or data migration projects in banking or insurance. Strong SQL skills and experience with data profiling and cleansing. Familiarity with ETL tools (e.g., Informatica, Talend, SSIS) and data visualization tools (e.g., Power BI, Tableau). Experience working with big data platforms (e.g., Hadoop, Spark, Hive) and handling large volumes of structured and unstructured data. Understanding of banking and insurance data domains (e.g., customer data, transactions, policies, claims, underwriting). Knowledge of regulatory and compliance requirements (e.g., AML, KYC, GDPR, IRDAI guidelines). Excellent analytical, documentation, and communication skills.
Preferred Qualifications
Experience with core banking systems (e.g., Finacle, Flexcube) or insurance platforms. Exposure to cloud data platforms (e.g., AWS, Azure, GCP). Experience working in Agile/Scrum environments. Certification in Business Analysis (e.g., CBAP, CCBA) or Data Analytics.

Posted 1 week ago

Apply

0 years

0 Lacs

Thiruvananthapuram, Kerala, India

Remote

Linkedin logo

Brief Description
The Cloud Data Engineer will play a critical implementation role on the Data Engineering and Data Products team and be responsible for data pipeline solutions design and development, troubleshooting, and optimization tuning on the next-generation data and analytics platform being developed with leading-edge big data technologies in a highly secure cloud infrastructure. The Cloud Data Engineer will serve as a liaison to platform user groups, ensuring successful implementation of capabilities on the new platform.
Data Engineer Responsibilities:
Deliver end-to-end data and analytics capabilities, including data ingest, data transformation, data science, and data visualization, in collaboration with Data and Analytics stakeholder groups. Design and deploy databases and data pipelines to support analytics projects. Develop scalable and fault-tolerant workflows. Clearly document issues, solutions, findings and recommendations to be shared internally & externally. Learn and apply tools and technologies proficiently, including: Languages: Python, PySpark, ANSI SQL, Python ML libraries. Frameworks/Platforms: Spark, Snowflake, Airflow, Hadoop, Kafka. Cloud Computing: AWS. Tools/Products: PyCharm, Jupyter, Tableau, PowerBI. Performance optimization for queries and dashboards. Develop and deliver clear, compelling briefings to internal and external stakeholders on findings, recommendations, and solutions. Analyze client data & systems to determine whether requirements can be met. Test and validate data pipelines, transformations, datasets, reports, and dashboards built by the team. Develop and communicate solution architectures and present solutions to both business and technical stakeholders. Provide end-user support to other data engineers and analysts.
Candidate Requirements:
Expert experience in the following [should have / good to have]: SQL, Python, PySpark, Python ML libraries; other programming languages (R, Scala, SAS, Java, etc.) are a plus. Data and analytics technologies including SQL/NoSQL/graph databases, ETL, and BI. Knowledge of CI/CD and related tools such as GitLab, AWS CodeCommit, etc. AWS services including EMR, Glue, Athena, Batch, Lambda, CloudWatch, DynamoDB, EC2, CloudFormation, IAM and EDS. Exposure to Snowflake and Airflow. Solid scripting skills (e.g., bash/shell scripts, Python). Proven work experience in the following: data streaming technologies; big data technologies including Hadoop, Spark, Hive, Teradata, etc.; Linux command-line operations; networking knowledge (OSI network layers, TCP/IP, virtualization). The candidate should be able to lead the team, communicate with the business, and gather and interpret business requirements. Experience with agile delivery methodologies using Jira or similar tools. Experience working with remote teams. AWS Solutions Architect / Developer / Data Analytics Specialty certifications; Professional certification is a plus. Bachelor's degree in Computer Science or a relevant field; Master's degree is a plus.
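
Since this role calls out Airflow alongside Spark and Snowflake, here is a minimal, hedged sketch of a daily Airflow DAG. Airflow 2.4+ is assumed; the DAG id and task bodies are placeholders, not part of the posting.

```python
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    # Placeholder: pull data from the assumed source system
    print("extracting...")

def transform():
    # Placeholder: run the PySpark/SQL transformation step
    print("transforming...")

# A daily pipeline with two ordered tasks: extract, then transform
with DAG(dag_id="daily_ingest",
         start_date=datetime(2025, 1, 1),
         schedule="@daily",
         catchup=False) as dag:
    t1 = PythonOperator(task_id="extract", python_callable=extract)
    t2 = PythonOperator(task_id="transform", python_callable=transform)
    t1 >> t2
```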

Posted 1 week ago

Apply

4.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Linkedin logo

Greetings from TCS!
We have an opportunity for Big Data.
Major Skill: PySpark and Hive
Experience: 4+ Years
Work Mode: Work from office
Location: Chennai/Mumbai/Pune/
JD: Ingest data from disparate sources (structured, unstructured and semi-structured) and develop ETL jobs using the above skills. Do impact analysis and come up with estimates. Take responsibility for the end-to-end deliverable. Create the project plan & work on the implementation strategy. Need to have a comprehensive understanding of ETL concepts and cross-environment data transfers. Need to handle customer communications and management reporting.

Posted 1 week ago

Apply

4.0 - 6.0 years

0 Lacs

Mumbai Metropolitan Region

On-site

Linkedin logo

Responsible for developing, optimizing, and maintaining business intelligence and data warehouse systems, ensuring secure, efficient data storage and retrieval, enabling self-service data exploration, and supporting stakeholders with insightful reporting and analysis.
Grade: T5
Please note that the job will close at 12 AM on the posting close date, so please submit your application prior to the close date.
Accountabilities
What your main responsibilities are:
Data Pipeline - Develop and maintain scalable data pipelines and build out new API integrations to support continuing increases in data volume and complexity.
Data Integration - Connect offline and online data to continuously improve overall understanding of customer behavior and journeys for personalization; data pre-processing including collecting, parsing, managing, analyzing and visualizing large sets of data.
Data Quality Management - Cleanse the data and improve data quality and readiness for analysis; drive standards, define and implement/improve data governance strategies and enforce best practices to scale data analysis across platforms.
Data Transformation - Process data by cleansing it and transforming it into the proper storage structure for the purpose of querying and analysis, using ETL and ELT processes.
Data Enablement - Ensure data is accessible and usable to the wider enterprise to enable a deeper and more timely understanding of operations.
Qualifications & Specifications
Master's/Bachelor's degree in Engineering/Computer Science/Math/Statistics or equivalent. Strong programming skills in Python/PySpark/SAS. Proven experience with large data sets and related technologies: Hadoop, Hive, distributed computing systems, Spark optimization. Experience with cloud platforms (preferably Azure) and their services: Azure Data Factory (ADF), ADLS Storage, Azure DevOps. Hands-on experience with Databricks, Delta Lake, Workflows. Should have knowledge of DevOps processes and tools like Docker, CI/CD, Kubernetes, Terraform, Octopus. Hands-on experience with SQL and data modeling to support the organization's data storage and analysis needs. Experience with any BI tool like Power BI (good to have). Cloud migration experience (good to have). Cloud and data engineering certification (good to have). Working in an Agile environment. 4-6 years of relevant work experience is required. Experience with stakeholder management is an added advantage.
What We Are Looking For
Education: Bachelor's degree or equivalent in Computer Science, MIS, Mathematics, Statistics, or similar discipline. Master's degree or PhD preferred.
Knowledge, Skills And Abilities
Fluency in English. Analytical skills. Accuracy & attention to detail. Numerical skills. Planning & organizing skills. Presentation skills. Data modeling and database design. ETL (Extract, Transform, Load) skills. Programming skills.
FedEx was built on a philosophy that puts people first, one we take seriously. We are an equal opportunity/affirmative action employer and we are committed to a diverse, equitable, and inclusive workforce in which we enforce fair treatment, and provide growth opportunities for everyone. All qualified applicants will receive consideration for employment regardless of age, race, color, national origin, genetics, religion, gender, marital status, pregnancy (including childbirth or a related medical condition), physical or mental disability, or any other characteristic protected by applicable laws, regulations, and ordinances.
Our Company
FedEx is one of the world's largest express transportation companies and has consistently been selected as one of the top 10 World’s Most Admired Companies by "Fortune" magazine. Every day FedEx delivers for its customers with transportation and business solutions, serving more than 220 countries and territories around the globe. We can serve this global network due to our outstanding team of FedEx team members, who are tasked with making every FedEx experience outstanding.
Our Philosophy
The People-Service-Profit philosophy (P-S-P) describes the principles that govern every FedEx decision, policy, or activity. FedEx takes care of our people; they, in turn, deliver the impeccable service demanded by our customers, who reward us with the profitability necessary to secure our future. The essential element in making the People-Service-Profit philosophy such a positive force for the company is where we close the circle, and return these profits back into the business, and invest back in our people. Our success in the industry is attributed to our people. Through our P-S-P philosophy, we have a work environment that encourages team members to be innovative in delivering the highest possible quality of service to our customers. We care for their well-being, and value their contributions to the company.
Our Culture
Our culture is important for many reasons, and we intentionally bring it to life through our behaviors, actions, and activities in every part of the world. The FedEx culture and values have been a cornerstone of our success and growth since we began in the early 1970s. While other companies can copy our systems, infrastructure, and processes, our culture makes us unique and is often a differentiating factor as we compete and grow in today’s global marketplace.

Posted 1 week ago

Apply

1.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Linkedin logo

Description
Amazon is a place where data drives most of our decision-making. The Analytics, Operations & Programs (AOP) team is looking for a dynamic data engineer who is innovative, a strong problem solver, and able to lead the implementation of the analytical data infrastructure that will guide decision-making. As a Data Engineer, you think like an entrepreneur, constantly innovating and driving positive change, but more importantly, you consistently deliver mind-boggling results. You're a leader who uses both quantitative and qualitative methods to get things done. And on top of it all, you're someone who wonders "What if?" and then seeks out the solution. This position offers exceptional opportunities to grow your technical and non-technical skills. You have the opportunity to really make a difference to our business by inventing, enhancing and building world-class systems, delivering results, and working on exciting and challenging projects.
As a Data Engineer, you are responsible for analyzing large amounts of business data, solving real-world problems, and developing metrics and business cases that will enable us to continually delight our customers worldwide. This is done by leveraging data from various platforms such as Jira, Portal, and Salesforce. You will work with a team of Product Managers, Software Engineers and Business Intelligence Engineers to automate and scale the analysis, and to make the data more actionable to manage business at scale. You will own many large datasets and implement new data pipelines that feed into or from critical data systems at Amazon. You must be able to prioritize and work well in an environment with competing demands.
Successful candidates will bring strong technical abilities combined with a passion for delivering results for customers, internal and external. This role requires a high degree of ownership and a drive to solve some of the most challenging data and analytic problems in retail. Candidates must have demonstrated ability to manage large-scale data modeling projects, identify requirements and tools, and build data warehousing solutions that are explainable and scalable. In addition to the technical skills, a successful candidate will possess strong written and verbal communication skills and a high intellectual curiosity, with the ability to learn new concepts/frameworks and technology rapidly as changes arise.
Key job responsibilities
Design, implement and support an analytical data infrastructure. Manage AWS resources including EC2, EMR, S3, Glue, Redshift, etc. Interface with other technology teams to extract, transform, and load data from a wide variety of data sources using SQL and AWS big data technologies. Explore and learn the latest AWS technologies to provide new capabilities and increase efficiency. Collaborate with Data Scientists and Business Intelligence Engineers (BIEs) to recognize and help adopt best practices in reporting and analysis. Help continually improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers. Maintain internal reporting platforms/tools, including troubleshooting and development. Interact with internal users to establish and clarify requirements in order to develop report specifications. Work with Engineering partners to help shape and implement the development of BI infrastructure, including data warehousing, reporting and analytics platforms. Contribute to the development of the BI tools, skills, culture and impact.
Write advanced SQL queries and Python code to develop solutions.
A day in the life
This role requires you to live at the intersection of data, software, and analytics. We leverage a comprehensive suite of AWS technologies, with key tools including S3, Redshift, DynamoDB, Lambda, APIs, and Glue. You will drive the development process from design to release: managing data ingestion from heterogeneous data sources with automated data quality checks; creating scalable data models for effective data processing, storage, retrieval, and archiving; using scripting for automation and tool development that is scalable, reusable, and maintainable; providing infrastructure for self-serve analytics and science use cases; and using industry best practices in building CI/CD pipelines.
About The Team
The AOP (Analytics Operations and Programs) team is missioned to standardize BI and analytics capabilities and reduce repeat analytics/reporting/BI workload for operations across the IN, AU, BR, MX, SG, AE, EG, and SA marketplaces. AOP is responsible for providing visibility on operations performance and implementing programs to improve network efficiency and defect reduction. The team has a diverse mix of strong engineers, analysts and scientists who champion customer obsession. We enable operations to make data-driven decisions through developing near real-time dashboards, self-serve dive-deep capabilities and building advanced analytics capabilities. We identify and implement data-driven metric improvement programs in collaboration (co-owning) with Operations teams.
Basic Qualifications
1+ years of data engineering experience. Experience with data modeling, warehousing and building ETL pipelines. Experience with one or more query languages (e.g., SQL, PL/SQL, DDL, MDX, HiveQL, SparkSQL, Scala). Experience with one or more scripting languages (e.g., Python, KornShell).
Preferred Qualifications
Experience with big data technologies such as Hadoop, Hive, Spark, EMR. Experience with any ETL tool like Informatica, ODI, SSIS, BODI, Datastage, etc.
Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.
Company - ASSPL - Karnataka
Job ID: A2904529

Posted 1 week ago

Apply

Exploring Hive Jobs in India

Hive is a popular data warehousing tool used for querying and managing large datasets in distributed storage. In India, the demand for professionals with expertise in Hive is on the rise, with many organizations looking to hire skilled individuals for various roles related to data processing and analysis.
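
For a small, hedged taste of what working with Hive looks like, here is HiveQL run through PySpark's Hive support; the table and column names are illustrative.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("hive_demo").enableHiveSupport().getOrCreate()

# Define a partitioned, Parquet-backed Hive table (illustrative schema)
spark.sql("""
    CREATE TABLE IF NOT EXISTS sales (item STRING, amount DOUBLE)
    PARTITIONED BY (sale_date STRING)
    STORED AS PARQUET
""")

# A typical HiveQL aggregation over the partition column
spark.sql("""
    SELECT sale_date, SUM(amount) AS total
    FROM sales
    GROUP BY sale_date
""").show()
```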

Top Hiring Locations in India

  1. Bangalore
  2. Hyderabad
  3. Pune
  4. Mumbai
  5. Delhi

These cities are known for their thriving tech industries and offer numerous opportunities for professionals looking to work with Hive.

Average Salary Range

The average salary range for Hive professionals in India varies based on experience level. Entry-level positions can expect to earn around INR 4-6 lakhs per annum, while experienced professionals can earn upwards of INR 12-15 lakhs per annum.

Career Path

Typically, a career in Hive progresses from roles such as Junior Developer or Data Analyst to Senior Developer, Tech Lead, and eventually Architect or Data Engineer. Continuous learning and hands-on experience with Hive are crucial for advancing in this field.

Related Skills

Apart from expertise in Hive, professionals in this field are often expected to have knowledge of SQL, Hadoop, data modeling, ETL processes, and data visualization tools like Tableau or Power BI.

Interview Questions

  • What is Hive and how does it differ from traditional databases? (basic)
  • Explain the difference between HiveQL and SQL. (medium)
  • How do you optimize Hive queries for better performance? (advanced)
  • What are the different types of tables supported in Hive? (basic)
  • Can you explain the concept of partitioning in Hive tables? (medium)
  • What is the significance of metastore in Hive? (basic)
  • How does Hive handle schema evolution? (advanced)
  • Explain the use of SerDe in Hive. (medium)
  • What are the various file formats supported by Hive? (basic)
  • How do you troubleshoot performance issues in Hive queries? (advanced)
  • Describe the process of joining tables in Hive. (medium)
  • What is dynamic partitioning in Hive and when is it used? (advanced)
  • How can you schedule jobs in Hive? (medium)
  • Discuss the differences between bucketing and partitioning in Hive. (advanced; see the sketch after this list)
  • How do you handle null values in Hive? (basic)
  • Explain the role of the Hive execution engine in query processing. (medium)
  • Can you give an example of a complex Hive query you have written? (advanced)
  • What is the purpose of the Hive metastore? (basic)
  • How does Hive support ACID transactions? (medium)
  • Discuss the advantages and disadvantages of using Hive for data processing. (advanced)
  • How do you secure data in Hive? (medium)
  • What are the limitations of Hive? (basic)
  • Explain the concept of bucketing in Hive and when it is used. (medium)
  • How do you handle schema evolution in Hive? (advanced)
  • Discuss the role of Hive in the Hadoop ecosystem. (basic)
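
Several of the questions above (partitioning, bucketing, and their differences) are easiest to answer with a concrete table definition in mind. Here is a hedged sketch, again via PySpark's Hive support, with assumed table and column names.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("hive_layout").enableHiveSupport().getOrCreate()

# Partitioning: one directory per sale_date value, pruned at query time
spark.sql("""
    CREATE TABLE IF NOT EXISTS tx_partitioned (id BIGINT, amount DOUBLE)
    PARTITIONED BY (sale_date STRING)
    STORED AS ORC
""")

# Bucketing: rows hashed on customer_id into a fixed number of files,
# which helps joins and sampling on that key
spark.sql("""
    CREATE TABLE IF NOT EXISTS tx_bucketed (id BIGINT, customer_id BIGINT, amount DOUBLE)
    CLUSTERED BY (customer_id) INTO 32 BUCKETS
    STORED AS ORC
""")
```

In short: partitioning splits data by a column's values into directories, while bucketing hashes a column into a fixed number of files within each partition or table.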

Closing Remark

As you explore job opportunities in the field of Hive in India, remember to showcase your expertise and passion for data processing and analysis. Prepare well for interviews by honing your skills and staying updated with the latest trends in the industry. Best of luck in your job search!

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies