
3311 Big Data Jobs - Page 35

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

5.0 - 8.0 years

7 - 10 Lacs

Bengaluru

Work from Office

Must-have skills:
- Cloud certification in one of these categories: Azure Data Engineer, Azure Data Factory, Azure Databricks
- Spark (PySpark or Scala) and SQL
- Data ingestion and curation; semantic modelling / optimization of the data model to work within Rahona
- Experience in Azure ingestion from on-prem sources, e.g. mainframe, SQL Server, Oracle
- Experience in Sqoop / Hadoop
- Microsoft Excel (for metadata files with ingestion requirements)
- Any other Azure/AWS/GCP certificate and hands-on data engineering experience in the cloud
- Strong programming skills in at least one of Python, Scala, or Java

Mandatory Skills: DataBricks - Data Engineering. Experience: 5-8 Years.

Posted 3 weeks ago

Apply

5.0 - 8.0 years

7 - 10 Lacs

Bengaluru

Work from Office

Role Purpose: Support process delivery by ensuring daily performance of the Production Specialists, resolving technical escalations, and developing technical capability within the Production Specialists.

Responsibilities:
- Oversee and support the process by reviewing daily transactions against performance parameters
- Review the performance dashboard and the team's scores
- Support the team in improving performance parameters by providing technical support and process guidance
- Record, track, and document all queries received, the problem-solving steps taken, and total successful and unsuccessful resolutions
- Ensure standard processes and procedures are followed to resolve all client queries
- Resolve client queries per the SLAs defined in the contract
- Develop understanding of the process/product for team members to facilitate better client interaction and troubleshooting
- Document and analyze call logs to spot recurring trends and prevent future problems
- Identify red flags and escalate serious client issues to the team leader in cases of untimely resolution
- Ensure all product information and disclosures are given to clients before and after call/email requests
- Avoid legal challenges by monitoring compliance with service agreements
- Handle technical escalations through effective diagnosis and troubleshooting of client queries
- Manage and resolve technical roadblocks/escalations per SLA and quality requirements; if unable to resolve an issue, escalate it to TA & SES in a timely manner
- Provide product support and resolution to clients by performing question diagnosis and guiding users through step-by-step solutions
- Troubleshoot all client queries in a user-friendly, courteous, and professional manner
- Offer alternative solutions to clients (where appropriate) with the objective of retaining customers and the client's business
- Organize ideas and communicate oral messages appropriate to listeners and situations
- Follow up and make scheduled callbacks to customers to record feedback and ensure compliance with contract SLAs
- Build people capability to ensure operational excellence and maintain superior customer service levels for the existing account/client
- Mentor and guide Production Specialists on improving technical knowledge
- Collate trainings to be conducted as triage to bridge skill gaps identified through interviews with the Production Specialists
- Develop and conduct trainings (triages) within products for Production Specialists per target, and inform the client about the triages being conducted
- Undertake product trainings to stay current with product features, changes, and updates; enroll in product-specific and other trainings per client requirements/recommendations
- Identify and document the most common problems and recommend appropriate resolutions to the team
- Update job knowledge by participating in self-learning opportunities and maintaining personal networks

Mandatory Skills: Google BigQuery. Experience: 5-8 Years.

Posted 3 weeks ago

Apply

5.0 - 8.0 years

7 - 10 Lacs

Chennai

Work from Office

Role Purpose: Support process delivery by ensuring daily performance of the Production Specialists, resolving technical escalations, and developing technical capability within the Production Specialists.

Responsibilities:
- Oversee and support the process by reviewing daily transactions against performance parameters
- Review the performance dashboard and the team's scores
- Support the team in improving performance parameters by providing technical support and process guidance
- Record, track, and document all queries received, the problem-solving steps taken, and total successful and unsuccessful resolutions
- Ensure standard processes and procedures are followed to resolve all client queries
- Resolve client queries per the SLAs defined in the contract
- Develop understanding of the process/product for team members to facilitate better client interaction and troubleshooting
- Document and analyze call logs to spot recurring trends and prevent future problems
- Identify red flags and escalate serious client issues to the team leader in cases of untimely resolution
- Ensure all product information and disclosures are given to clients before and after call/email requests
- Avoid legal challenges by monitoring compliance with service agreements
- Handle technical escalations through effective diagnosis and troubleshooting of client queries
- Manage and resolve technical roadblocks/escalations per SLA and quality requirements; if unable to resolve an issue, escalate it to TA & SES in a timely manner
- Provide product support and resolution to clients by performing question diagnosis and guiding users through step-by-step solutions
- Troubleshoot all client queries in a user-friendly, courteous, and professional manner
- Offer alternative solutions to clients (where appropriate) with the objective of retaining customers and the client's business
- Organize ideas and communicate oral messages appropriate to listeners and situations
- Follow up and make scheduled callbacks to customers to record feedback and ensure compliance with contract SLAs
- Build people capability to ensure operational excellence and maintain superior customer service levels for the existing account/client
- Mentor and guide Production Specialists on improving technical knowledge
- Collate trainings to be conducted as triage to bridge skill gaps identified through interviews with the Production Specialists
- Develop and conduct trainings (triages) within products for Production Specialists per target, and inform the client about the triages being conducted
- Undertake product trainings to stay current with product features, changes, and updates; enroll in product-specific and other trainings per client requirements/recommendations
- Identify and document the most common problems and recommend appropriate resolutions to the team
- Update job knowledge by participating in self-learning opportunities and maintaining personal networks

Mandatory Skills: DataBricks - Data Engineering. Experience: 5-8 Years.

Posted 3 weeks ago

Apply

3.0 - 6.0 years

5 - 15 Lacs

Hyderabad

Work from Office

We're Nagarro. We are a Digital Product Engineering company that is scaling in a big way! We build products, services, and experiences that inspire, excite, and delight. We work at scale across all devices and digital mediums, and our people exist everywhere in the world (18,000+ experts across 38 countries, to be exact). Our work culture is dynamic and non-hierarchical. We're looking for great new colleagues. That's where you come in!

REQUIREMENTS:
- Total experience of 3+ years
- Hands-on experience in Big Data engineering
- Strong working experience in Python, PySpark, Databricks, cloud (AWS or Azure), and SQL Server
- Solid understanding of Spark performance tuning and optimization techniques
- Experience with the Databricks platform and Delta Lake
- Hands-on experience designing, developing, and maintaining robust ETL pipelines in Databricks
- Familiarity with CI/CD pipelines for data engineering workflows is a plus
- Excellent problem-solving skills and attention to detail
- Knowledge of data governance, security, and privacy best practices
- Background in big data tools and frameworks such as Hive, Kafka, or Airflow
- Strong communication and collaboration abilities

RESPONSIBILITIES:
- Writing and reviewing great quality code
- Understanding the client's business use cases and technical requirements, and converting them into a technical design that elegantly meets the requirements
- Mapping decisions with requirements and translating them to developers
- Identifying different solutions and narrowing down the best option that meets the client's requirements
- Defining guidelines and benchmarks for NFR considerations during project implementation
- Writing and reviewing design documents explaining the overall architecture, framework, and high-level design of the application for the developers
- Reviewing architecture and design for extensibility, scalability, security, design patterns, user experience, NFRs, etc., and ensuring that all relevant best practices are followed
- Developing and designing the overall solution for defined functional and non-functional requirements, and defining the technologies, patterns, and frameworks to materialize it
- Understanding technology integration scenarios and applying these learnings in projects
- Resolving issues raised during code review through exhaustive, systematic root-cause analysis, and being able to justify the decisions taken
- Carrying out POCs to make sure that the suggested design/technologies meet the requirements

Posted 3 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

Karnataka

On-site

As a Data Admin, you will leverage your expertise in statistical analysis, machine learning, data visualization, predictive modeling, Big Data, SQL, data mining, VBA, and Power BI to support category management teams in executing NDAs and agreements. Your responsibilities will include agreement renewal, supplier premium program follow-up, and supplier business meeting review setup. You will develop and maintain trackers, generate monthly reports, and create visual dashboards representing sub-category spend by region. Additionally, you will create workflows in Microsoft SharePoint and automate reports using Microsoft Excel macros, VBA, and Power BI.

The ideal candidate should hold a bachelor's degree in electrical engineering from a reputable institution. Essential requirements for this role include interpersonal skills; expertise in Microsoft Excel, macros, and Power BI; creativity in Microsoft PowerPoint presentations; and proficiency in root cause and corrective analysis. Knowledge of Python and SQL would be advantageous. Your attitude and ability to work in a hybrid environment across different time zones are key selection criteria for this position.

Posted 3 weeks ago

Apply

5.0 - 10.0 years

0 Lacs

Karnataka

On-site

You are a Senior Java Developer responsible for redesigning and evolving an enterprise platform that processes data at high scale. Your role includes writing backend business logic, building backend features, system design/redesign, cloud deployment, and developing CI/CD pipelines. You will collaborate with client stakeholders and development teams across multiple geographies. Your expertise in Java, Spring, Spring Boot, microservices, event-driven architecture, Kafka, AWS, RabbitMQ, and PostgreSQL is crucial for this role. Knowledge of Elasticsearch or Grafana is a plus. This is an exciting opportunity for seasoned software engineers who excel at working with high-scale, event-driven, multi-tenant, multi-cloud, distributed systems.

**Responsibilities:**
- Be technically hands-on in the analysis, design, and implementation of deliverables.
- Take full ownership of assigned features.
- Discuss requirements with Product Management and develop solution approaches with the team.
- Implement complex features with high quality, following the TDD process.
- Communicate risks and progress in a timely manner.
- Mentor other team members.
- Support delivered features by debugging and creating RCAs for production issues.

**Requirements:**
- Development experience building products for large enterprises.
- Expertise in Java programming, data structures, algorithms, Spring, Spring Boot, and microservices.
- Proficiency in databases such as Oracle, SQL Server, or PostgreSQL.
- Strong understanding of event-driven architecture and asynchronous messaging, particularly with Kafka.
- Exceptional problem-solving skills.
- Experience with at least one cloud platform, preferably AWS.
- Hands-on experience building highly performant, scalable applications.
- Ability to identify root causes of complex issues in scalable deployments.
- Working experience in SOA and TDD.
- Excellent communication skills.

**Must-Have Skillset:**
- Strong computer science fundamentals with 5-10 years of experience.
- Experience in cloud-native application development, preferably on AWS.
- Strong experience with Spring Boot, Kafka, RabbitMQ, and PostgreSQL.
- Experience in reactive programming.
- Exposure to event-driven architecture with Big Data.
- Intermediate to advanced knowledge of Core Java.
- Excellent communication skills for stakeholder management and cross-team collaboration.

Posted 3 weeks ago

Apply

8.0 - 12.0 years

0 Lacs

Karnataka

On-site

As a Site Reliability Engineering (SRE) Technical Leader on the Network Assurance Data Platform (NADP) team at ThousandEyes, you will be responsible for ensuring the reliability, scalability, and security of cloud and big data platforms. Your role will involve representing the NADP SRE team, working in a dynamic environment, and providing technical leadership in defining and executing the team's technical roadmap. Collaborating with cross-functional teams, including software development, product management, customers, and security teams, is essential. Your contributions will directly impact the success of machine learning (ML) and AI initiatives by ensuring a robust and efficient platform infrastructure aligned with operational excellence.

In this role, you will design, build, and optimize cloud and data infrastructure to ensure the high availability, reliability, and scalability of big data and ML/AI systems. Collaboration with cross-functional teams will be crucial in creating secure, scalable solutions that support ML/AI workloads and enhance operational efficiency through automation. Troubleshooting complex technical problems, conducting root cause analyses, and contributing to continuous improvement efforts are key responsibilities. You will lead the architectural vision, shape the team's technical strategy and roadmap, and act as a mentor and technical leader to foster a culture of engineering and operational excellence. Engaging with customers and stakeholders to understand use cases and feedback, translating them into actionable insights, and effectively influencing stakeholders at all levels are essential aspects of the role. Utilizing strong programming skills to integrate software and systems engineering, and building core data platform capabilities and automation to meet enterprise customer needs, is a crucial requirement. Developing strategic roadmaps, processes, plans, and infrastructure to efficiently deploy new software components at enterprise scale, while enforcing engineering best practices, is also part of the role.

Qualifications:
- 8-12 years of relevant experience and a bachelor's degree in computer science or its equivalent.
- Ability to design and implement scalable solutions with a focus on streamlining operations.
- Strong hands-on cloud experience, preferably AWS.
- Infrastructure-as-Code skills, ideally with Terraform and EKS or Kubernetes.
- Proficiency in observability tools such as Prometheus, Grafana, Thanos, CloudWatch, OpenTelemetry, and the ELK stack.
- Ability to write high-quality code in Python, Go, or equivalent programming languages.
- Good understanding of Unix/Linux systems, system libraries, file systems, and client-server protocols.
- Experience building cloud, big data, and/or ML/AI infrastructure, experience architecting software and infrastructure at scale, and certifications in cloud and security domains are beneficial.

Cisco emphasizes diversity and encourages candidates to apply even if they do not meet every single qualification. Diverse perspectives and skills are valued, and Cisco believes that diverse teams are better equipped to solve problems, innovate, and create a positive impact.

Posted 3 weeks ago

Apply

1.0 - 4.0 years

6 - 10 Lacs

Mexico, Gurugram, United States (USA)

Work from Office

Amex GBT is a place where colleagues find inspiration in travel as a force for good and, through their work, can make an impact on our industry. We're here to help our colleagues achieve success and offer an inclusive and collaborative culture where your voice is valued. This position is a highly visible and integral role within the Global Supplier Partnerships (GSP) team, which is responsible for driving revenue by establishing and growing key partnerships with major airlines, hotels, car rental companies, and global distribution systems (GDS). Within GSP, the Global Revenue Management (GRM) team is responsible for providing insights into our supplier performance and supporting negotiations globally to improve revenue for Amex GBT and our partners. Our team works very closely with collaborators across the organization, with regular exposure to senior leadership. Right now, we are looking for a forward-thinking optimization associate with outstanding analytics, strong commercial foresight, and proven thought leadership to join the GRM team.

What You'll Do:
- Increase revenues by supervising key deal performance, supporting deal negotiations, and providing key strategy and performance analytics & insights
- Support regional and global supplier proposals and identify new revenue opportunities
- Closely collaborate with supplier relationship owners to model preferred supplier deals and develop efficient deal structures for existing and expected performance; evaluate supplier proposals in deal negotiations
- Develop and roll out reporting for key strategic deals
- Drive integration of supplier deal structures with outstanding fare content to ensure revenue optimization
- Develop and roll out optimization plans for key markets in EMEA, NA, and JAPA
- Provide deal performance analytics for forecasting
- Evaluate supplier revenue risk and opportunities for new and existing client bids for Pricing, Sales, and Client Management
- Translate supplier performance models into the Amex GBT revenue forecast

What We're Looking For:
- Growth mindset
- Excellent analytical approach with broad commercial foresight and thought leadership to generate substantial insights on performance
- Self-starter who is able to work independently as well as in a distributed team
- Able to work with data at a detailed level while keeping an eye on the broader strategy
- Excellent communication skills; must be able to translate sophisticated data into key messages delivered to leadership teams
- Ability to thrive in a fast-paced, dynamic work environment
- Graduate background, ideally in a numerate subject
- Proven experience working in an analytical role
- Ability to translate large amounts of data into clear, practical insights
- Strong team-member engagement skills, including communication, time management, and prioritisation
- Ability to understand business processes and commercial implications to make strategic recommendations
- Big Data management, SQL, and Power BI proficiency would be a plus
- Meetings & Events and hotel experience is a plus

Location: Mexico. Click here to learn more about the benefits we offer in Mexico.

The #TeamGBT Experience:
- Work and life: find your happy medium at Amex GBT. Flexible benefits are tailored to each country and start the day you do. These include health and welfare insurance plans, retirement programs, parental leave, adoption assistance, and wellbeing resources to support you and your immediate family.
- Travel perks: get a choice of deals each week from major travel providers on everything from flights to hotels to cruises and car rentals.
- Develop the skills you want when the time is right for you, with access to over 20,000 courses on our learning platform, leadership courses, and new job openings available to internal candidates first.
- We strive to champion Inclusion in every aspect of our business at Amex GBT. You can connect with colleagues through our global INclusion Groups, centered around common identities or initiatives, to discuss challenges, obstacles, and achievements, and to drive company awareness and action.
- And much more!

All applicants will receive equal consideration for employment without regard to age, sex, gender (and characteristics related to sex and gender), pregnancy (and related medical conditions), race, color, citizenship, religion, disability, or any other class or characteristic protected by law. Click Here for Additional Disclosures in Accordance with the LA County Fair Chance Ordinance. Furthermore, we are committed to providing reasonable accommodation to qualified individuals with disabilities. Please let your recruiter know if you need an accommodation at any point during the hiring process. For details regarding how we protect your data, please consult the Amex GBT Recruitment Privacy Statement.

What if I don't meet every requirement? If you're passionate about our mission and believe you'd be a phenomenal addition to our team, don't worry about checking every box; please apply anyway. You may be exactly the person we're looking for!

Posted 3 weeks ago

Apply

3.0 - 8.0 years

14 - 18 Lacs

Kolkata, Mumbai, New Delhi

Work from Office

About YipitData: YipitData is the leading market research and analytics firm for the disruptive economy and recently raised up to $475M from The Carlyle Group at a valuation over $1B. We analyze billions of alternative data points every day to provide accurate, detailed insights on ridesharing, e-commerce marketplaces, payments, and more. Our on-demand insights team uses proprietary technology to identify, license, clean, and analyze the data many of the world's largest investment funds and corporations depend on. For three years and counting, we have been recognized as one of Inc's Best Workplaces. We are a fast-growing technology company backed by The Carlyle Group and Norwest Venture Partners. Our offices are located in NYC, Austin, Miami, Denver, Mountain View, Seattle, Hong Kong, Shanghai, Beijing, Guangzhou, and Singapore. We cultivate a people-centric culture focused on mastery, ownership, and transparency.

Why You Should Apply NOW:
- You'll be working with many strategic engineering leaders within the company.
- You'll report directly to the Director of Data Engineering.
- You will help build out our Data Engineering team presence in India.
- You will work with a global team.
- You'll be challenged with a lot of big data problems.

About The Role: We are seeking a highly skilled Senior Data Engineer to join our dynamic Data Engineering team. The ideal candidate possesses 6-8 years of data engineering experience, a solid understanding of Spark and SQL, and data pipeline experience. Hired individuals will play a crucial role in building out our data engineering team to support our strategic pipelines and optimize them for reliability, efficiency, and performance. Additionally, Data Engineering serves as the gold standard for all other YipitData analyst teams, building and maintaining the core pipelines and tooling that power our products. This high-impact, high-visibility team is instrumental to the success of our rapidly growing business. This is a unique opportunity to be the first hire on this team, with the potential to build and lead the team as its responsibilities expand.

This is a hybrid opportunity based in India. During training and onboarding, we will expect several hours of overlap with US working hours. Afterward, standard IST working hours are permitted, with the exception of 1-2 days per week when you will join meetings with the US team.

As Our Senior Data Engineer You Will:
- Report directly to the Senior Manager of Data Engineering, who will provide significant, hands-on training on cutting-edge data tools and techniques.
- Build and maintain end-to-end data pipelines.
- Help set best practices for our data modeling and pipeline builds.
- Create documentation, architecture diagrams, and other training materials.
- Become an expert at solving complex data pipeline issues using PySpark and SQL.
- Collaborate with stakeholders to incorporate business logic into our central pipelines.
- Deeply learn Databricks, Spark, and other ETL tooling developed internally.

You Are Likely To Succeed If:
- You hold a Bachelor's or Master's degree in Computer Science, STEM, or a related technical discipline.
- You have 6+ years of experience as a Data Engineer or in other technical functions.
- You are excited about solving data challenges and learning new skills.
- You have a great understanding of working with data and building data pipelines.
- You are comfortable working with large-scale datasets using PySpark, Delta, and Databricks.
- You understand business needs and the rationale behind data transformations to ensure alignment with organizational goals and data strategy.
- You are eager to constantly learn new technologies.
- You are a self-starter who enjoys working collaboratively with stakeholders.
- You have exceptional verbal and written communication skills.

Nice to have: Experience with Airflow, dbt, Snowflake, or equivalent.

What We Offer: Our compensation package includes comprehensive benefits, perks, and a competitive salary. We care about your personal life, and we mean it: we offer vacation time, parental leave, team events, learning reimbursement, and more! Your growth at YipitData is determined by the impact that you are making, not by tenure, unnecessary facetime, or office politics. Everyone at YipitData is empowered to learn, self-improve, and master their skills in an environment focused on ownership, respect, and trust.

We are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, marital status, disability, gender, gender identity or expression, or veteran status. We are proud to be an equal-opportunity employer. Job Applicant Privacy Notice

Posted 3 weeks ago

Apply

10.0 - 15.0 years

0 Lacs

Ahmedabad, Gujarat

On-site

As the Software Engineering Manager for Genea's Smart Building Platform, you will be a key player in shaping the strategic direction of our technology solutions. Collaborating closely with various product stakeholders, you will be instrumental in the identification, design, development, and delivery of technical solutions that exceed our customers' expectations. Your role will involve providing both technical expertise and leadership to the engineering team, with a focus on continuous improvement across our products, technologies, and processes. This position involves a balance of research and development activities, along with team-building initiatives. If you are passionate about tackling the challenges associated with building high-performing distributed systems, including leveraging technologies like IoT, Big Data, and AI, then this opportunity could be the perfect fit for you. You will join a dedicated and skilled team at Genea, committed to delivering top-notch software solutions and ensuring utmost customer satisfaction.

In this role, you will:

**Leadership & Strategy:**
- Oversee software development projects to ensure alignment with product roadmaps and company objectives.
- Drive engineering strategy, architecture, and execution to scale backend systems efficiently.
- Mentor and guide a large engineering team to foster innovation and a proactive approach to task completion.
- Lead technology decisions in areas such as IoT devices, microservices, event-driven architectures, and big data solutions.
- Facilitate architecture discussions and design reviews to maintain best practices and scalability.

**Technical Excellence & Execution:**
- Uphold high standards of code quality, scalability, maintainability, and readability.
- Design, develop, test, and maintain robust, high-performance software using technologies like C#, .NET Core, JavaScript, C/C++, Docker, and Git.
- Deliver RESTful APIs and microservices with optimal design and performance.
- Implement comprehensive test plans and automated testing procedures to ensure product reliability.
- Integrate IoT devices with Building Management Systems (BMS) / HVAC controls through BACnet/Modbus protocols.

**Agile & Team Management:**
- Lead SCRUM teams to drive sprint planning, effort estimation, and successful execution.
- Monitor team progress and sprint execution, ensuring timely delivery of features and resolution of technical debt.
- Implement data-driven metrics and assessments to promote engineering and operational excellence.
- Encourage a culture of continuous learning and technical excellence through coaching and mentorship.

**Innovation & Future-Forward Thinking:**
- Spearhead R&D initiatives to incorporate AI/ML into products and the software development lifecycle (SDLC) for enhanced efficiency.
- Foster collaboration across functions with product managers, designers, and business stakeholders to translate concepts into scalable solutions.

**What We Are Looking For:**
- Education & Experience: Bachelor's degree in computer science (CS), electrical/electronics engineering (EE), or a related field, along with 10-15+ years of hands-on software development experience.
- Technical Expertise: Strong understanding of object-oriented design, algorithms, and data structures; experience with IoT, Linux systems, containers, cloud platforms, and Agile methodologies.
- Problem-Solving & Leadership: Ability to tackle complex issues, a self-motivated mindset, an ownership-driven approach, and effective client collaboration skills.
- Communication & Collaboration: Excellent communication skills, experience in remote team environments, and a deep comprehension of software development life cycles.

Joining Genea comes with a range of benefits, including a flexible working environment, generous time-off policies, comprehensive leave options, health insurance coverage, and recognition as a Top Workplace. Embrace a balanced workweek and become part of a dynamic team dedicated to reshaping the future of commercial real estate operations through innovative technology solutions.

Posted 3 weeks ago

Apply

6.0 - 8.0 years

25 - 30 Lacs

Gurugram

Work from Office

- Ability to use data to drive products and decisions.
- Ability to explain complex technical concepts to various stakeholders: the leadership team, product managers, support, and other engineers.
- Involvement in the design of data solutions using Hadoop-based technologies and AWS.
- Design and implementation of various Big Data platform components: batch processing, live stream processing, in-memory cache, query layer (SQL), rule engine, and action framework.
- Design and implementation of a data access layer that can connect to various data sources and uses advanced caching techniques to provide fast responses to real-time SQL queries using Big Data technologies.
- Implement scalable solutions to meet ever-increasing data volumes using big data/cloud technologies: Spark architecture, Kafka, any cloud computing platform, etc.
- Collaborate with leadership to define and set standards for engineering rigor and help cultivate the team culture.

Posted 3 weeks ago

Apply

4.0 - 9.0 years

25 - 30 Lacs

Pune

Work from Office

About KPI Partners: KPI Partners is a leading provider of business intelligence and analytics solutions. We are committed to delivering innovative solutions that empower organizations to make data-driven decisions. Our team is passionate about leveraging the latest technologies to transform raw data into actionable insights.

Position Summary: We are seeking a talented and experienced Lead Data Engineer with expertise in Unity Catalog to join our dynamic team. The ideal candidate will play a crucial role in architecting and implementing data engineering solutions, ensuring data quality, governance, and accessibility across our organization. You will work collaboratively with cross-functional teams to facilitate effective data management and integration.

Key Responsibilities:
- Design, develop, and implement data pipelines and ETL processes using Unity Catalog.
- Manage and optimize data workflows to improve performance and reliability.
- Ensure data quality and governance by implementing best practices and standards.
- Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and provide technical solutions.
- Mentor and lead junior data engineers, providing guidance and support in their professional development.
- Stay up-to-date with industry trends and emerging technologies related to data engineering and analytics.
- Participate in architecture discussions and contribute to the overall data strategy of the organization.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- Proven experience as a Data Engineer with a focus on data cataloging and management, specifically Unity Catalog.
- Strong programming skills in languages such as Python, SQL, or Scala.
- Experience with cloud platforms such as AWS, Azure, or Google Cloud.
- Knowledge of big data technologies and frameworks such as Apache Spark, Hadoop, or similar.
- Familiarity with data warehousing concepts and tools.
- Excellent problem-solving skills and the ability to work in a fast-paced environment.
- Strong communication skills and the ability to work collaboratively within a team.

What We Offer:
- Competitive salary and benefits package.
- Opportunities for professional growth and development.
- A collaborative and innovative work environment.
- The chance to work on cutting-edge technologies and impactful projects.

Posted 3 weeks ago

Apply

3.0 - 5.0 years

9 - 10 Lacs

Pune

Work from Office

Join us as a Data Engineer - PySpark Developer at Barclays, where you'll take part in the evolution of our digital landscape, driving innovation and excellence. You'll harness cutting-edge technology to revolutionize our digital offerings, ensuring unparalleled customer experiences. As part of the team, you will deliver the technology stack, using strong analytical and problem-solving skills to understand the business requirements and deliver quality solutions. You'll be working on complex technical problems requiring detailed analysis, in conjunction with fellow engineers, business analysts and business stakeholders. To be successful as a Data Engineer - PySpark Developer you should have experience with: Hands-on programming in a Big Data Hadoop ecosystem. Proficiency in PySpark, Hive, and Impala. Exposure to MongoDB or any other NoSQL database. Solid experience with Unix shell. Experience with scheduling tools like AutoSys and Airflow. Strong understanding of Agile methodologies and tools (JIRA, Confluence). Experience with CI/CD tools such as Jenkins, TeamCity, or GitLab. Excellent communication and collaboration skills. Ability to work independently and drive delivery with minimal supervision. Some other highly valued skills include: A Bachelor's degree in Computer Science, Engineering, or a related field. Relevant certifications in Big Data or cloud technologies are a plus. You may be assessed on key critical skills relevant for success in this role, such as risk and controls, change and transformation, business acumen, strategic thinking, and digital and technology, as well as job-specific technical skills. This role is based in Pune. Purpose of the role: To build and maintain the systems that collect, store, process, and analyse data, such as data pipelines, data warehouses and data lakes, to ensure that all data is accurate, accessible, and secure.
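As a rough illustration of the collect-process-store flow described in the purpose above, here is a toy pipeline modeled with plain Python generators; in a real deployment each stage would map to a PySpark job or a scheduled task, and the record fields here are invented:

```python
def ingest(records):
    # Collection stage: yield raw records one at a time.
    for rec in records:
        yield rec

def transform(stream):
    # Processing stage: drop invalid rows, enrich the rest.
    for rec in stream:
        if rec.get("valid", True):
            yield {**rec, "doubled": rec["value"] * 2}

def load(stream):
    # Storage stage: materialize into a list standing in for a warehouse table.
    return list(stream)

rows = [{"value": 2}, {"value": 5, "valid": False}, {"value": 7}]
print(load(transform(ingest(rows))))
# [{'value': 2, 'doubled': 4}, {'value': 7, 'doubled': 14}]
```

Chaining generators this way keeps each stage independently testable, which is the same property a well-factored Spark pipeline aims for.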
Accountabilities: Build and maintain data architecture pipelines that enable the transfer and processing of durable, complete and consistent data. Design and implement data warehouses and data lakes that manage the appropriate data volumes and velocity and adhere to the required security measures. Develop processing and analysis algorithms fit for the intended data complexity and volumes. Collaborate with data scientists to build and deploy machine learning models. Analyst Expectations: To perform prescribed activities in a timely manner and to a high standard, consistently driving continuous improvement. Requires in-depth technical knowledge and experience in their assigned area of expertise, with a thorough understanding of the underlying principles and concepts within that area. They lead and supervise a team, guiding and supporting professional development, allocating work requirements and coordinating team resources. If the position has leadership responsibilities, People Leaders are expected to demonstrate a clear set of leadership behaviours to create an environment for colleagues to thrive and deliver to a consistently excellent standard. The four LEAD behaviours are: Listen and be authentic, Energise and inspire, Align across the enterprise, and Develop others. For an individual contributor, they develop technical expertise in the work area, acting as an advisor where appropriate. Will have an impact on the work of related teams within the area. Partner with other functions and business areas. Takes responsibility for end results of a team's operational processing and activities. Escalate breaches of policies / procedures appropriately. Take responsibility for embedding new policies / procedures adopted due to risk mitigation. Advise and influence decision making within own area of expertise. Take ownership for managing risk and strengthening controls in relation to the work you own or contribute to.
Deliver your work and areas of responsibility in line with relevant rules, regulation and codes of conduct. Maintain and continually build an understanding of how your own sub-function integrates with the function, alongside knowledge of the organisation's products, services and processes within the function. Demonstrate understanding of how areas coordinate and contribute to the achievement of the objectives of the organisation sub-function. Make evaluative judgements based on the analysis of factual information, paying attention to detail. Resolve problems by identifying and selecting solutions through the application of acquired technical experience, guided by precedents. Guide and persuade team members and communicate complex / sensitive information. Act as contact point for stakeholders outside of the immediate function, while building a network of contacts outside the team and external to the organisation.

Posted 3 weeks ago

Apply

5.0 - 10.0 years

15 - 17 Lacs

Hyderabad

Work from Office

Very good experience in writing simple and complex queries using joins, string functions, date functions, ranking functions, analytical functions, and set operators. Analytics experience is mandatory. Strong knowledge of indexes, joins, views, triggers, functions, and stored procedures. Strong knowledge of creating and using temporary tables, table variables, CTEs (Common Table Expressions), sub-queries, derived tables, and joins to simplify complex queries involving multiple tables. Expertise in creating and maintaining database objects such as indexes, functions, views, and constraints. Very good experience in building relationships using constraints. Strong knowledge of SQL Profiler to trace queries. Experienced in analysing execution plans, managing indexes, and troubleshooting deadlocks. Experience with data visualization (PLX Dashboard, Google Spreadsheets, Tableau Desktop), creating dashboards and reports, and working with PLX / Google Data Studio.
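For illustration, the CTE-plus-ranking-function pattern called out above can be run against an in-memory SQLite database (window functions need SQLite 3.25+, bundled with modern Python); the sales table and its rows are invented for this sketch:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE sales (region TEXT, rep TEXT, amount INTEGER);
INSERT INTO sales VALUES
  ('North', 'Asha', 500), ('North', 'Ravi', 300),
  ('South', 'Meena', 700), ('South', 'Kiran', 200);
""")

# A CTE feeding a RANK() window function: top-selling rep per region.
top_reps = conn.execute("""
WITH ranked AS (
  SELECT region, rep,
         RANK() OVER (PARTITION BY region ORDER BY amount DESC) AS rnk
  FROM sales
)
SELECT region, rep FROM ranked WHERE rnk = 1 ORDER BY region;
""").fetchall()
print(top_reps)  # [('North', 'Asha'), ('South', 'Meena')]
```

The same CTE/window-function SQL carries over largely unchanged to MySQL 8+, which this role targets.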

Posted 3 weeks ago

Apply

6.0 - 11.0 years

20 - 25 Lacs

Hyderabad

Work from Office

We are looking for a Senior Data Engineer with expertise in ETL/ELT, Data Engineering, Data Warehousing, Data Lakes, Data Mesh, and Data Fabric architectures. The ideal candidate should have hands-on experience in at least one or two cloud data platforms (AWS, GCP, Azure, Snowflake, or Databricks) and a strong foundation in building PoCs, mentoring freshers, and contributing to accelerators and IPs. Key Responsibilities: Build cloud-based data pipelines on AWS, Azure, GCP, Snowflake, or Databricks; design data lakes, data warehouses, and scalable data platforms; build PoCs, MVPs, and IPs; mentor and guide junior engineers. Required Skills & Experience: Must-Have: 6-8 years of experience in Data Engineering and Cloud Data Services. Hands-on with AWS (Redshift, Glue), GCP (BigQuery, Dataflow), Azure (Synapse, Data Factory), Snowflake, and Databricks. Strong SQL, Python, or Scala skills. Knowledge of Data Mesh and Data Fabric principles. Nice-to-Have: Exposure to MLOps, AI integrations, and Terraform/Kubernetes for DataOps.

Posted 3 weeks ago

Apply

1.0 - 2.0 years

5 - 9 Lacs

Hyderabad

Work from Office

Education: Bachelor's degree in Commerce (BCom) focusing on Accounts or Statistics, or a related field. Experience: 1 to 2 years of experience in accounting or finance. Key Responsibilities: Assist with preparing and processing invoices. Help in the preparation of financial reports, budgets, and forecasts. Perform data entry and maintain accurate records of financial transactions. Process accounts payable and receivable, ensuring timely payments and collections. Support month-end and year-end closing activities, including preparation of journal entries. Assist with tax filing and reporting, including VAT, GST, or other applicable taxes. Prepare and maintain financial records and ledgers in accordance with accounting standards. Provide support in audits by supplying required documentation and explanations. Monitor financial transactions and identify any potential errors or inconsistencies. Assist in preparing financial statements, balance sheets, and profit and loss reports. Skills: Proficiency in MS Excel and accounting software. Basic understanding of accounting principles and practices. Strong analytical and problem-solving skills. Attention to detail and accuracy in data entry and report preparation. Excellent organizational skills and the ability to multitask. Good communication skills, both written and verbal. Familiarity with tax regulations. Knowledge of financial statements and reporting. Ability to work independently and as part of a team.

Posted 3 weeks ago

Apply

6.0 - 8.0 years

22 - 30 Lacs

Hyderabad

Work from Office

We are hiring a Senior AI/ML Gen AI Engineer with expertise in AWS Bedrock, AWS Nova, LLMs, RAG, and Agentic AI architectures. This role will focus on AI model development, MLOps, Gen AI solutions, and AI-first innovation initiatives. The ideal candidate will work on PoCs, accelerators, and AI-driven solutions while mentoring freshers and growing our AI practice. Key Responsibilities: Build AI/ML models, LLM-based applications, and RAG architectures; develop AI-powered accelerators, PoCs, and reusable ML pipelines; deliver multi-cloud AI/ML solutions on AWS Bedrock, AWS Nova, Azure ML, and Vertex AI; drive AI-driven GTM strategies and thought leadership; work on Agentic AI. Required Skills & Experience: Must-Have: 6-8 years of hands-on experience in AI/ML, LLMs, RAG, and Gen AI. Expertise in Python, TensorFlow, PyTorch, LangChain, and Hugging Face. Experience with AWS Bedrock, AWS Nova, Azure AI, Vertex AI, and OpenAI APIs. MLOps experience for automating AI model deployment. Hands-on with vector DBs (Pinecone, FAISS, ChromaDB). Nice-to-Have: Experience with AutoML, prompt engineering, and fine-tuning LLMs. Contributions to open-source AI projects and research.

Posted 3 weeks ago

Apply

9.0 - 13.0 years

0 Lacs

karnataka

On-site

At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. As a Lead Data Engineer at EY, you will play a crucial role in leading large scale solution architecture design and optimization to provide streamlined insights to partners throughout the business. You will lead a team of mid-level and senior data engineers, collaborating with the visualization team on data quality and troubleshooting needs. Your key responsibilities will include implementing data processes for the data warehouse and internal systems, leading a team of junior and senior data engineers in executing data processes, managing data architecture, designing ETL processes, cleaning, aggregating, and organizing data from various sources, and transferring it to data warehouses. You will be responsible for leading the development, testing, and maintenance of data pipelines and platforms to enable data quality utilization within business dashboards and tools. Additionally, you will support team members and direct reports in refining and validating data sets, create, maintain, and support the data platform and infrastructure, and collaborate with various teams to understand data requirements and design solutions that enable advanced analytics, machine learning, and predictive modeling. To qualify for this role, you must have a Bachelor's degree in Engineering, Computer Science, Data Science, or related field, along with 9+ years of experience in software development, data engineering, ETL, and analytics reporting development.
You should possess expertise in building and maintaining data and system integrations using dimensional data modeling and optimized ETL pipelines, as well as experience with modern data architecture and frameworks like data mesh, data fabric, and data product design. Other essential skillsets include proficiency in data engineering programming languages such as Python, distributed data technologies like PySpark, cloud platforms and tools like Kubernetes and AWS services, relational SQL databases, DevOps, continuous integration, and more. You should have a deep understanding of database architecture and administration, excellent written and verbal communication skills, strong organizational skills, problem-solving abilities, and the capacity to work in a fast-paced environment while adapting to changing business priorities. Desired skillsets for this role include a Master's degree in Engineering, Computer Science, Data Science, or related field, as well as experience in a global working environment. Travel requirements may include access to transportation to attend meetings and the ability to travel regionally and globally. Join EY in building a better working world, where diverse teams in over 150 countries provide trust through assurance and help clients grow, transform, and operate across various sectors.

Posted 3 weeks ago

Apply

2.0 - 6.0 years

0 Lacs

kolkata, west bengal

On-site

You are a Data Engineer with 2 to 4 years of experience in Python and PL/SQL. Your primary responsibility is to design, develop, and maintain data pipelines, ETL processes, and database solutions. You will be working on ETL Development & Data Processing, where you will develop, optimize, and maintain ETL pipelines for data ingestion, transformation, and integration. You will handle structured and semi-structured data from various sources and implement data cleansing, validation, and enrichment processes using Python and PL/SQL. In Database Development & Optimization, you will write, debug, and optimize complex SQL queries, stored procedures, functions, and triggers in PL/SQL. Additionally, you will design and maintain database schemas, indexing strategies, and partitioning for performance optimization, ensuring data consistency, quality, and governance across all data sources. Your role also involves Data Engineering & Automation, where you will automate data workflows using Python scripts and scheduling tools like Airflow, Cron, or DBMS_JOB. You will optimize query performance, troubleshoot database-related performance issues, and monitor data pipelines for failures while implementing alerting mechanisms. Collaboration & Documentation are crucial aspects of your job. You will closely collaborate with Data Analysts, Architects, and Business teams to understand data requirements. Documenting ETL processes, database schemas, and data flow diagrams will be part of your responsibilities. You will also participate in code reviews, testing, and performance tuning activities. Your Technical Skills should include strong experience in Python for data processing (Pandas, NumPy, PySpark), expertise in PL/SQL, hands-on experience with ETL tools, and knowledge of relational and non-relational databases. Exposure to Cloud & Big Data technologies like AWS/GCP/Azure, Spark, or Snowflake will be advantageous. 
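The cleansing and validation step described above can be sketched with plain Python; the records, field names, and helper below are invented for illustration rather than taken from the role:

```python
# Raw feed rows arrive as strings; cleansing coerces types and quarantines
# anything that fails validation instead of dropping it silently.
raw_rows = [
    {"id": "1", "amount": " 100 "},
    {"id": "2", "amount": "abc"},   # fails validation
    {"id": "3", "amount": "250"},
]

def cleanse(rows):
    """Split rows into validated records and rejects."""
    good, rejects = [], []
    for row in rows:
        try:
            good.append({"id": int(row["id"]),
                         "amount": int(row["amount"].strip())})
        except (KeyError, ValueError):
            rejects.append(row)
    return good, rejects

good, rejects = cleanse(raw_rows)
print(len(good), len(rejects))  # 2 1
```

Keeping a reject channel alongside the clean output is what lets a pipeline report data-quality metrics and replay fixed records later.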
Soft skills such as problem-solving, effective communication, teamwork, and the ability to manage tasks independently are essential for this role. This is a full-time, permanent position with a day-shift schedule and an in-person work location.

Posted 3 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

hyderabad, telangana

On-site

You should have 5-8 years of experience with the Hadoop ecosystem, especially in utilizing Hive for data querying and analysis. It is essential to have experience in data modeling and ETL processes. Proficiency in MySQL is required, including the ability to write complex queries and stored procedures, and to optimize queries. You should be capable of working with large datasets for data analysis purposes. In this role, you will be expected to work closely with and mentor the team, actively contribute to discussions, and present findings clearly. The key skills for this position include expertise in Big Data, Hive, Spark, Sqoop, and MySQL.

Posted 3 weeks ago

Apply

4.0 - 8.0 years

0 Lacs

haryana

On-site

As a Software Engineer, your primary responsibility will be to write and review high-quality code. It is essential to thoroughly understand functional requirements and analyze the client's needs within the product's context. You will be expected to envision comprehensive solutions for both functional and non-functional requirements, defining appropriate technologies, patterns, and frameworks. Your role will involve determining and implementing design methodologies and toolsets, as well as coordinating requirements, schedules, and activities to enable application development effectively. Additionally, you will play a crucial role in leading or supporting User Acceptance Testing (UAT) and production rollouts. It is imperative to create, understand, and validate estimated effort for assigned modules/tasks while addressing issues promptly and responding positively to setbacks and challenges with a continuous improvement mindset. Providing constructive feedback to team members and setting clear expectations will be part of your responsibilities, along with assisting the team in troubleshooting and resolving complex bugs. You will need to devise solutions during code/design reviews, justifying your decisions effectively. Conducting Proof of Concepts (POCs) to ensure that proposed designs/technologies align with requirements is also expected in this role. In terms of requirements, you should hold a degree in MCA/B.Tech./B.E. and have 4 to 6 years of experience, with a minimum of 3 years in Java, Spring, JPA, Hibernate, and at least 2 years in HTML, CSS, and JavaScript frameworks (primarily Angular 2+), and jQuery. Hands-on experience with developing, deploying, and debugging applications using JBoss/Tomcat, as well as proficiency in Data Structures, Algorithms, MariaDB, and Postgres, is necessary.
Moreover, you should be familiar with high-quality software engineering practices for agile development cycles, including coding standards, code reviews, source control management, build processes, testing, and deployment. A fundamental understanding of design patterns, Object-Oriented Analysis and Design (OOA & OOD) concepts, effective communication with technical teams and management, and mentoring skills are also crucial. Having knowledge of big data stack and NoSQL databases is advantageous in this role. Your ability to relate to technology integration scenarios and apply them in troubleshooting complex issues will be beneficial for the team's success and meeting client requirements.

Posted 3 weeks ago

Apply

10.0 - 15.0 years

0 Lacs

maharashtra

On-site

As the Cloud Network Subject Matter Expert (SME) at our organization, you will play a critical role in managing network architecture across cloud and on-premise infrastructure. With over 15 years of experience, including at least 10 years in network architecture leadership and 5 years as a cloud network architect, you will be responsible for executing enterprise-scale projects related to data strategy, cloud-based data lakes, API integration, SDWAN, and multi-cloud environments. Your primary focus will be on providing technical expertise in network and enterprise architecture, particularly from a cloud architect's perspective. This will involve leading enterprise-wide network initiatives, conducting architecture reviews, strategic planning, and addressing network issues proactively. You will also collaborate with various teams including Cloud operation teams, DevOps team, and other cross-functional project teams. In addition to your network architecture responsibilities, you will also act as a Cloud Consultant, offering guidance on complex solutions and technical architectural designs. Your role will involve implementing small to large-scale engagements, providing problem-solving approaches for dynamic challenges, and ensuring the alignment of solution strategies with business objectives. Key Result Areas (KRAs) for this role include project and delivery management, solution architecture, systems integration, risk management, team management, and compliance with industry standards. You will be expected to have hands-on experience in managing enterprise-scale on-premise and cloud networks, familiarity with routing protocols such as OSPF, BGP, RIP, EGP, IGP, and expertise in Azure services and other public cloud solutions. To excel in this role, you should hold a B.E. degree, with additional certifications like CCNP, CCNA, Azure/AWS network certifications, TOGAF, and Microsoft Azure certifications being desirable. 
Your ability to lead technical troubleshooting, design network architectures, and provide expert guidance to business teams will be essential in driving the organization's IT strategy and competitive advantage. If you are passionate about network architecture, cloud solutions, and driving innovation in a dynamic environment, we invite you to join our team as the Cloud Network SME in Mumbai. Your expertise and leadership will be instrumental in shaping the future of our network infrastructure and cloud services.

Posted 3 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

hyderabad, telangana

On-site

You will be responsible for overseeing the company's cloud computing strategy, which includes developing cloud adoption plans, designing cloud applications, and managing and monitoring cloud services. Leading a team of Cloud Engineers and Developers will be a key aspect of your role, where you will provide guidance, training, and motivation as needed. Collaborating with IT teams to ensure the implementation and maintenance of a robust and stable cloud infrastructure will be crucial. You will need to develop and organize cloud systems while ensuring compliance with best practices in privacy, security, and regulatory standards. Staying updated with industry trends and making recommendations to enhance the company's performance will also be part of your responsibilities. Managing and optimizing infrastructure assets to reduce costs through various cloud management tools, as well as implementing and overseeing disaster recovery solutions in the cloud, will be essential. Working closely with IT security to monitor cloud privacy and responding promptly to technical issues are critical tasks. You will be expected to provide guidance on infrastructure movement techniques, including bulk application transfers to the cloud, and identify the top cloud architecture solutions to meet the company's strategic needs effectively. Conducting technical training for team members and stakeholders, ensuring alignment with business goals, and managing project timelines will also be part of your duties. Maintaining an AZ-900 certification and encouraging team members to obtain relevant certifications will be necessary to stay updated with Microsoft Azure cloud services and best practices. Your expertise in Azure cloud services, full-stack development, cloud architecture, application development, big data management, collaboration, team building, problem-solving, analytical thinking, communication, and time management skills will be essential for success in this role. 
Having an AZ-900 Microsoft Azure Fundamentals certification will be a good-to-have skill, validating your foundational knowledge of cloud services and Microsoft Azure offerings. This certification demonstrates your understanding of cloud concepts, Azure services, security and privacy in Azure, as well as pricing and support aspects. Overall, your role will be pivotal in driving the company's cloud computing strategy, ensuring efficient cloud infrastructure management, and fostering a collaborative and innovative environment within the team.

Posted 3 weeks ago

Apply

7.0 - 12.0 years

0 Lacs

maharashtra

On-site

As a Lead Data Engineer, you will be responsible for leveraging your 7 to 12+ years of hands-on experience in SQL database design, data architecture, ETL, Data Warehousing, Data Mart, Data Lake, Big Data, Cloud (AWS), and Data Governance domains. Your expertise in a modern programming language such as Scala, Python, or Java, with a preference for Spark/PySpark, will be crucial in this role. Your role will require you to have experience with configuration management and version control apps like Git, along with familiarity working within a CI/CD framework. If you have experience in building frameworks, it will be considered a significant advantage. A minimum of 8 years of recent hands-on SQL programming experience in a Big Data environment is necessary, with a preference for experience in Hadoop/Hive. Proficiency in PostgreSQL, RDBMS, NoSQL, and columnar databases will be beneficial for this role. Your hands-on experience in AWS Cloud data engineering components, including API Gateway, Glue, IoT Core, EKS, ECS, S3, RDS, Redshift, and EMR, will play a vital role in developing and maintaining ETL applications and data pipelines using big data technologies. Experience with Apache Kafka, Spark, and Airflow is a must-have for this position. If you are excited about this opportunity and possess the required skills and experience, please share your CV with us at omkar@hrworksindia.com. We look forward to potentially welcoming you to our team. Regards, Omkar
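The orchestration piece of this stack (Airflow) ultimately comes down to running tasks in dependency order. As a minimal stdlib sketch, assuming an invented extract/transform/load DAG rather than anything from the posting, Python's graphlib (3.9+) computes that order:

```python
from graphlib import TopologicalSorter

# Invented DAG: each task maps to the set of tasks it depends on,
# mirroring how an orchestrator like Airflow schedules work.
dag = {
    "transform": {"extract"},
    "quality_check": {"extract"},
    "load": {"transform", "quality_check"},
}

# static_order() yields tasks so every dependency runs before its dependents.
order = list(TopologicalSorter(dag).static_order())
print(order)  # 'extract' first, 'load' last
```

A real scheduler adds retries, backfills, and parallel execution of independent tasks (here, transform and quality_check), but the topological sort is the core invariant.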

Posted 3 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

karnataka

On-site

You will be joining Lifesight as a Data Engineer in our Bengaluru office, playing a pivotal role in the Data and Business Intelligence organization. Your primary focus will be on leading deep data engineering projects and contributing to the growth of our data platform team. This is an exciting opportunity to shape our technical strategy and foster a strong data engineering team culture in India. As a Data Engineer at Lifesight, you will be responsible for designing and constructing data platforms and services, managing data infrastructure in cloud environments, and enabling strategic business decisions across Lifesight products. Your role will involve building highly scalable, fault-tolerant distributed data processing systems, optimizing data quality in pipelines, and owning data mapping, transformations, and business logic. You will also engage in low-level system debugging, performance optimization, and actively participate in architecture discussions to drive new projects forward. The ideal candidate for this position will possess proficiency in Python and PySpark, along with a deep understanding of Apache Spark, Spark tuning, and building data frames. Experience with big data technologies such as HDFS, YARN, Map-Reduce, Hive, Kafka, and Airflow, as well as NoSQL databases and cloud platforms like AWS and GCP, is essential. You should have at least 5 years of professional experience in data or software engineering, demonstrating expertise in data quality, data engineering, and various big data frameworks and tools. In summary, as a Data Engineer at Lifesight, you will have the opportunity to work on cutting-edge data projects, collaborate with a talented team of engineers, and contribute to the ongoing success and innovation of Lifesight's data platform.

Posted 3 weeks ago

Apply
cta

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies