
478 Data Lake Jobs - Page 17

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

15.0 - 24.0 years

45 - 70 Lacs

Bhubaneswar, Hyderabad

Work from Office

Job Description: Head of Data and Digital, Delivery (Bourntec Solutions)
Title: Head of Data and Digital, Delivery
Level: Director
Location: Hyderabad/Bhubaneshwar

Role Overview: The Head of Data and Digital, Delivery is a critical leadership role responsible for the successful delivery of Bourntec Solutions' data and digital projects and services to our clients. This Director-level position will lead and manage a team of delivery managers, data scientists, digital specialists, and engineers, ensuring high-quality, on-time, and within-budget project execution. The ideal candidate will possess a strong technical background in data and digital technologies, coupled with exceptional leadership, delivery management, and client management skills. This role requires a strategic thinker with a proven ability to build and scale delivery teams, optimize processes, and drive client satisfaction.

Key Responsibilities:

Delivery Leadership & Management:
- Provide strategic leadership and direction to the Data and Digital Delivery teams.
- Oversee the end-to-end delivery lifecycle of data analytics, business intelligence, AI/ML, cloud, and other digital transformation projects.
- Manage and mentor a team of Delivery Managers, Data Scientists, Digital Specialists, and Engineers, fostering a high-performance and collaborative environment.
- Ensure projects are delivered with high quality, on time, within budget, and according to client specifications.
- Proactively identify and mitigate project risks and issues, escalating appropriately and implementing effective solutions.
- Establish and maintain strong relationships with client stakeholders, acting as a trusted advisor and point of escalation for delivery-related matters.

Operational Excellence & Process Improvement:
- Define and implement best practices, methodologies, and standards for data and digital project delivery.
- Continuously evaluate and optimize delivery processes to improve efficiency, quality, and predictability.
- Establish and track key delivery metrics and KPIs, providing regular reports to senior management.
- Implement and enforce project governance frameworks and compliance standards.
- Drive the adoption of relevant tools and technologies to enhance delivery capabilities.

Team Building & Talent Development:
- Recruit, onboard, and develop top talent within the Data and Digital Delivery teams.
- Foster a culture of continuous learning and professional development within the team.
- Conduct performance reviews, provide feedback, and identify opportunities for growth.
- Build a scalable and agile delivery organization to support the company's growth objectives.

Client Relationship Management:
- Serve as a key point of contact for executive-level client stakeholders regarding project delivery.
- Understand client business needs and ensure delivery aligns with their strategic goals.
- Proactively manage client expectations and ensure high levels of client satisfaction.
- Identify opportunities for expanding services and solutions within existing client engagements.

Strategic Contribution:
- Contribute to the overall strategic planning and direction of Bourntec Solutions' data and digital offerings.
- Stay abreast of the latest trends and advancements in data and digital technologies.
- Collaborate with Sales and Pre-sales teams to develop compelling proposals and solutions for clients.
- Contribute to the development of new service offerings and intellectual property.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, Data Science, or a related field.
- 15+ years of progressive experience in data and digital technology delivery, with at least 5 years in a leadership role managing delivery teams.
- Proven track record of successfully delivering complex data analytics, business intelligence, AI/ML, cloud, or digital transformation projects for enterprise clients.
- Strong technical understanding of data warehousing, data lakes, ETL/ELT processes, data modeling, AI/ML algorithms, cloud platforms (AWS, Azure, GCP), and modern digital technologies.
- Excellent leadership, communication (written and verbal), interpersonal, and presentation skills.
- Strong client management and stakeholder management skills, with the ability to build and maintain trusted relationships.
- Proven ability to build, mentor, and motivate high-performing delivery teams.
- Experience in establishing and implementing delivery methodologies, standards, and processes.
- Strong analytical and problem-solving skills with a data-driven approach.
- Experience with project management tools and software.
- Ability to thrive in a fast-paced and dynamic environment.

Preferred Qualifications:
- Relevant certifications in project management (e.g., PMP, Agile certifications) or cloud platforms.
- Experience working in a services-based organization.
- Familiarity with industry-specific data and digital solutions.

Bourntec Solutions is an equal opportunity employer and values diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status.

Posted 2 months ago

Apply

8.0 - 12.0 years

10 - 14 Lacs

Mumbai, Delhi / NCR, Bengaluru

Work from Office

We are seeking an experienced Data Architect to join our team, working with our UAE-based client. The ideal candidate should have 8-12 years of hands-on experience in data architecture, with at least 4 years in architectural design and integration. Strong expertise in MS Dynamics and data lake architecture is required, along with proficiency in ETL, data modeling, data integration, and data quality assurance. The candidate should have a strong problem-solving mindset, the ability to handle architectural issues, and experience in troubleshooting. They should also be a proactive contributor and a team player with a flexible attitude. The role requires immediate availability and the ability to work as per UAE timings. Location: Chennai, Hyderabad, Kolkata, Pune, Ahmedabad, Remote

Posted 2 months ago

Apply

4.0 - 5.0 years

10 - 14 Lacs

Bengaluru

Work from Office

About the Role:
We are seeking a highly skilled and experienced Senior Data Scientist to join our data science team. As a Senior Data Scientist, you will play a critical role in driving data-driven decision making across the organization by developing and implementing advanced analytical solutions. You will leverage your expertise in data science, machine learning, and statistical analysis to uncover insights, build predictive models, and solve complex business challenges.

Key Responsibilities:
- Develop and implement statistical and machine learning models (e.g., regression, classification, clustering, time series analysis) to address business problems.
- Analyze large and complex datasets to identify trends, patterns, and anomalies.
- Develop predictive models for forecasting, churn prediction, customer segmentation, and other business outcomes.
- Conduct A/B testing and other experiments to optimize business decisions.
- Communicate data insights effectively through visualizations, dashboards, and presentations.
- Develop and maintain interactive data dashboards and reports.
- Present findings and recommendations to stakeholders in a clear and concise manner.
- Work with data engineers to design and implement data pipelines and data warehousing solutions.
- Ensure data quality and integrity throughout the data lifecycle.
- Develop and maintain data pipelines for data ingestion, transformation, and loading.
- Stay up-to-date with the latest advancements in data science, machine learning, and artificial intelligence.
- Research and evaluate new technologies and tools to improve data analysis and modeling capabilities.
- Explore and implement new data science techniques and methodologies.
- Collaborate effectively with data engineers, business analysts, product managers, and other stakeholders.
- Communicate technical information clearly and concisely to both technical and non-technical audiences.

Essential Qualifications:
- 4+ years of experience as a Data Scientist or in a related data science role.
- Strong proficiency in statistical analysis, machine learning algorithms, and data mining techniques.
- Experience with programming languages like Python (with libraries like scikit-learn, pandas, NumPy) or R.
- Experience with data visualization tools (e.g., Tableau, Power BI).
- Experience with data warehousing and data lake technologies.
- Excellent analytical, problem-solving, and communication skills.
- Master's degree in Statistics, Mathematics, Computer Science, or a related field.
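For context on the churn-prediction and scikit-learn skills this posting asks for, here is a minimal illustrative sketch of a churn classifier. The CSV path, feature columns, and label name are hypothetical placeholders, not part of the posting.

```python
# Minimal churn-prediction sketch with scikit-learn.
# "customers.csv" and all column names below are hypothetical.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

df = pd.read_csv("customers.csv")                                # hypothetical input
X = df[["tenure_months", "monthly_spend", "support_tickets"]]    # hypothetical features
y = df["churned"]                                                # hypothetical 0/1 label

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)
model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)

# Evaluate with AUC, a common metric for imbalanced churn labels
print("AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))
```

A real project would add feature engineering, cross-validation, and calibration, but this is the shape of the workflow the role describes.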

Posted 2 months ago

Apply

8.0 - 12.0 years

7 - 11 Lacs

Bengaluru

Work from Office

Job Position: Python Lead
Total Exp Required: 6+ years
Relevant Exp Required: around 5 years
Mandatory skills required: strong Python coding and development
Good to have skills: cloud, SQL, data analysis
Location: Pune - Kharadi - WFO - 3 days/week

About The Role:
We are seeking a highly skilled and experienced Python Lead to join our team. The ideal candidate will have strong expertise in Python coding and development, along with good-to-have skills in cloud technologies, SQL, and data analysis.

Key Responsibilities:
- Lead the development of high-quality, scalable, and robust Python applications.
- Collaborate with cross-functional teams to define, design, and ship new features.
- Ensure the performance, quality, and responsiveness of applications.
- Develop RESTful applications using frameworks like Flask, Django, or FastAPI.
- Utilize Databricks, PySpark SQL, and strong data analysis skills to drive data solutions.
- Implement and manage modern data solutions using Azure Data Factory, Data Lake, and Databricks.

Mandatory Skills:
- Proven experience with cloud platforms (e.g., AWS).
- Strong proficiency in Python, PySpark, R, and familiarity with additional programming languages such as C++, Rust, or Java.
- Expertise in designing ETL architectures for batch and streaming processes, database technologies (OLTP/OLAP), and SQL.
- Experience with Apache Spark and multi-cloud platforms (AWS, GCP, Azure).
- Knowledge of data governance and GxP data contexts; familiarity with the Pharma value chain is a plus.

Good to Have Skills:
- Experience with modern data solutions via Azure.
- Knowledge of principles summarized in the Microsoft Cloud Adoption Framework.
- Additional expertise in SQL and data analysis.

Educational Qualifications: Bachelor's/Master's degree or equivalent with a focus on software engineering.

If you are a passionate Python developer with a knack for cloud technologies and data analysis, we would love to hear from you. Join us in driving innovation and building cutting-edge solutions!
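The posting names Flask, Django, and FastAPI for RESTful development. As a minimal sketch of what that entails, here is a two-endpoint FastAPI app; the routes and response shapes are illustrative assumptions.

```python
# Minimal FastAPI sketch; run with: uvicorn app:app --reload
# The /health and /items routes are hypothetical examples.
from typing import Optional

from fastapi import FastAPI

app = FastAPI()

@app.get("/health")
def health() -> dict:
    # Simple liveness endpoint for orchestrators and load balancers
    return {"status": "ok"}

@app.get("/items/{item_id}")
def read_item(item_id: int, q: Optional[str] = None) -> dict:
    # FastAPI validates item_id as int and parses q from the query string
    return {"item_id": item_id, "q": q}
```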

Posted 2 months ago

Apply

4.0 - 7.0 years

3 - 7 Lacs

Bengaluru

Work from Office

Role: Python Developer
Location: Bangalore
Experience: 4-7 Yrs
Employment Type: Full Time, Working Mode: Regular
Notice Period: Immediate - 15 Days

About the Role:
We are seeking a skilled Python Developer to join our dynamic team and contribute to the development of innovative data-driven solutions. The ideal candidate will have a strong foundation in Python programming, data analysis, and data processing techniques. This role will involve working with various data sources, including Redis, MongoDB, SQL, and Linux, to extract, transform, and analyze data for valuable insights. You will also be responsible for developing and maintaining efficient and scalable data pipelines and visualizations using tools like matplotlib and seaborn. Additionally, experience with web development frameworks such as Flask, FastAPI, or Django, as well as microservices architecture, will be a significant advantage.

Key Responsibilities:
- Design, develop, and maintain efficient and scalable data pipelines to extract, transform, and load (ETL) data from various sources, including Redis, MongoDB, SQL, and Linux.
- Conduct in-depth data analysis and processing using Python libraries and tools to uncover valuable insights and trends.
- Develop and maintain data visualizations using matplotlib, seaborn, or other relevant tools to effectively communicate findings to stakeholders.
- Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions.
- Develop and maintain web applications using Python frameworks like Flask, FastAPI, or Django, adhering to best practices and coding standards.
- Design and implement microservices architecture to build scalable and modular systems.
- Troubleshoot and resolve technical issues related to data pipelines, applications, and infrastructure.
- Stay updated with the latest trends and technologies in the data engineering and Python development landscape.

Required Skills and Qualifications:
- Strong proficiency in Python programming, including object-oriented programming and functional programming concepts.
- Experience with data analysis and processing libraries such as pandas, NumPy, and scikit-learn.
- Familiarity with data storage and retrieval technologies, including Redis, MongoDB, SQL, and Linux.
- Knowledge of data visualization tools like matplotlib and seaborn.
- Experience with web development frameworks such as Flask, FastAPI, or Django.
- Understanding of microservices architecture and principles.
- Excellent problem-solving and analytical skills.
- Ability to work independently and as part of a team.
- Strong communication and interpersonal skills.

Preferred Skills and Qualifications:
- Experience with cloud platforms (AWS, GCP, Azure).
- Knowledge of containerization technologies (Docker, Kubernetes).
- Familiarity with data warehousing and data lake concepts.
- Experience with machine learning and deep learning frameworks (TensorFlow, PyTorch).
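The core loop this role describes is extract-transform-visualize with pandas, matplotlib, and seaborn. Here is a minimal sketch of that loop using SQLite as a stand-in SQL source; the database file, table, and column names are hypothetical.

```python
# Minimal ETL-and-visualize sketch with pandas and seaborn.
# "orders.db" and the orders table/columns are hypothetical.
import sqlite3

import matplotlib.pyplot as plt
import pandas as pd
import seaborn as sns

# Extract: pull raw rows from a SQL source
conn = sqlite3.connect("orders.db")
df = pd.read_sql_query("SELECT order_date, amount FROM orders", conn)

# Transform: aggregate to daily revenue
df["order_date"] = pd.to_datetime(df["order_date"])
daily = df.groupby(df["order_date"].dt.date)["amount"].sum().reset_index()

# Load/visualize: line plot of daily revenue, saved for reporting
sns.lineplot(data=daily, x="order_date", y="amount")
plt.xticks(rotation=45)
plt.tight_layout()
plt.savefig("daily_revenue.png")
```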

Posted 2 months ago

Apply

6.0 - 10.0 years

6 - 10 Lacs

Bengaluru

Work from Office

Job Role: Big Data Engineer
Work Location: Bangalore (CV Raman Nagar)
Experience: 7+ Years
Notice Period: Immediate - 30 days
Mandatory Skills: Big Data, Python, SQL, Spark/PySpark, AWS Cloud

JD and required Skills & Responsibilities:
- Actively participate in all phases of the software development lifecycle, including requirements gathering, functional and technical design, development, testing, roll-out, and support.
- Solve complex business problems by utilizing a disciplined development methodology.
- Produce scalable, flexible, efficient, and supportable solutions using appropriate technologies.
- Analyze the source and target system data. Map the transformations that meet the requirements.
- Interact with the client and onsite coordinators during different phases of a project.
- Design and implement product features in collaboration with business and technology stakeholders.
- Anticipate, identify, and solve issues concerning data management to improve data quality.
- Clean, prepare, and optimize data at scale for ingestion and consumption.
- Support the implementation of new data management projects and re-structure the current data architecture.
- Implement automated workflows and routines using workflow scheduling tools.
- Understand and use continuous integration, test-driven development, and production deployment frameworks.
- Participate in design, code, and test plan reviews and dataset implementation performed by other data engineers in support of maintaining data engineering standards.
- Analyze and profile data for the purpose of designing scalable solutions.
- Troubleshoot straightforward data issues and perform root cause analysis to proactively resolve product issues.

Required Skills:
- 5+ years of relevant experience developing data and analytics solutions.
- Experience building data lake solutions leveraging one or more of the following: AWS, EMR, S3, Hive, and PySpark.
- Experience with relational SQL.
- Experience with scripting languages such as Python.
- Experience with source control tools such as GitHub and the related dev process.
- Experience with workflow scheduling tools such as Airflow.
- In-depth knowledge of AWS Cloud (S3, EMR, Databricks).
- A passion for data solutions.
- A strong problem-solving and analytical mindset.
- Working experience in the design, development, and testing of data pipelines.
- Experience working with Agile teams.
- Able to influence and communicate effectively, both verbally and in writing, with team members and business stakeholders.
- Able to quickly pick up new programming languages, technologies, and frameworks.
- Bachelor's degree in computer science.
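Since the posting asks for workflow scheduling with Airflow, here is a minimal DAG sketch of a daily extract-then-transform job. It assumes Airflow 2.4+ (for the `schedule` parameter); the DAG id and task callables are illustrative placeholders.

```python
# Minimal Airflow DAG sketch: two dependent daily tasks.
# The dag_id and task bodies are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull raw files from S3")        # placeholder for real extract logic

def transform():
    print("run PySpark transformation")    # placeholder for real transform logic

with DAG(
    dag_id="daily_ingest",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)

    # transform runs only after extract succeeds
    t_extract >> t_transform
```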

Posted 2 months ago

Apply

5.0 - 6.0 years

7 - 12 Lacs

Hyderabad

Work from Office

About the Role:
We are seeking a highly skilled and experienced Senior Azure Databricks Engineer to join our dynamic data engineering team. As a Senior Azure Databricks Engineer, you will play a critical role in designing, developing, and implementing data solutions on the Azure Databricks platform. You will be responsible for building and maintaining high-performance data pipelines, transforming raw data into valuable insights, and ensuring data quality and reliability.

Key Responsibilities:
- Design, develop, and implement data pipelines and ETL/ELT processes using Azure Databricks.
- Develop and optimize Spark applications using Scala or Python for data ingestion, transformation, and analysis.
- Leverage Delta Lake for data versioning, ACID transactions, and data sharing.
- Utilize Delta Live Tables for building robust and reliable data pipelines.
- Design and implement data models for data warehousing and data lakes.
- Optimize data structures and schemas for performance and query efficiency.
- Ensure data quality and integrity throughout the data lifecycle.
- Integrate Azure Databricks with other Azure services (e.g., Azure Data Factory, Azure Synapse Analytics, Azure Blob Storage).
- Leverage cloud-based data services to enhance data processing and analysis capabilities.

Performance Optimization & Troubleshooting:
- Monitor and analyze data pipeline performance.
- Identify and troubleshoot performance bottlenecks.
- Optimize data processing jobs for speed and efficiency.
- Collaborate effectively with data engineers, data scientists, data analysts, and other stakeholders.
- Communicate technical information clearly and concisely.
- Participate in code reviews and contribute to the improvement of development processes.

Essential Qualifications:
- 5+ years of experience in data engineering, with at least 2 years of hands-on experience with Azure Databricks.
- Strong proficiency in Python and SQL.
- Expertise in Apache Spark and its core concepts (RDDs, DataFrames, Datasets).
- In-depth knowledge of Delta Lake and its features (e.g., ACID transactions, time travel).
- Experience with data warehousing concepts and ETL/ELT processes.
- Strong analytical and problem-solving skills.
- Excellent communication and interpersonal skills.
- Bachelor's degree in Computer Science, Computer Engineering, or a related field.
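The qualifications call out Delta Lake ACID transactions and time travel. As a minimal sketch, the PySpark snippet below writes a Delta table, appends to it transactionally, and reads back the first version. It assumes a Databricks cluster or a local Spark session with the delta-spark package configured; the path and columns are hypothetical.

```python
# Minimal Delta Lake sketch: ACID append plus time travel.
# Assumes the "delta" format is available (Databricks or delta-spark).
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("delta-demo").getOrCreate()

# Version 0: initial write
df = spark.createDataFrame([(1, "bronze"), (2, "silver")], ["id", "layer"])
df.write.format("delta").mode("overwrite").save("/tmp/delta/layers")

# Version 1: append runs as an ACID transaction
spark.createDataFrame([(3, "gold")], ["id", "layer"]) \
    .write.format("delta").mode("append").save("/tmp/delta/layers")

# Time travel: read the table as of its first version (before the append)
v0 = spark.read.format("delta").option("versionAsOf", 0).load("/tmp/delta/layers")
v0.show()
```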

Posted 2 months ago

Apply

7.0 - 8.0 years

17 - 22 Lacs

Mumbai

Work from Office

About the Role:
We are seeking a highly skilled and experienced Senior Data Architect to join our growing data engineering team. As a Senior Data Architect, you will play a critical role in designing, developing, and implementing robust and scalable data solutions to support our business needs. You will be responsible for defining data architectures, ensuring data quality and integrity, and driving data-driven decision making across the organization.

Key Responsibilities:
- Design and implement data architectures for various data initiatives, including data warehouses, data lakes, and data marts.
- Define data models, data schemas, and data flows for complex data integration projects.
- Develop and maintain data dictionaries and metadata repositories.
- Ensure data quality and consistency across all data sources.
- Design and implement data warehousing solutions, including ETL/ELT processes, data transformations, and data aggregations.
- Support the development and implementation of business intelligence and reporting solutions.
- Optimize data warehouse performance and scalability.
- Define and implement data governance policies and procedures.
- Ensure data security and compliance with relevant regulations (e.g., GDPR, CCPA).
- Develop and implement data access controls and data masking strategies.
- Design and implement data solutions on cloud platforms (AWS, Azure, GCP), leveraging cloud-native data services.
- Implement data pipelines and data lakes on cloud platforms.
- Collaborate effectively with data engineers, data scientists, business analysts, and other stakeholders.
- Communicate complex technical information clearly and concisely to both technical and non-technical audiences.
- Present data architecture designs and solutions to stakeholders.

Essential Qualifications:
- 7+ years of experience in data architecture, data modeling, and data warehousing.
- Strong understanding of data warehousing concepts, including dimensional modeling, ETL/ELT processes, and data quality.
- Experience with relational databases (e.g., SQL Server, Oracle, MySQL) and NoSQL databases (e.g., MongoDB, Cassandra).
- Experience with data integration tools and technologies.
- Excellent analytical and problem-solving skills.
- Strong communication and interpersonal skills.
- Bachelor's degree in Computer Science, Computer Engineering, or a related field.
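Dimensional modeling, which this posting requires, pairs a fact table of events with dimension tables of descriptive attributes (a star schema). The sketch below illustrates the idea with an in-memory SQLite database; all table and column names are hypothetical.

```python
# Minimal star-schema sketch (dimensional modeling) via sqlite3.
# dim_customer and fact_sales are illustrative, not a real schema.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Dimension table: one row per customer with descriptive attributes
cur.execute("""
CREATE TABLE dim_customer (
    customer_key INTEGER PRIMARY KEY,
    customer_name TEXT,
    region TEXT
)""")

# Fact table: one row per sale, joined to the dimension by surrogate key
cur.execute("""
CREATE TABLE fact_sales (
    sale_id INTEGER PRIMARY KEY,
    customer_key INTEGER REFERENCES dim_customer(customer_key),
    sale_date TEXT,
    amount REAL
)""")
conn.commit()
```

Queries then aggregate facts while slicing by dimension attributes (e.g., total amount by region), which is what makes the pattern well suited to BI and reporting workloads.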

Posted 2 months ago

Apply

5.0 - 9.0 years

7 - 11 Lacs

Hyderabad

Work from Office

We are Reckitt, home to the world's best loved and trusted hygiene, health, and nutrition brands. Our purpose defines why we exist: to protect, heal and nurture in the relentless pursuit of a cleaner, healthier world. We are a global team united by this purpose. Join us in our fight to make access to the highest quality hygiene, wellness, and nourishment a right and not a privilege.

Information Technology & Digital:
In IT and D, you'll be a force for good, whether you're championing cyber security, defining how we harness the power of technology to improve our business, or working with data to guide the innovation of consumer loved products. Working globally across functions, you'll own your projects and process from start to finish, with the influence and visibility to achieve what needs to be done. And if you're willing to bring your ideas to the table, you'll get the support and investment to make them happen. Your potential will never be wasted. You'll get the space and support to take your development to the next level. Every day, there will be opportunities to learn from peers and leaders through working on exciting, varied projects with real impact. And because our work spans so many different businesses, from Research and Product Development to Sales, you'll keep learning exciting new approaches.

About The Role:
The IT&D Solution Architect is accountable for proactively and holistically driving solution design activities within the product teams, ensuring alignment with the overall Enterprise Architecture and product stream strategy. The Solution Architect provides the necessary technical leadership, analysis and design tasks related to the development of a product or a set of products within a product group. They partner closely with DevOps/development teams to ensure that the value planned will be delivered in the most optimal way according to the product strategy and the overall value-for-money objectives linked to the IT strategy. The Solution Architect is also accountable for supervising the design as well as the integration execution within the scope of their product(s), collaborating with the integration platform team, and owns the solution architecture specification(s).

Your responsibilities:
- Use problem-solving skills and technical expertise to design, architect, develop and document high-end technology solutions that solve the most complex and challenging problems across different projects on Azure, on-premises or SaaS.
- Provide technical leadership throughout the design and deployment life cycle and focus on delivery with quality.
- Lead technical discussions and build consensus among all stakeholders, including vendors.
- Provide technical architecture consultancy and design for projects.
- Coordinate with the product owners to identify future needs and requirements.
- Understand security and compliance requirements; deliver solutions as per the defined business standards, principles and patterns.
- Design end-to-end application (single or multi-tier) architecture and data security for IaaS, PaaS and SaaS.
- Design and document applications hosted on-premises, on cloud providers or on other platforms as required.
- Build, migrate and test Azure / on-premises environments and integrations of services with IaaS, PaaS, and SaaS services with a DevOps mindset.
- Work with the leadership to develop a strategic plan to keep infrastructure services contemporary, cost competitive and meeting the needs of the business.

The experience we're looking for:
- 12+ years of experience in enterprise applications, integration and solution design.
- Ability to translate future-state business capabilities and requirements into solution architecture requirements.
- Proven track record of delivering new products and services in a fast-paced, dynamic environment, especially in CPG or healthcare organizations.
- Deep technical experience in infrastructure design including on-premises, private and public cloud, networking, virtualization, storage and integration services (MuleSoft, API Management).
- Proficiency in Azure SQL Database, Azure Synapse Analytics, Azure Cosmos DB, and other Azure data services.
- Experience with Extract, Transform, Load (ETL) processes using tools like Azure Data Factory.
- Experience in designing and deploying single or multi-tier applications in compliance-focused industries.
- Strong skills in designing data models, data warehousing, and data lakes.
- Understanding of data security, privacy, and compliance requirements, including GDPR and HIPAA.
- Experience in modernization of workloads in on-premises data centers and cloud platforms like Azure.
- Skills in optimizing database performance and managing data storage costs.
- Deep understanding of the latest LLM models, machine learning, deep learning, natural language processing (NLP), and computer vision, and the ability to advise on selection of the best models and tools for a given problem area.
- Previous experience/knowledge of: the Azure/Microsoft technology stack at enterprise level; Azure/cloud network services; security and identity services such as encryption, Active Directory, RBAC, NSGs/ASGs, and firewall policies; Azure networking services (e.g., VNETs, Load Balancers, Front Door, ExpressRoute, Traffic Manager, Content Delivery Network), firewalls, and web app proxies; Bash and PowerShell scripting; Azure Monitor and Application Insights; Azure-based cloud automation and Infrastructure as Code (e.g., Terraform, Cloud Shell, PowerShell, ARM, webhooks and runbooks); and disaster recovery, backups, and disaster recovery policies and standards applicable to cloud/on-prem enterprise systems.
- Ability to quickly comprehend the functions and capabilities of new technologies.
- Good understanding of strategic and new and emerging technology trends, and the practical application of existing, new and emerging technologies to new and evolving business and operating models.
- Ability to propose and estimate the financial impact of solution architecture alternatives.
- Ability to apply multiple technical solutions to business challenges.

Qualifications Required:
- Bachelor's degree (B.E., B.Tech).
- Azure Solutions Architect Expert (mandatory).
- Azure DevOps Engineer Expert (preferable).
- TOGAF Architecture certification (good to have).
- Any other Microsoft certifications related to Security, Data, etc.

The skills for success: Product Development, System Development, Project Management, Programme Management, Design Thinking, Process Automation, IT Service Management, Innovation Processes, Innovation, User Experience Design, Change Analysis, Change Management, Digital Transformation, Value Analysis, Adoption, Technology Adoption Lifecycle, Stakeholder Relationship Management, Vendor Management, Outstanding Communication, Stakeholder Engagement, Digital Strategy, Product Solution Architecture, Cyber Security Strategy, Cyber Security, Data Privacy, Portfolio Management, Data Governance, Product Compliance, Media Analytics, Advertising, Consumer Engagement, Market Value, Market Chain, Data-Driven Practices, Advanced Analytics, Data Analytics, Governance.

What we offer:
With inclusion at the heart of everything we do, working alongside our four global Employee Resource Groups, we support our people at every step of their career journey, helping them to succeed in their own individual way. We invest in the wellbeing of our people through parental benefits, an Employee Assistance Program to promote mental health, and life insurance for all employees globally. We have a range of other benefits in line with the local market. Through our global share plans we offer the opportunity to save and share in Reckitt's potential future successes. For eligible roles, we also offer short-term incentives to recognise, appreciate and reward your work for delivering outstanding results. You will be rewarded in line with Reckitt's pay-for-performance philosophy.

Equality:
We recognise that in real life, great people don't always 'tick all the boxes'. That's why we hire for potential as well as experience. Even if you don't meet every point on the job description, if this role and our company feel like a good fit for you, we still want to hear from you. All qualified applicants will receive consideration for employment without regard to age, disability or medical condition; colour, ethnicity, race, citizenship, and national origin; religion, faith; pregnancy, family status and caring responsibilities; sexual orientation; sex, gender identity, gender expression, and transgender identity; protected veteran status; size or any other basis protected by appropriate law.

Posted 2 months ago

Apply

8.0 - 13.0 years

25 - 30 Lacs

Nashik

Work from Office

Dreaming big is in our DNA. It's who we are as a company. It's our culture. It's our heritage. And more than ever, it's our future. A future where we're always looking forward. Always serving up new ways to meet life's moments. A future where we keep dreaming bigger. We look for people with passion, talent, and curiosity, and provide them with the teammates, resources and opportunities to unleash their full potential. The power we create together when we combine your strengths with ours is unstoppable. Are you ready to join a team that dreams as big as you do?

AB InBev GCC was incorporated in 2014 as a strategic partner for Anheuser-Busch InBev. The center leverages the power of data and analytics to drive growth for critical business functions such as operations, finance, people, and technology. The teams are transforming Operations through Tech and Analytics.

Do You Dream Big? We Need You.

Job Description
Job Title: Azure Data Engineer
Location: Bengaluru
Reporting to: Senior Manager Data Engineering

Purpose of the role:
We are seeking a skilled and motivated Azure Data Engineer to join our dynamic team. The ideal candidate will have hands-on experience with Microsoft Azure cloud services, data engineering, and a strong background in designing and implementing scalable data solutions.

Key tasks & accountabilities:
- Design, develop, and maintain scalable data pipelines and workflows using Azure Data Factory, Azure Databricks, and other relevant tools.
- Implement and optimize data storage solutions in Azure, including Azure PostgreSQL Database and Azure Blob Storage.
- Collaborate with data scientists, analysts, and business stakeholders to understand data requirements and implement solutions that align with business objectives.
- Ensure data quality, integrity, and security in all data-related processes and implementations.
- Work with both structured and unstructured data and implement data transformation and cleansing processes.
- Optimize and fine-tune performance of data solutions to meet both real-time and batch processing requirements.
- Troubleshoot and resolve issues related to data pipelines, ensuring minimal downtime and optimal performance.
- Stay current with industry trends and best practices, and proactively recommend improvements to existing data infrastructure.

Qualifications, Experience, Skills:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Proven experience as a Data Engineer with a focus on Microsoft Azure technologies.
- Hands-on experience with Azure services such as Azure Data Factory, Azure Databricks, Azure SQL Database, Azure Data Lake Storage, and Azure Synapse Analytics.
- Strong proficiency in SQL and experience with data modeling and ETL processes.
- Familiarity with data integration and orchestration tools.
- Knowledge of data warehousing concepts and best practices.
- Experience with version control systems, preferably Git.
- Excellent problem-solving and communication skills.

Level of Educational Attainment Required: Bachelor of Technology
Previous Work Experience: 7+ years of experience

Technical Expertise:
- Proven experience in Azure Databricks and ADLS architecture and implementation.
- Strong knowledge of medallion architecture and data lake design.
- Expertise in SQL, Python, and Spark for building and optimizing data pipelines.
- Familiarity with data integration tools and techniques, including Azure-native solutions.

And above all of this, an undying love for beer! We dream big to create a future with more cheers.
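This posting's technical expertise centers on medallion-style pipelines in Databricks. Below is a minimal, hypothetical bronze-to-silver cleansing step in PySpark; the mount paths, column names, and quality rules are illustrative assumptions, not part of the posting.

```python
# Minimal bronze-to-silver step in the spirit of a medallion layout.
# Paths under /mnt/datalake and all columns are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("medallion-demo").getOrCreate()

bronze = spark.read.json("/mnt/datalake/bronze/orders")   # raw landing zone

silver = (
    bronze
    .dropDuplicates(["order_id"])                  # de-duplicate raw events
    .filter(F.col("amount") > 0)                   # basic data-quality rule
    .withColumn("order_date", F.to_date("order_ts"))  # standardize types
)

# Silver layer: cleansed, conformed data ready for downstream modeling
silver.write.format("delta").mode("overwrite").save("/mnt/datalake/silver/orders")
```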

Posted 2 months ago

Apply

8.0 - 13.0 years

25 - 30 Lacs

Bengaluru

Work from Office

AB InBev GCC was incorporated in 2014 as a strategic partner for Anheuser-Busch InBev. The center leverages the power of data and analytics to drive growth for critical business functions such as operations, finance, people, and technology. The teams are transforming Operations through Tech and Analytics.

Do You Dream Big? We Need You.

Job Description
Job Title: Senior Data Engineer
Location: Bengaluru
Reporting to: Senior Manager Data Engineering

Purpose of the role:
We are seeking a skilled and motivated Azure Data Engineer to join our dynamic team. The ideal candidate will have hands-on experience with Microsoft Azure cloud services, data engineering, and a strong background in designing and implementing scalable data solutions.

Key tasks & accountabilities:
- Design, develop, and maintain scalable data pipelines and workflows using Azure Data Factory, Azure Databricks, and other relevant tools.
- Implement and optimize data storage solutions in Azure, including Azure PostgreSQL Database and Azure Blob Storage.
- Collaborate with data scientists, analysts, and business stakeholders to understand data requirements and implement solutions that align with business objectives.
- Ensure data quality, integrity, and security in all data-related processes and implementations.
- Work with both structured and unstructured data and implement data transformation and cleansing processes.
- Optimize and fine-tune performance of data solutions to meet both real-time and batch processing requirements.
- Troubleshoot and resolve issues related to data pipelines, ensuring minimal downtime and optimal performance.
- Stay current with industry trends and best practices, and proactively recommend improvements to existing data infrastructure.

Qualifications, Experience, Skills:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Proven experience of at least 7 years as a Data Engineer with a focus on Microsoft Azure technologies.
- Hands-on experience with Azure services such as Azure Data Factory, Azure Databricks, Azure SQL Database, Azure Data Lake Storage, and Azure Synapse Analytics.
- Strong proficiency in SQL and experience with data modeling and ETL processes.
- Familiarity with data integration and orchestration tools.
- Knowledge of data warehousing concepts and best practices.
- Experience with version control systems, preferably Git.
- Excellent problem-solving and communication skills.

Technical Expertise:
- Proven experience in Azure Databricks and ADLS architecture and implementation.
- Strong knowledge of medallion architecture and data lake design.
- Expertise in SQL, Python, and Spark for building and optimizing data pipelines.
- Familiarity with data integration tools and techniques, including Azure-native solutions.

And above all of this, an undying love for beer! We dream big to create a future with more cheers.

Posted 2 months ago

Apply

8.0 - 14.0 years

10 - 16 Lacs

Bengaluru

Work from Office

Dreaming big is in our DNA. It's who we are as a company. It's our culture. It's our heritage. And more than ever, it's our future. A future where we're always looking forward. Always serving up new ways to meet life's moments. A future where we keep dreaming bigger. We look for people with passion, talent, and curiosity, and provide them with the teammates, resources and opportunities to unleash their full potential. The power we create together when we combine your strengths with ours is unstoppable. Are you ready to join a team that dreams as big as you do?

AB InBev GCC was incorporated in 2014 as a strategic partner for Anheuser-Busch InBev. The center leverages the power of data and analytics to drive growth for critical business functions such as operations, finance, people, and technology. The teams are transforming Operations through Tech and Analytics.

Do You Dream Big? We Need You.

Job Description
Job Title: Data Architect
Location: Bengaluru
Reporting to: Senior Manager Data Engineering

Purpose of the role:
We are seeking a skilled and motivated Azure Data Architect to join our dynamic team. The ideal candidate will have hands-on experience with Microsoft Azure cloud services, data engineering, and a strong background in designing and implementing scalable data solutions.

Key tasks & accountabilities:

Architecting Optimal Solutions:
- Design and implement robust data architectures using Azure Databricks and ADLS aligned with the medallion architecture (bronze, silver, and gold layers), in a Data Mesh setup.
- Ensure scalability, performance, and cost efficiency in data extraction, ingestion, and transformation processes.

Governance and Quality Standards:
- Define and review governance frameworks and quality standards for data pipelines.
- Collaborate with data engineers to ensure compliance with organizational and regulatory standards.

Code Review and Optimization:
- Conduct thorough code reviews to ensure optimal performance and alignment with stakeholder requirements.
- Provide guidance on best practices for coding and data pipeline development.

Data Modelling and Complex Transformations:
- Define data modeling requirements to support complex transformations as required by stakeholders.
- Collaborate with business users to understand data transformation needs and ensure that the models meet those needs effectively.

Data Governance Review:
- Review and enhance data governance standards to ensure consistency, security, and compliance.
- Implement policies to monitor and maintain the quality and accessibility of data.

Integration Design:
- Design and define integration solutions between systems, ensuring seamless data flow and communication.
- Leverage Azure-native tools and services for system integrations where applicable.

Cost Estimation and Management:
- Provide cost estimates for proposed architectures and solutions.
- Optimize resource usage to minimize costs while maintaining performance and reliability.

Proposing Data Consumption Solutions:
- Design optimal solutions for data consumption, catering to analytical, reporting, and operational needs.
- Ensure user-friendly access to data through appropriate tools and interfaces.

Qualifications, Experience, Skills:
Level of Educational Attainment Required: Bachelor of Technology
Previous Work Experience: Min 9 years of experience

Preferred Experience:
- Familiarity with Unity Catalog for data governance and sharing.
- Hands-on experience with data mesh architectures and distributed data platforms.
- Knowledge of CDC tools such as Aecorsoft for real-time data extraction.

Technical Skills:
- Proven experience in Azure Databricks and ADLS architecture and implementation.
- Strong knowledge of medallion architecture and data lake design.
- Expertise in SQL, Python, and Spark for building and optimizing data pipelines.
- DevOps and CI/CD familiarity, along with data integration tools and techniques, including Azure-native solutions.
- Key requirement: SAP ECC and S/4HANA integration experience, and experience in data modelling of the SAP ECC Finance modules, e.g., Accounts Receivable, Accounts Payable, General Ledger, Inventory, Tax, etc.

Governance and Modeling:
- Experience in defining data governance frameworks and standards.
- Strong understanding of data modeling techniques and principles.

Analytical Skills:
- Ability to analyze business requirements and translate them into scalable technical solutions.
- Proficiency in estimating costs and managing budgets for large-scale data projects.

Soft Skills:
- Excellent communication skills to collaborate with cross-functional teams and stakeholders.
- Strong problem-solving skills and a proactive approach to technical challenges.

And above all of this, an undying love for beer! We dream big to create a future with more cheers.

Posted 2 months ago

Apply

8.0 - 14.0 years

10 - 16 Lacs

Nashik

Work from Office

Dreaming big is in our DNA. It's who we are as a company. It's our culture. It's our heritage. And more than ever, it's our future. A future where we're always looking forward. Always serving up new ways to meet life's moments. A future where we keep dreaming bigger. We look for people with passion, talent, and curiosity, and provide them with the teammates, resources and opportunities to unleash their full potential. The power we create together when we combine your strengths with ours is unstoppable. Are you ready to join a team that dreams as big as you do?

AB InBev GCC was incorporated in 2014 as a strategic partner for Anheuser-Busch InBev. The center leverages the power of data and analytics to drive growth for critical business functions such as operations, finance, people, and technology. The teams are transforming Operations through Tech and Analytics.

Do You Dream Big? We Need You.

Job Description
Job Title: Data Architect
Location: Bengaluru
Reporting to: Senior Manager Data Engineering

Purpose of the role:
We are seeking a skilled and motivated Azure Data Architect to join our dynamic team. The ideal candidate will have hands-on experience with Microsoft Azure cloud services, data engineering, and a strong background in designing and implementing scalable data solutions.

Key tasks & accountabilities:

Architecting Optimal Solutions:
- Design and implement robust data architectures using Azure Databricks and ADLS aligned with the medallion architecture (bronze, silver, and gold layers), in a Data Mesh setup.
- Ensure scalability, performance, and cost efficiency in data extraction, ingestion, and transformation processes.

Governance and Quality Standards:
- Define and review governance frameworks and quality standards for data pipelines.
- Collaborate with data engineers to ensure compliance with organizational and regulatory standards.

Code Review and Optimization:
- Conduct thorough code reviews to ensure optimal performance and alignment with stakeholder requirements.
- Provide guidance on best practices for coding and data pipeline development.

Data Modelling and Complex Transformations:
- Define data modeling requirements to support complex transformations as required by stakeholders.
- Collaborate with business users to understand data transformation needs and ensure that the models meet those needs effectively.

Data Governance Review:
- Review and enhance data governance standards to ensure consistency, security, and compliance.
- Implement policies to monitor and maintain the quality and accessibility of data.

Integration Design:
- Design and define integration solutions between systems, ensuring seamless data flow and communication.
- Leverage Azure-native tools and services for system integrations where applicable.

Cost Estimation and Management:
- Provide cost estimates for proposed architectures and solutions.
- Optimize resource usage to minimize costs while maintaining performance and reliability.

Proposing Data Consumption Solutions:
- Design optimal solutions for data consumption, catering to analytical, reporting, and operational needs.
- Ensure user-friendly access to data through appropriate tools and interfaces.

Qualifications, Experience, Skills:
Level of Educational Attainment Required: Bachelor of Technology
Previous Work Experience: Min 9 years of experience

Preferred Experience:
- Familiarity with Unity Catalog for data governance and sharing.
- Hands-on experience with data mesh architectures and distributed data platforms.
- Knowledge of CDC tools such as Aecorsoft for real-time data extraction.

Technical Skills:
- Proven experience in Azure Databricks and ADLS architecture and implementation.
- Strong knowledge of medallion architecture and data lake design.
- Expertise in SQL, Python, and Spark for building and optimizing data pipelines.
- DevOps and CI/CD familiarity, along with data integration tools and techniques, including Azure-native solutions.
- Key requirement: SAP ECC and S/4HANA integration experience, and experience in data modelling of the SAP ECC Finance modules, e.g., Accounts Receivable, Accounts Payable, General Ledger, Inventory, Tax, etc.

Governance and Modeling:
- Experience in defining data governance frameworks and standards.
- Strong understanding of data modeling techniques and principles.

Analytical Skills:
- Ability to analyze business requirements and translate them into scalable technical solutions.
- Proficiency in estimating costs and managing budgets for large-scale data projects.

Soft Skills:
- Excellent communication skills to collaborate with cross-functional teams and stakeholders.
- Strong problem-solving skills and a proactive approach to technical challenges.

And above all of this, an undying love for beer! We dream big to create a future with more cheers.

Posted 2 months ago

Apply

1.0 - 5.0 years

3 - 7 Lacs

Gurugram

Work from Office

Airbnb was born in 2007 when two Hosts welcomed three guests to their San Francisco home, and has since grown to over 5 million Hosts who have welcomed over 2 billion guest arrivals in almost every country across the globe. Every day, Hosts offer unique stays and experiences that make it possible for guests to connect with communities in a more authentic way.

This role should be based in Gurgaon, India. No relocation or visa support.

The Community You Will Join:
The Analytics Centre of Excellence (ACOE) at Airbnb, based in India, is a hub of knowledge and expertise that aims to provide data-driven decision-making, enabling Airbnb's business goals. The ACOE's vision is to build a world-class analytics organization that provides scalable analytics. We work with various business functions such as payments, trust, digital, customer support, hosting, sales, social, compliance, risk, platforms, and partnership & economics. The ACOE's delivery framework is designed to provide relevant and contextual insights for data-driven decisions. This includes a one-stop solution for metrics, dashboards driving actionable insights, optimization of performance, experimentation, measuring pre/post feature impact, sizing the ROI of opportunities, prioritization of opportunities, anomaly-driven alerting mechanisms, root cause analysis of metric deviation, and exploratory hypothesis testing.

The Difference You Will Make:
- You will be a part of the agent performance analytics team and will be responsible for evaluating the performance of customer service agents, identifying trends, and providing actionable insights to enhance agent productivity and service quality.
- Build business insights capabilities; be passionate about solving complex problems with data, adding a new perspective to existing solutions, and making business decisions based on careful and thoughtful analysis.
- Work independently with minimal supervision and act as a resource for colleagues with less experience.
- Work with technical and business teams to develop robust insights, create data narratives, and leverage Airbnb's rich data to define metrics.
- Bring a strong sense of urgency for and commitment to Airbnb's mission of belonging.
- Develop a deep understanding of the principles of excellent customer service and how agent behavior impacts the customer experience. Align analytical efforts with overall business goals and understand how agent performance contributes to these goals.

A Typical Day:
- Analyze and report on agent performance metrics, including response times, resolution rates, and customer satisfaction scores.
- Develop and maintain dashboards and reports to monitor agent performance and identify areas for improvement.
- Conduct regular performance reviews and provide feedback to agents and management.
- Collaborate with training and development teams to create targeted training programs based on performance analysis.
- Monitor agent interactions and provide qualitative feedback to enhance service delivery.
- Identify trends in customer inquiries and feedback to inform service improvements.
- Assist in the development of performance-related incentive programs.
- Stay up-to-date with industry trends and best practices in customer service analytics.
- Ensure data quality and data integrity for analysis and reporting, working closely with global operations and India analytics functions.
- Build deep business context and become a trusted advisor to the business for tactical and strategic initiatives.
- Present and drive ideations, solutions, and progress updates to business stakeholders.

Your Expertise:
- 5+ years of industry experience and a degree (Masters or PhD is a plus) in a quantitative field.
- Expert communication and collaboration skills, with the ability to work effectively with internal teams in a cross-cultural and cross-functional environment. Ability to conduct rigorous analysis and communicate conclusions to both technical and non-technical audiences. Proven track record of delivering valuable insights and influencing business impact through analytics.
- Beginner-level understanding of analytical frameworks such as Cohort Analysis, Product Funnel Analysis, Segmentation, Factor Analysis, Sensitivity Analysis, and statistical methodologies.
- Intermediate to advanced proficiency in SQL; Tableau/Superset/Power BI; and data warehouses/data lakes such as Presto, Hive, Teradata, Spark, etc.
- Advanced expertise in presentation tools like Keynote, Google Slides, and PowerPoint.
- Experience partnering with internal teams to drive action and providing expertise and direction on analytics, data science, experimental design, and measurement.
- Exceptional problem-solving abilities, self-motivation, and the ability to work autonomously, taking ownership of projects and driving them to completion.
- Strong organizational and time management skills, with the ability to manage multiple priorities and meet deadlines.
- Experience in the hospitality or travel industry.
- Knowledge of customer service software and CRM systems.

Hybrid Work Requirements & Expectations:
To support productivity and maintain a professional hybrid work environment (2 days work from office), employees are expected to adhere to the following:
- Workspace: A dedicated, quiet, and private workspace free from interruptions and external noise.
- Internet Connectivity: During working hours, maintain a minimum and consistent internet speed of 10 Mbps on your official devices to ensure reliability for work-related tasks, including calls and virtual meetings.
- Professionalism: Employees must remain fully engaged, respectful, and maintain a professional presence during virtual meetings, with video participation required unless otherwise approved.
- Confidentiality & Security: Employees are responsible for protecting Airbnb's Intellectual Property and Confidential Information. Work-related activities, including calls and meetings, must not be conducted in public places, while traveling, or in any setting that may compromise confidentiality or work quality.

Our Commitment To Inclusion & Belonging:
Airbnb is committed to working with the broadest talent pool possible. We believe diverse ideas foster innovation and engagement, allow us to attract creatively-led people, and help us develop the best products, services, and solutions. All qualified individuals are encouraged to apply.
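The core of this role is aggregating agent performance metrics such as response time, resolution rate, and CSAT. As a minimal sketch, the pandas snippet below computes per-agent summaries; the CSV file and column names are hypothetical placeholders, not Airbnb's actual data model.

```python
# Minimal agent-performance aggregation sketch with pandas.
# "tickets.csv" and every column name below are hypothetical.
import pandas as pd

tickets = pd.read_csv("tickets.csv")  # one row per support ticket

perf = (
    tickets
    .assign(resolved=tickets["status"].eq("resolved"))   # flag resolved tickets
    .groupby("agent_id")
    .agg(
        avg_response_min=("first_response_min", "mean"), # mean first response time
        resolution_rate=("resolved", "mean"),            # share of tickets resolved
        csat=("csat_score", "mean"),                     # mean satisfaction score
    )
    .sort_values("csat", ascending=False)
)
print(perf.head())
```

In practice the same aggregation would run in SQL against a warehouse such as Presto or Hive and feed a Tableau or Superset dashboard, but the metric definitions are the same.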

Posted 2 months ago

Apply

4.0 - 7.0 years

6 - 9 Lacs

Bengaluru

Work from Office

Reference: 250002R6

Responsibilities

Primary Skills:
- Deep practical knowledge of C# and .NET Framework 4.5 and above.
- Deep knowledge of Angular 8+ practices and commonly used modules, based on extensive work experience.
- Good practical knowledge of ASP.NET MVC.
- Good practical knowledge of Web API.
- Good knowledge of OAuth 2.0.
- Good practical knowledge of Entity Framework.
- Good hands-on experience with LINQ.
- Good practical knowledge of SQL Server and query tuning.

Secondary Skills:
- Understanding of Elasticsearch.
- Experience working with a data lake in the .NET framework.
- Basic knowledge of Azure DevOps.
- Knowledge of ASP.NET Core.
- Knowledge of Bootstrap.
- Understanding of Agile processes and Scrum.
- Knowledge of JavaScript and HTML5.

Required Profile: same as the primary and secondary skills listed above.

Why join us:
We are committed to creating a diverse environment and are proud to be an equal opportunity employer. All qualified applicants receive consideration for employment without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, disability, age, or veteran status.

Business insight:
At Societe Generale, we are convinced that people are drivers of change, and that the world of tomorrow will be shaped by all their initiatives, from the smallest to the most ambitious. Whether you're joining us for a period of months, years or your entire career, together we can have a positive impact on the future. Creating, daring, innovating and taking action are part of our DNA. If you too want to be directly involved, grow in a stimulating and caring environment, feel useful on a daily basis and develop or strengthen your expertise, you will feel right at home with us! Still hesitating? You should know that our employees can dedicate several days per year to solidarity actions during their working hours, including sponsoring people struggling with their orientation or professional integration, participating in the financial education of young apprentices, and sharing their skills with charities. There are many ways to get involved. We are committed to supporting the acceleration of our Group's ESG strategy by implementing ESG principles in all our activities and policies. They are translated into our business activity (ESG assessment, reporting, project management or IT activities), our work environment and our responsible practices for environment protection.

Diversity and Inclusion:
We are an equal opportunities employer and we are proud to make diversity a strength for our company. Societe Generale is committed to recognizing and promoting all talents, regardless of their beliefs, age, disability, parental status, ethnic origin, nationality, gender identity, sexual orientation, membership of a political, religious, trade union or minority organisation, or any other characteristic that could be subject to discrimination.

Posted 2 months ago

Apply

8.0 - 12.0 years

9 - 14 Lacs

Pune

Work from Office

Responsibilities:
- Design, develop, and maintain high-performance data pipelines using GCP services such as BigQuery, Dataflow, Pub/Sub, and Composer (Airflow); see the orchestration sketch after this listing.
- Design and develop Looker dashboards with appropriate security provisioning and drill-down capabilities.
- Ensure data security, lineage, quality, and compliance across GCP data ecosystems through IAM, audit logging, data encryption, and schema management.
- Monitor, troubleshoot, and optimize pipeline and warehouse performance using GCP-native tools such as Cloud Monitoring, Cloud Logging, and BigQuery Optimizer.
- Write SQL queries, dbt models, or Dataflow pipelines to transform raw data into analytics-ready datasets.
- Develop and optimize SQL queries and data transformation scripts for data warehousing and reporting purposes.
- Lead proof-of-concepts (POCs) and best-practice implementations for modern data architecture, including data lakes and cloud-native data warehouses.
- Ensure data quality, governance, and security best practices across all layers of the data stack.
- Write clean, maintainable, and efficient code following best practices.

Requirements

Data Engineering:
- 8-12 years of experience in data engineering, with at least 3-5 years of hands-on experience specifically in Google Cloud Platform (GCP) and BI tools like Looker.
- BigQuery (data modeling, optimization, security); advanced SQL proficiency with complex data transformations, windowing functions, and analytical querying.
- Ability to design and develop modular, maintainable SQL models using dbt best practices.
- Basic to intermediate knowledge of Python for scripting and automation.
- Exposure to ETL and batch scheduling/orchestration solutions.
- Strong understanding of data architecture patterns: data lakes, cloud-native data warehouses, event-driven architectures.
- Experience with version control systems like Git and branching strategies.

Looker:
- Hands-on experience in Looker with design, development, configuration/setup, dashboarding, and reporting techniques.
- Experience building and maintaining LookML models, Explores, PDTs, and semantic layers.
- Understanding of security provisioning and access controls, performance tuning of dashboards/reports on large datasets, and building drill-down capabilities.
- Proven ability to design scalable, user-friendly dashboards and self-service analytics environments.
- Expertise in optimizing Looker performance: materialized views, query tuning, aggregate tables.
- Strong command of Row-Level Security, Access Filters, and permission sets in Looker to support enterprise-grade data governance.

General:
- Experience with Agile delivery methodologies (e.g. Scrum, Kanban).
- Demonstrable track record of dealing well with ambiguity, prioritizing needs, and delivering results in a dynamic environment.
- Conduct regular workshops, demos, and stakeholder reviews to showcase data solutions and capture feedback.
- Excellent communication and collaboration skills.
- Collaborate with development teams to streamline the software delivery process and improve system reliability.
- Mentor and upskill junior engineers and analysts on GCP tools, Looker modeling best practices, and advanced visualization techniques.
- Ability to translate business objectives into data solutions with a focus on delivering measurable business value.
- Flexible to work in shifts and provide on-call support, owning the smooth operation of applications and systems in a production environment.
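As a flavor of the Composer/BigQuery work described above, here is a minimal sketch of an Airflow DAG that schedules a daily BigQuery transformation producing an analytics-ready table. The project, dataset, table, and column names are all hypothetical assumptions, not taken from the posting.

```python
# Hedged sketch: a Composer (Airflow) DAG that runs a daily BigQuery
# transformation. All project/dataset/table names are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

TRANSFORM_SQL = """
CREATE OR REPLACE TABLE `my-project.analytics.daily_orders` AS
SELECT
  order_date,
  customer_id,
  SUM(amount) AS total_amount,
  -- windowing function of the kind the requirements call out
  RANK() OVER (PARTITION BY order_date ORDER BY SUM(amount) DESC) AS spend_rank
FROM `my-project.raw.orders`
GROUP BY order_date, customer_id
"""

with DAG(
    dag_id="daily_orders_transform",
    start_date=datetime(2025, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    transform = BigQueryInsertJobOperator(
        task_id="build_daily_orders",
        configuration={"query": {"query": TRANSFORM_SQL, "useLegacySql": False}},
    )
```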

Posted 2 months ago

Apply

4.0 - 8.0 years

10 - 20 Lacs

Gurugram

Remote

US Shift, 5 working days. Remote work. (US airline group)
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- Strong focus on AWS and PySpark (see the sketch after this listing).
- Knowledge of AWS services, including but not limited to S3, Redshift, Athena, EMR, and Glue.
- Proficiency in PySpark and related Big Data technologies for ETL processing.
- Strong SQL skills for data manipulation and querying.
- Familiarity with data warehousing concepts and dimensional modeling.
- Experience with data governance, data quality, and data security practices.
- Excellent problem-solving skills and attention to detail.
- Strong communication and collaboration skills to work effectively with cross-functional teams.
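A minimal sketch of the kind of PySpark-on-AWS ETL this role describes, assuming it runs on a cluster (e.g. EMR) with S3 connectivity configured; the bucket names and columns are illustrative placeholders.

```python
# Hedged sketch: batch ETL reading raw CSVs from S3, cleaning them, and
# writing partitioned Parquet back to S3. Buckets/columns are assumptions.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("s3-etl-sketch").getOrCreate()

raw = spark.read.option("header", True).csv("s3://example-raw-bucket/flights/")

cleaned = (
    raw.filter(F.col("dep_delay").isNotNull())
       .withColumn("dep_delay", F.col("dep_delay").cast("int"))
       .withColumn("flight_date", F.to_date("flight_date"))
)

# partitioning by date keeps downstream Athena/Redshift Spectrum scans cheap
(cleaned.write.mode("overwrite")
        .partitionBy("flight_date")
        .parquet("s3://example-curated-bucket/flights/"))
```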

Posted 2 months ago

Apply

3.0 - 7.0 years

5 - 10 Lacs

Hyderabad, Bengaluru, Delhi / NCR

Hybrid

Role & Responsibilities

Job Description: We are seeking a skilled and experienced Microsoft Fabric Engineer to join our data engineering team. The ideal candidate will have a strong background in designing, developing, and maintaining data solutions using Microsoft Fabric, including experience across key workloads such as Data Engineering, Data Factory, Data Science, Real-Time Analytics, and Power BI. The role requires a deep understanding of Synapse Data Warehouse, OneLake, Notebooks, Lakehouse architecture, and Power BI integration within the Microsoft ecosystem.

Key Responsibilities:
- Design and implement scalable and secure data solutions using Microsoft Fabric.
- Build and maintain data pipelines using Dataflows Gen2 and Data Factory.
- Work with Lakehouse architecture and manage datasets in OneLake.
- Develop notebooks (PySpark or T-SQL) for data transformation and processing (see the sketch after this listing).
- Collaborate with data analysts to create interactive dashboards and reports using Power BI (within Fabric).
- Leverage Synapse Data Warehouse and KQL databases for structured and real-time analytics.
- Monitor and optimize the performance of data pipelines and queries.
- Adhere to data quality, security, and governance practices.
- Stay current with Microsoft Fabric updates and roadmap, recommending enhancements.

Required Skills:
- 3+ years of hands-on experience with Microsoft Fabric or similar tools in the Microsoft data stack.
- Strong proficiency with: Data Factory (Fabric), Synapse Data Warehouse / SQL analytics endpoints, Power BI integration and DAX, Notebooks (PySpark, T-SQL), Lakehouse and OneLake.
- Understanding of data modeling, ETL/ELT processes, and real-time data streaming.
- Experience with KQL (Kusto Query Language) is a plus.
- Familiarity with Microsoft Purview, Azure Data Lake, or Azure Synapse Analytics is advantageous.

Qualifications: Microsoft Fabric, OneLake, Data Factory, Data Lake, Data Mesh
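A minimal sketch of a Fabric notebook cell of the kind mentioned above: a PySpark transformation landing a cleaned dataset as a Lakehouse Delta table. It assumes the Fabric notebook runtime, where a `spark` session is predefined; the file path, table, and column names are hypothetical.

```python
# Hedged sketch of a Fabric notebook cell (PySpark). Assumes the attached
# Lakehouse exposes raw files under "Files/". Names are placeholders.
from pyspark.sql import functions as F

# `spark` is provided by the Fabric notebook runtime
orders = spark.read.format("csv").option("header", True).load("Files/raw/orders.csv")

curated = (
    orders.dropDuplicates(["order_id"])
          .withColumn("order_ts", F.to_timestamp("order_ts"))
)

# saveAsTable registers the result as a managed Delta table in the Lakehouse,
# queryable from the SQL analytics endpoint and Power BI
curated.write.mode("overwrite").format("delta").saveAsTable("curated_orders")
```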

Posted 2 months ago

Apply

8.0 - 11.0 years

35 - 37 Lacs

Kolkata, Ahmedabad, Bengaluru

Work from Office

Dear Candidate,

We are looking for a Cloud Data Engineer to build cloud-based data pipelines and analytics platforms.

Key Responsibilities:
- Develop ETL workflows using cloud data services (see the sketch after this listing).
- Manage data storage, lakes, and warehouses.
- Ensure data quality and pipeline reliability.

Required Skills & Qualifications:
- Experience with BigQuery, Redshift, or Azure Synapse.
- Proficiency in SQL, Python, or Spark.
- Familiarity with data lake architecture and batch/streaming processing.

Soft Skills:
- Strong troubleshooting and problem-solving skills.
- Ability to work independently and in a team.
- Excellent communication and documentation skills.

Note: If interested, please share your updated resume and preferred time for a discussion. If shortlisted, our HR team will contact you.

Kandi Srinivasa
Delivery Manager
Integra Technologies
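A small batch-load sketch, assuming BigQuery as the target warehouse (the role equally accepts Redshift or Synapse): load a CSV extract into a staging table with the official Python client. The file, project, dataset, and table names are placeholders.

```python
# Hedged sketch: load a local CSV extract into a BigQuery staging table.
# Project/dataset/table names are assumptions, not from the posting.
from google.cloud import bigquery

client = bigquery.Client()

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,
    autodetect=True,  # infer schema for the staging layer
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
)

with open("daily_extract.csv", "rb") as f:
    load_job = client.load_table_from_file(
        f, "my-project.staging.daily_extract", job_config=job_config
    )
load_job.result()  # block until the load completes
```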

Posted 2 months ago

Apply

3 - 6 years

4 - 8 Lacs

Bengaluru

Work from Office

Requisition ID: R140300

Company Overview
A.P. Moller - Maersk is an integrated container logistics company and member of the A.P. Moller Group, connecting and simplifying trade to help our customers grow and thrive. With a dedicated team of over 95,000 employees operating in 130 countries, we go all the way to enable global trade for a growing world. From the farm to your refrigerator, or the factory to your wardrobe, A.P. Moller - Maersk is developing solutions that meet customer needs from one end of the supply chain to the other.

About the Team
At Maersk, the Global Ocean Manifest team is at the heart of global trade compliance and automation. We build intelligent, high-scale systems that seamlessly integrate customs regulations across 100+ countries, ensuring smooth cross-border movement of cargo by ocean, rail, and other transport modes. Our mission is to digitally transform customs documentation, reducing friction, optimizing workflows, and automating compliance for a complex web of regulatory bodies, ports, and customs authorities. We deal with real-time data ingestion, document generation, regulatory rule engines, and multi-format data exchange while ensuring resilience and security at scale.

Key Responsibilities
- Work with large, complex datasets and ensure efficient data processing and transformation.
- Collaborate with cross-functional teams to gather and understand data requirements.
- Ensure data quality, integrity, and security across all processes.
- Implement data validation, lineage, and governance strategies to ensure data accuracy and reliability.
- Build, optimize, and maintain ETL pipelines for structured and unstructured data, ensuring high throughput, low latency, and cost efficiency.
- Build scalable, distributed data pipelines for processing real-time and historical data (a streaming sketch follows this listing).
- Contribute to the architecture and design of data systems and solutions.
- Write and optimize SQL queries for data extraction, transformation, and loading (ETL).
- Advise Product Owners on identifying and managing risks, debt, issues, and opportunities for technical improvement.
- Provide continuous improvement suggestions for internal code frameworks, best practices, and guidelines.
- Contribute to engineering innovations that fuel Maersk's vision and mission.

Required Skills & Qualifications
- 4+ years of experience in data engineering or a related field.
- Strong problem-solving and analytical skills.
- Experience with Java and the Spring framework.
- Experience building data processing pipelines using Apache Flink and Spark.
- Experience with distributed data lake environments (Dremio, Databricks, Google BigQuery, etc.).
- Experience with Apache Kafka and Kafka Streams.
- Experience working with databases, PostgreSQL preferred, with solid experience in writing and optimizing SQL queries.
- Hands-on experience in cloud environments such as Azure Cloud (preferred), AWS, Google Cloud, etc.
- Experience with data warehousing and ETL processes.
- Experience designing and integrating data APIs (REST/GraphQL) for real-time and batch processing.
- Knowledge of Great Expectations, Apache Atlas, or DataHub would be a plus.
- Knowledge of RBAC, encryption, and GDPR compliance would be a plus.

Business skills
- Excellent communication and collaboration skills.
- Ability to translate between technical language and business language, and to communicate with different target groups.
- Ability to understand complex designs.
- Ability to balance competing forces and opinions within the development team.

Personal profile
- Fact-based and result-oriented.
- Ability to work independently and guide the team.
- Excellent verbal and written communication.

Maersk is committed to a diverse and inclusive workplace, and we embrace different styles of thinking. Maersk is an equal opportunities employer and welcomes applicants without regard to race, colour, gender, sex, age, religion, creed, national origin, ancestry, citizenship, marital status, sexual orientation, physical or mental disability, medical condition, pregnancy or parental leave, veteran status, gender identity, genetic information, or any other characteristic protected by applicable law. We will consider qualified applicants with criminal histories in a manner consistent with all legal requirements. We are happy to support your need for any adjustments during the application and hiring process. If you need special assistance or an accommodation to use our website, apply for a position, or perform a job, please contact us by email.
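To illustrate the real-time ingestion side of this stack, here is a minimal Spark Structured Streaming sketch consuming events from Kafka and appending them to data-lake storage. The posting also names Flink; Spark is shown here only as one of the two. The broker, topic, and paths are invented placeholders, and the job assumes the Kafka connector package is on the Spark classpath.

```python
# Hedged sketch: consume manifest-style events from Kafka with Spark
# Structured Streaming and land them as Parquet. Names are assumptions.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("manifest-stream-sketch").getOrCreate()

events = (
    spark.readStream.format("kafka")
         .option("kafka.bootstrap.servers", "broker:9092")
         .option("subscribe", "ocean-manifest-events")
         .load()
)

# Kafka records arrive as binary key/value; cast before downstream parsing
parsed = events.select(
    F.col("key").cast("string"),
    F.col("value").cast("string").alias("payload"),
    F.col("timestamp"),
)

query = (
    parsed.writeStream.format("parquet")
          .option("path", "/lake/manifest/events")
          .option("checkpointLocation", "/lake/manifest/_checkpoints")
          .start()
)
```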

Posted 2 months ago

Apply

5 - 10 years

7 - 11 Lacs

Mumbai, Hyderabad

Work from Office

EDS Specialist - NAV02KL

Company: Worley
Primary Location: IND-MM-Navi Mumbai
Job: Engineering Design Systems (EDS)
Schedule: Full-time
Employment Type: Employee
Job Level: Experienced
Job Posting: Apr 7, 2025
Unposting Date: May 30, 2025
Reporting Manager Title: Manager

We deliver the world's most complex projects. Work as part of a collaborative and inclusive team. Enjoy a varied and challenging role.

Building on our past. Ready for the future. Worley is a global professional services company of energy, chemicals and resources experts headquartered in Australia. Right now, we're bridging two worlds as we accelerate to more sustainable energy sources, while helping our customers provide the energy, chemicals and resources that society needs now. We partner with our customers to deliver projects and create value over the life of their portfolio of assets. We solve complex problems by finding integrated data-centric solutions from the first stages of consulting and engineering to installation and commissioning, to the last stages of decommissioning and remediation. Join us and help drive innovation and sustainability in our projects.

The Role
As an EDS Specialist with Worley, you will work closely with our existing team to deliver projects for our clients while continuing to develop your skills and experience.

Duties and responsibilities
- The AVEVA Engineering Senior Administrator is responsible for project set-up, maintenance and support of the system.
- The Senior Administrator shall ensure the set-up, configuration and deliverables are in line with organization/project/client standards.
- Gain a full understanding of the scope, overall schedule, deliverables, milestones and coordination procedure.
- Understand, document and manage the functional requirements (business scope) for an AVEVA Engineering implementation.
- Perform AVEVA Engineering support tasks.
- Perform project implementations including configurations, reports and gateway.
- Suggest how to improve AVEVA Engineering or optimize the implementation.
- Provide advanced support and troubleshooting.
- Continually seek opportunities to increase end-user satisfaction.
- Promote the use of AVEVA Engineering and the value it brings to projects within the organization.

Qualifications:
- Bachelor's degree in Engineering with at least 10 years of experience.
- 5+ years of relevant experience in AVEVA Engineering.
- 5+ years of relevant experience in AVEVA PDMS/E3D administration.
- In-depth working knowledge of configuration and management of AVEVA Engineering, including project administration.
- Fully proficient with the management of Dabacon databases.
- Knowledge of engineering workflow in an EPC environment.
- Strong analytical and problem-solving skills.
- Ability to work in a fast-paced environment.
- Effective oral and written communication skills.
- Experience setting up integration between AVEVA Engineering and other AVEVA and Hexagon design applications.
- A good understanding of the engineering data flow between various engineering applications is a plus.
- Proficient in PML programming.

Good to have:
- Knowledge of writing PML1/2, PML.NET and C# programs, and Visual Basic .NET.
- Previous experience with AVEVA NET.
- Previous experience with AVEVA CAT/SPEC and ERM.

Moving forward together
We're committed to building a diverse, inclusive and respectful workplace where everyone feels they belong, can bring themselves, and are heard. We provide equal employment opportunities to all qualified applicants and employees without regard to age, race, creed, color, religion, sex, national origin, ancestry, disability status, veteran status, sexual orientation, gender identity or expression, genetic information, marital status, citizenship status or any other basis as protected by law. We want our people to be energized and empowered to drive sustainable impact. So, our focus is on a values-inspired culture that unlocks brilliance through belonging, connection and innovation. And we're not just talking about it; we're doing it. We're reskilling our people, leveraging transferable skills, and supporting the transition of our workforce to become experts in today's low carbon energy infrastructure and technology. Whatever your ambition, there's a path for you here. And there's no barrier to your potential career success. Join us to broaden your horizons, explore diverse opportunities, and be part of delivering sustainable change.

Posted 2 months ago

Apply

5 - 8 years

6 - 10 Lacs

Bengaluru

Work from Office

About The Role

Role Purpose: The purpose of this role is to design, test and maintain software programs for operating systems or applications which need to be deployed at a client end, and to ensure they meet 100% quality assurance parameters.

Big Data Developer - Spark, Scala, PySpark (coding and scripting)
Years of Experience: 5 to 12 years
Location: Bangalore
Notice Period: 0 to 30 days

Key Skills:
- Proficient in Spark, Scala, and PySpark coding and scripting (see the ingestion sketch after this listing)
- Fluent in big data engineering development using the Hadoop/Spark ecosystem
- Hands-on experience in Big Data
- Good knowledge of the Hadoop ecosystem
- Knowledge of AWS cloud architecture
- Data ingestion and integration into the Data Lake using Hadoop ecosystem tools such as Sqoop, Spark, Impala, Hive, Oozie, Airflow etc.
- Fluency in Python and/or Scala
- Strong communication skills

2. Perform coding and ensure optimal software/module development
- Determine operational feasibility by evaluating analysis, problem definition, requirements, software development and proposed software.
- Develop and automate processes for software validation by setting up and designing test cases/scenarios/usage cases, and executing these cases.
- Modify software to fix errors, adapt it to new hardware, improve its performance, or upgrade interfaces.
- Analyze information to recommend and plan the installation of new systems or modifications of an existing system.
- Ensure that code is error free and has no bugs or test failures.
- Ensure all issues are raised as per the norms defined for the project/program/account, with clear descriptions and replication patterns.
- Compile timely, comprehensive and accurate documentation and reports as requested.
- Coordinate with the team on daily project status and progress, and document it.
- Provide feedback on usability and serviceability, trace results to quality risk, and report to the concerned stakeholders.

3. Status reporting and customer focus on an ongoing basis with respect to the project and its execution
- Capture all requirements and clarifications from the client for better quality work.
- Take feedback on a regular basis to ensure smooth and on-time delivery.
- Participate in continuing education and training to remain current on best practices, learn new programming languages, and better assist other team members.
- Consult with engineering staff to evaluate software-hardware interfaces and develop specifications and performance requirements.
- Document and demonstrate solutions by developing documentation, flowcharts, layouts, diagrams, charts, code comments and clear code.
- Document all necessary details and reports formally for a proper understanding of the software, from client proposal to implementation.
- Ensure good quality of interaction with the customer with respect to e-mail content, fault report tracking, voice calls, business etiquette etc.
- Respond to customer requests in a timely manner, with no instances of complaints either internally or externally.

Deliverables:
No. | Performance Parameter | Measure
1. | Continuous integration, deployment & monitoring of software | 100% error-free onboarding & implementation, throughput %, adherence to the schedule/release plan
2. | Quality & CSAT | On-time delivery, manage software, troubleshoot queries, customer experience, completion of assigned certifications for skill upgradation
3. | MIS & reporting | 100% on-time MIS & report generation

Mandatory Skills: Python for Insights.
Experience: 5-8 years.

Reinvent your world. We are building a modern Wipro. We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention: of yourself, your career, and your skills. We want to see the constant evolution of our business and our industry. It has always been in our DNA - as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention. Come to Wipro. Realize your ambitions. Applications from people with disabilities are explicitly welcome.
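As a sketch of the Hadoop/Spark data-lake ingestion named in the key skills, here is a minimal PySpark job pulling a relational table over JDBC and registering it in Hive, much as Sqoop would. The connection URL, credentials, and table names are placeholders, and the job assumes a Hive-enabled Spark deployment with the JDBC driver available.

```python
# Hedged sketch: Sqoop-style ingestion via Spark JDBC into a Hive table.
# Connection details and table names are assumptions.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder.appName("jdbc-to-hive-sketch")
    .enableHiveSupport()
    .getOrCreate()
)

customers = (
    spark.read.format("jdbc")
    .option("url", "jdbc:mysql://source-db:3306/sales")
    .option("dbtable", "customers")
    .option("user", "etl_user")
    .option("password", "***")  # in practice, pull from a secrets store
    .load()
)

# register in the lake's Hive metastore so Impala/Hive queries can see it
customers.write.mode("overwrite").saveAsTable("datalake.customers")
```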

Posted 2 months ago

Apply

4 - 9 years

4 - 8 Lacs

Pune

Work from Office

About The Role
- Strong experience with the Informatica PowerCenter ETL tool and relational databases.
- 0.6 to 9 years of experience with the technical analysis, design, development and implementation of data warehousing and data lake solutions.
- Strong SQL programming skills; experience with Teradata is a big plus.
- Strong UNIX shell scripting experience to support data warehousing solutions.
- Good experience with the Hadoop/Hive ecosystem.

Works in the area of Software Engineering, which encompasses the development, maintenance and optimization of software solutions/applications:
1. Applies scientific methods to analyse and solve software engineering problems.
2. Is responsible for the development and application of software engineering practice and knowledge, in research, design, development and maintenance.
3. His/her work requires the exercise of original thought and judgement and the ability to supervise the technical and administrative work of other software engineers.
4. Builds the skills and expertise of his/her software engineering discipline to reach the standard software engineer skill expectations for the applicable role, as defined in Professional Communities.
5. Collaborates and acts as a team player with other software engineers and stakeholders.

Skills (competencies): Verbal Communication

Posted 2 months ago

Apply

6 - 11 years

15 - 30 Lacs

Hyderabad, Gurugram, Bengaluru

Work from Office

Position: Senior AWS Data Engineer
Experience: 6+ years
Locations: Pune, Hyderabad, Gurugram, Bangalore
Notice Period: Immediate to 30 days preferred

Job Description: We are hiring a Senior AWS Data Engineer to join our growing team. The ideal candidate will have deep expertise in AWS data services, strong ETL experience, and a passion for solving complex data problems at scale.

Key Responsibilities:
- Design and develop scalable, high-performance data pipelines in AWS
- Work with services like Glue, Redshift, S3, EMR, Lambda, and Athena (a Glue job skeleton follows this listing)
- Build and optimize ETL processes for both structured and unstructured data
- Collaborate with cross-functional teams to deliver actionable data solutions
- Implement best practices for data quality, security, and cost-efficiency

Required Skills:
- 6+ years in Data Engineering
- 3+ years working with AWS (Glue, S3, Redshift, Lambda, EMR, etc.)
- Proficiency in Python or Scala for data transformation
- Strong SQL skills and experience in performance tuning
- Hands-on experience with Spark or PySpark
- Knowledge of data lake and DWH architecture

Nice to Have:
- Familiarity with Kafka, Kinesis, or real-time data streaming
- Exposure to Terraform or CloudFormation
- Experience with CI/CD tools like Git and Jenkins

How to Apply: Interested candidates can send their resumes to heena.ruchwani@gspann.com
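For context on the Glue work listed above, here is a skeleton of an AWS Glue PySpark job: read from the Glue Data Catalog, apply a simple transformation, and write curated Parquet to S3. The database, table, and bucket names are illustrative assumptions.

```python
# Hedged sketch of an AWS Glue PySpark job. Catalog database, table, and
# bucket names are placeholders, not from the posting.
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# source table is assumed to be crawled into the Glue Data Catalog already
source = glue_context.create_dynamic_frame.from_catalog(
    database="raw_db", table_name="orders"
)

df = source.toDF().dropDuplicates(["order_id"])

df.write.mode("overwrite").parquet("s3://example-curated/orders/")

job.commit()
```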

Posted 2 months ago

Apply

2 - 7 years

6 - 10 Lacs

Bengaluru

Work from Office

Hello Talented Techie!

We provide support in Project Services and Transformation, Digital Solutions and Delivery Management. We offer joint operations and digitalization services for Global Business Services and work closely alongside the entire Shared Services organization. We make efficient use of the possibilities of new technologies such as Business Process Management (BPM) and Robotics as enablers for efficient and effective implementations.

We are looking for a Data Engineer (AWS, Confluent & SnapLogic):
- Data Integration: Integrate data from various Siemens organizations into our data factory, ensuring seamless data flow and real-time data fetching.
- Data Processing: Implement and manage large-scale data processing solutions using AWS Glue, ensuring efficient and reliable data transformation and loading.
- Data Storage: Store and manage data in a large-scale data lake, utilizing Iceberg tables in Snowflake for optimized data storage and retrieval.
- Data Transformation: Apply various data transformations to prepare data for analysis and reporting, ensuring data quality and consistency.
- Data Products: Create and maintain data products that meet the needs of various stakeholders, providing actionable insights and supporting data-driven decision-making.
- Workflow Management: Use Apache Airflow to orchestrate and automate data workflows, ensuring timely and accurate data processing.
- Real-time Data Streaming: Utilize Confluent Kafka for real-time data streaming, ensuring low-latency data integration and processing (a consumer sketch follows this listing).
- ETL Processes: Design and implement ETL processes using SnapLogic, ensuring efficient data extraction, transformation, and loading.
- Monitoring and Logging: Use Splunk for monitoring and logging data processes, ensuring system reliability and performance.

You'd describe yourself as having:
- Experience: 3+ relevant years of experience in data engineering, with a focus on AWS Glue, Iceberg tables, Confluent Kafka, SnapLogic, and Airflow.
- Technical Skills: Proficiency in AWS services, particularly AWS Glue. Experience with Iceberg tables and Snowflake. Knowledge of Confluent Kafka for real-time data streaming. Familiarity with SnapLogic for ETL processes. Experience with Apache Airflow for workflow management. Understanding of Splunk for monitoring and logging.
- Programming Skills: Proficiency in Python, SQL, and other relevant programming languages.
- Data Modeling: Experience with data modeling and database design.
- Problem-Solving: Strong analytical and problem-solving skills, with the ability to troubleshoot and resolve data-related issues.

Preferred Qualities:
- Attention to Detail: Meticulous attention to detail, ensuring data accuracy and quality.
- Communication Skills: Excellent communication skills, with the ability to collaborate effectively with cross-functional teams.
- Adaptability: Ability to adapt to changing technologies and work in a fast-paced environment.
- Team Player: Strong team player with a collaborative mindset.
- Continuous Learning: Eagerness to learn and stay updated with the latest trends and technologies in data engineering.

Create a better #TomorrowWithUs! This role, based in Bangalore, is an individual contributor position. You may be required to visit other locations within India and internationally. In return, you'll have the opportunity to work with teams shaping the future. At Siemens, we are a collection of over 312,000 minds building the future, one day at a time, worldwide. We value your unique identity and perspective and are fully committed to providing equitable opportunities and building a workplace that reflects the diversity of society. Come bring your authentic self and create a better tomorrow with us. Find out more about Siemens careers at: www.siemens.com/careers
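A minimal consumer sketch using the Confluent Kafka Python client, matching the real-time streaming piece of this role. The broker address, consumer group, and topic name are placeholders, and the sketch assumes messages carry JSON payloads.

```python
# Hedged sketch: consume JSON events with the confluent-kafka Python client.
# Broker, group, and topic names are assumptions, not from the posting.
import json

from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": "broker:9092",
    "group.id": "data-factory-ingest",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["org-events"])

try:
    while True:
        msg = consumer.poll(timeout=1.0)
        if msg is None:
            continue
        if msg.error():
            print(f"consumer error: {msg.error()}")
            continue
        event = json.loads(msg.value())
        # hand the event off to the transformation layer here
        print(event)
finally:
    consumer.close()  # commit offsets and leave the group cleanly
```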

Posted 2 months ago

Apply