3.0 - 8.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Notice period: 30 days to immediate. Role description (keywords: GCP, Python, Apache Beam): 3 to 8 years of overall IT experience, including hands-on experience in Big Data technologies. Mandatory: hands-on experience in Python and PySpark. Python as a language is practically usable for anything; we are looking for application development, Extract-Transform-Load (ETL), and data lake curation experience using Python. Build PySpark applications using Spark DataFrames in Python, using Jupyter Notebook and the PyCharm IDE. Worked on optimizing Spark jobs that process huge volumes of data. Hands-on experience with version control tools like Git. Worked on Amazon's analytics services such as Amazon EMR, Amazon Athena, and AWS Glue. Worked on Amazon's compute services such as AWS Lambda and Amazon EC2, storage services such as S3, and a few other services such as SNS. Experience or knowledge of bash shell scripting is a plus. Has built ETL processes to copy and structurally transform data across a wide variety of formats, including CSV, TSV, XML, and JSON. Experience working with fixed-width, delimited, and multi-record file formats. Good to have knowledge of data warehousing concepts (dimensions, facts, schemas, snowflake, star). Has worked with columnar storage formats (Parquet, Avro, ORC) and is well versed in compression techniques (Snappy, Gzip). Good to have knowledge of at least one AWS database (Aurora, RDS, Redshift, ElastiCache, DynamoDB). Mandatory Skills: GCP, Apache Spark, Python, SparkSQL, Big Data Hadoop Ecosystem.
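For illustration only, here is a minimal PySpark sketch of the CSV-to-Parquet curation work this posting describes; the file paths, column names, and filter are hypothetical placeholders, not part of the original listing.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_curation").getOrCreate()

# Hypothetical raw extract; in practice this would be an S3 or data-lake path.
raw = (spark.read
            .option("header", True)
            .option("inferSchema", True)
            .csv("/tmp/raw/orders.csv"))

# Light curation: keep completed orders and derive a partition column.
curated = (raw.filter(F.col("order_status") == "COMPLETE")
              .withColumn("order_date", F.to_date("order_ts")))

# Columnar output with Snappy compression, ready for Athena/Glue-style querying.
(curated.write
        .mode("overwrite")
        .option("compression", "snappy")
        .partitionBy("order_date")
        .parquet("/tmp/curated/orders/"))
```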
Posted 9 hours ago
0 years
0 Lacs
Hyderabad, Telangana, India
Remote
Work Level: Individual. Core: Responsible. Leadership: Team Alignment. Industry Type: Information Technology. Function: Database Administrator. Key Skills: mSQL, SQL Writing, PL/SQL. Education: Graduate. Note: This is a requirement for one of Workassist's Hiring Partners. 🎯 Role Overview: This is a remote position. Write, optimize, and maintain SQL queries, stored procedures, and functions. Assist in designing and managing relational databases. Perform data extraction, transformation, and loading (ETL) tasks. Ensure database integrity, security, and performance. Work with developers to integrate databases into applications. Support data analysis and reporting by writing complex queries. Document database structures, processes, and best practices. Company Description: Workassist is an online recruitment and employment solution platform based in Lucknow, India. We provide relevant profiles to employers and connect job seekers with the best opportunities across various industries. With a network of over 10,000 recruiters, we help employers recruit talented individuals from sectors such as Banking & Finance, Consulting, Sales & Marketing, HR, IT, Operations, and Legal. We have adapted to the new normal and strive to provide a seamless job search experience for job seekers worldwide. Our goal is to enhance the job-seeking experience by leveraging technology and matching job seekers with the right employers. For a seamless job search experience, visit our website: https://bit.ly/3QBfBU2 (Note: there are many more opportunities on the portal; depending on your skills, you can apply for them as well). So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
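As a rough illustration of the query-writing work in this role, here is a small, self-contained Python/sqlite3 sketch; the schema, figures, and index are invented for the example, and the production database would of course differ.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE customers (customer_id INTEGER PRIMARY KEY, customer_name TEXT);
CREATE TABLE orders (order_id INTEGER PRIMARY KEY, customer_id INTEGER,
                     order_date TEXT, order_total REAL);
INSERT INTO customers VALUES (1, 'Acme'), (2, 'Globex');
INSERT INTO orders VALUES (10, 1, '2024-03-01', 120.0), (11, 1, '2024-04-02', 80.0);
CREATE INDEX idx_orders_customer ON orders(customer_id);  -- supports the join below
""")

# Spend per customer; the date filter lives in the ON clause so customers with
# no qualifying orders are still returned by the LEFT JOIN.
QUERY = """
SELECT c.customer_id,
       c.customer_name,
       COUNT(o.order_id)               AS order_count,
       COALESCE(SUM(o.order_total), 0) AS total_spend
FROM customers AS c
LEFT JOIN orders AS o
       ON o.customer_id = c.customer_id
      AND o.order_date >= :since
GROUP BY c.customer_id, c.customer_name
ORDER BY total_spend DESC;
"""
for row in conn.execute(QUERY, {"since": "2024-01-01"}):
    print(row)
conn.close()
```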
Posted 9 hours ago
2.0 years
0 Lacs
India
Remote
Title: Power BI Developer. Experience: 2+ years. Notice Period: 15 days or less. Location: Remote. Job Description: 2+ years of experience with a data analysis background; hands-on expertise in Power BI; Microsoft Azure experience; experience with any one Microsoft-based data migration/ETL technology. Please DO NOT apply if your profile does not meet the job description or required qualifications; irrelevant applications will not be considered. Share this opportunity to help it reach more job seekers! © Allime Tech Solutions Pvt. Ltd. All rights reserved. About Us: At Allime Tech Solutions, we believe in empowering innovation through technology. Our mission is to connect talent with opportunity, creating a future where everyone can thrive. Driven by integrity and excellence, we are committed to providing tailored solutions for our clients.
Posted 9 hours ago
4.0 - 7.0 years
5 - 8 Lacs
Hyderābād
On-site
About NationsBenefits: At NationsBenefits, we are leading the transformation of the insurance industry by developing innovative benefits management solutions. We focus on modernizing complex back-office systems to create scalable, secure, and high-performing platforms that streamline operations for our clients. As part of our strategic growth, we are focused on platform modernization: transitioning legacy systems to modern, cloud-native architectures that support the scalability, reliability, and high performance of core back-office functions in the insurance domain. Position Overview: We are seeking a self-driven Data Engineer with 4–7 years of experience to build and optimize scalable ETL/ELT pipelines using Azure Databricks, PySpark, and Delta Lake. The role involves working across scrum teams to develop data solutions, ensure data governance with Unity Catalog, and support real-time and batch processing. Strong problem-solving skills, T-SQL expertise, and hands-on experience with Azure cloud tools are essential. Healthcare domain knowledge is a plus. Job Description: Work with different scrum teams to develop all the quality database programming requirements of the sprint. Experience with the Azure cloud platform and related technologies: advanced Python programming, Databricks, Azure SQL, Data Factory (ADF), Data Lake, data storage, SSIS. Create and deploy scalable ETL/ELT pipelines with Azure Databricks using PySpark and SQL. Create Delta Lake tables with ACID transactions and schema evolution to support real-time and batch processing. Experience with Unity Catalog for centralized data governance, access control, and data lineage tracking. Independently analyse, solve, and correct issues in real time, providing end-to-end problem resolution. Develop unit tests so changes can be tested automatically. Use SOLID development principles to maintain data integrity and cohesiveness. Interact with the product owner and business representatives to determine and satisfy needs. Sense of ownership and pride in your performance and its impact on the company's success. Critical thinking and problem-solving skills. Team player. Good time-management skills. Great interpersonal and communication skills. Mandatory Qualifications: 4-7 years of experience as a Data Engineer. Self-driven with minimal supervision. Proven experience with T-SQL programming, Azure Databricks, Spark (PySpark/Scala), Delta Lake, Unity Catalog, ADLS Gen2; exposure to Microsoft TFS, Visual Studio, and DevOps. Experience with cloud platforms such as Azure (or equivalent). Analytical, problem-solving mindset. Preferred Qualifications: Healthcare domain knowledge.
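As a hedged sketch of the Delta Lake responsibility above (not NationsBenefits' actual code), the following PySpark snippet appends a batch to a Delta table with ACID guarantees and additive schema evolution; the table and column names are illustrative, and it assumes a Databricks or Delta-enabled Spark environment.

```python
from pyspark.sql import SparkSession

# On Databricks the `spark` session already exists; builder shown for completeness.
spark = SparkSession.builder.getOrCreate()

batch = spark.createDataFrame(
    [(1, "2024-01-01", 120.0), (2, "2024-01-02", 80.5)],
    ["claim_id", "claim_date", "claim_amount"],
)

# Delta gives ACID appends; mergeSchema permits additive schema evolution when
# later batches arrive with extra columns.
(batch.write.format("delta")
      .mode("append")
      .option("mergeSchema", "true")
      .saveAsTable("curated.claims"))  # hypothetical catalog/schema/table name
```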
Posted 9 hours ago
12.0 years
5 - 10 Lacs
Hyderābād
On-site
Join Amgen’s Mission of Serving Patients. At Amgen, if you feel like you’re part of something bigger, it’s because you are. Our shared mission—to serve patients living with serious illnesses—drives all that we do. Since 1980, we’ve helped pioneer the world of biotech in our fight against the world’s toughest diseases. With our focus on four therapeutic areas – Oncology, Inflammation, General Medicine, and Rare Disease – we reach millions of patients each year. As a member of the Amgen team, you’ll help make a lasting impact on the lives of patients as we research, manufacture, and deliver innovative medicines to help people live longer, fuller, happier lives. Our award-winning culture is collaborative, innovative, and science-based. If you have a passion for challenges and the opportunities that lie within them, you’ll thrive as part of the Amgen team. Join us and transform the lives of patients while transforming your career. [Senior Manager, Software Development Engineering] What you will do: Let’s do this. Let’s change the world. In this vital role you will be responsible for designing, developing, and maintaining software applications and solutions that meet business needs, and for ensuring the availability and performance of critical systems and applications. This role involves working closely with product managers, designers, and other engineers to create high-quality, scalable software solutions, as well as automating operations, monitoring system health, and responding to incidents to minimize downtime. Roles & Responsibilities: Provide technical leadership to enhance the culture of innovation, automation, and solving difficult scientific and business challenges; technical leadership includes providing vision and direction to develop scalable, reliable solutions. Provide leadership to select right-sized and appropriate tools and architectures based on requirements, data source format, and current technologies. Develop, refactor, research, and improve Weave cloud platform capabilities. Understand business drivers and technical needs so our cloud services seamlessly, automatically, and securely provide the best service. Develop data flow pipelines to extract, transform, and load data from various data sources in various forms, including custom ETL pipelines that enable model and product development. Build strong partnerships with stakeholders. Build data products and service processes which perform data transformation, metadata extraction, workload management, and error processing management to ensure high-quality data. Provide clear, integrated documentation for delivered solutions and processes. Collaborate with business partners to understand user stories and ensure the technical solution/build can deliver on those needs. Work with multi-functional teams to design and document effective and efficient solutions. Develop change management strategies and assist in their implementation. Mentor junior data engineers on standard methodologies in the industry and in the Amgen data landscape. What we expect of you: We are all different, yet we all use our unique contributions to serve patients. Basic Qualifications and Experience: Doctorate degree / Master’s degree / Bachelor’s degree and 12 to 17 years of Computer Science, IT, or related field experience. Preferred Skills: Must-Have Skills: Superb communication and interpersonal skills, with the ability to work cross-functionally with multi-functional GTM, product, and engineering teams.
Minimum of 10+ years of overall Software Engineer or Cloud Architect experience. Minimum 3+ years in an architecture role using public cloud solutions such as AWS. Experience with the AWS technology stack. Good-to-Have Skills: Familiarity with big data technologies, AI platforms, and cloud-based data solutions. Ability to work effectively across matrixed organizations and lead collaboration between data and AI teams. Passion for technology and customer success, particularly in driving innovative AI and data solutions. Experience working with teams of data scientists, software engineers, and business experts to drive insights. Experience with AWS services such as EC2, S3, Redshift/Spectrum, Glue, Athena, RDS, Lambda, and API Gateway. Experience with big data technologies (Hadoop, Hive, HBase, Pig, Spark, etc.). Good understanding of relevant data standards and industry trends. Ability to understand new business requirements and prioritize them for delivery. Experience working in the biopharma/life sciences industry. Proficiency in one of the coding languages (Python, Java, Scala). Hands-on experience writing SQL using any RDBMS (Redshift, Postgres, MySQL, Teradata, Oracle, etc.). Experience with schema design and dimensional data modeling. Experience with software DevOps CI/CD tools, such as Git, Jenkins, Linux, and shell scripting. Hands-on experience using Databricks/Jupyter or a similar notebook environment. Experience working with GxP systems. Experience working in an agile environment (e.g., user stories, iterative development). Experience working with test-driven development and software test automation. Experience working in a product environment. Good overall understanding of business, manufacturing, and laboratory systems common in the pharmaceutical industry, as well as the integration of these systems through applicable standards. Soft Skills: Excellent analytical and troubleshooting skills. Ability to work effectively with global, virtual teams. High degree of initiative and self-motivation. Ability to handle multiple priorities successfully. Team-oriented, with a focus on achieving team goals. What you can expect of us: As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we’ll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards. Apply now and make a lasting impact with the Amgen team. careers.amgen.com As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed, and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease. Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
Posted 9 hours ago
0 years
6 - 8 Lacs
Hyderābād
On-site
DESCRIPTION Amazon Business Customer Support (ABCS) is looking for a Business Intelligence Engineer to help build next-generation metrics and drive business analytics that have measurable impact. The successful candidate will have a strong understanding of different businesses and customer profiles and the underlying analytics, and the ability to translate business requirements into analysis, collect and analyze data, and make recommendations back to the business. BIEs also continuously learn new systems, tools, and industry best practices to help design new studies and build new tools that help our team automate and accelerate analytics. As a Business Intelligence Engineer, you will develop strategic reports, design UIs, and drive projects to support ABCS decision-making. This role is inherently cross-functional — you will work closely with finance teams, engineering, and leadership across Amazon Business Customer Service. A successful candidate will be a self-starter, comfortable with ambiguity, able to think big and be creative (while still paying careful attention to detail). You should be skilled in database design, be comfortable dealing with large and complex data sets, and have experience building self-service dashboards and using visualization tools, especially Tableau. You should have strong analytical and communication skills. You will work with a team of analytics professionals who are passionate about using machine learning to build automated systems and solve problems that matter to our customers. Your work will directly impact our customers and operations. Members of this team will be challenged to innovate using the latest big data techniques. We are looking for people who are motivated by thinking big, moving fast, and exploring business insights. If you love to implement solutions to complex problems while working hard, having fun, and making history, this may be the opportunity for you. Own the design, development, and maintenance of ongoing metrics, reports, analyses, and dashboards on the key drivers of our business. Key job responsibilities: Scope, design, and build database structures and schemas. Create data pipelines using ETL connections/SQL queries. Retrieve and analyze data using a broad set of Amazon's data technologies. Pull data on an ad-hoc basis using SQL queries. Design, build, and maintain automated reporting and dashboards. Conduct deep dives to identify root causes of pain points and opportunities for improvement. Become a subject matter expert in ABCS data and support team members in diving deep. Work closely with CSBI teams to ensure ABCS uses globally aligned standard metrics and definitions. Collaborate with finance and business teams to gather data and metrics requirements. A day in the life: We thrive on solving challenging problems to innovate for our customers. By pushing the boundaries of technology, we create unparalleled experiences that enable us to rapidly adapt in a dynamic environment. If you are not sure that every qualification on the list above describes you exactly, we'd still love to hear from you! At Amazon, we value people with unique backgrounds, experiences, and skillsets. If you’re passionate about this role and want to make an impact on a global scale, please apply!
BASIC QUALIFICATIONS Experience with data visualization using Tableau, Quicksight, or similar tools Experience with data modeling, warehousing and building ETL pipelines Experience writing complex SQL queries Experience using SQL to pull data from a database or data warehouse and scripting experience (Python) to process data for modeling PREFERRED QUALIFICATIONS Experience with AWS solutions such as EC2, DynamoDB, S3, and Redshift Experience in data mining, ETL, etc. and using databases in a business environment with large-scale, complex datasets Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.
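To illustrate the basic qualification above about using SQL to pull data and Python to process it, here is a small, self-contained sketch; it uses an in-memory SQLite database as a stand-in for a warehouse such as Redshift or Athena, and the table, queue names, and metric are invented for the example.

```python
import pandas as pd
import sqlalchemy

# In-memory stand-in for the warehouse (Redshift/Athena in practice).
engine = sqlalchemy.create_engine("sqlite://")
pd.DataFrame({
    "contact_date": ["2024-01-01", "2024-01-02", "2024-01-08"],
    "queue": ["email", "email", "chat"],
    "handle_time_sec": [420, 380, 300],
}).to_sql("customer_contacts", engine, index=False)

# Pull with SQL, then post-process in Python for a dashboard or model input.
contacts = pd.read_sql(
    "SELECT contact_date, queue, handle_time_sec FROM customer_contacts", engine
)
contacts["contact_date"] = pd.to_datetime(contacts["contact_date"])
weekly = (contacts.set_index("contact_date")
                  .groupby("queue")["handle_time_sec"]
                  .resample("W").mean()
                  .reset_index())
print(weekly)
```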
Posted 9 hours ago
3.0 years
0 Lacs
Hyderābād
On-site
DESCRIPTION AOP team within Amazon Transportation is looking for an innovative, hands-on and customer-obsessed Business Intelligence Engineer for Analytics team. Candidate must be detail oriented, have superior verbal and written communication skills, strong organizational skills, excellent technical skills and should be able to juggle multiple tasks at once. Ideal candidate must be able to identify problems before they happen and implement solutions that detect and prevent outages. The candidate must be able to accurately prioritize projects, make sound judgments, work to improve the customer experience and get the right things done. This job requires you to constantly hit the ground running and have the ability to learn quickly. Primary responsibilities include defining the problem and building analytical frameworks to help the operations to streamline the process, identifying gaps in the existing process by analyzing data and liaising with relevant team(s) to plug it and analyzing data and metrics and sharing update with the internal teams. Key job responsibilities 1) Apply multi-domain/process expertise in day to day activities and own end to end roadmap. 2) Translate complex or ambiguous business problem statements into analysis requirements and maintain high bar throughout the execution. 3) Define analytical approach; review and vet analytical approach with stakeholders. 4) Proactively and independently work with stakeholders to construct use cases and associated standardized outputs 5) Scale data processes and reports; write queries that clients can update themselves; lead work with data engineering for full-scale automation 6) Have a working knowledge of the data available or needed by the wider business for more complex or comparative analysis 7) Work with a variety of data sources and Pull data using efficient query development that requires less post processing (e.g., Window functions, virt usage; a short PySpark sketch of this pattern follows this listing) 8) When needed, pull data from multiple similar sources to triangulate on data fidelity 9) Actively manage the timeline and deliverables of projects, focusing on interactions in the team 10) Provide program communications to stakeholders 11) Communicate roadblocks to stakeholders and propose solutions 12) Represent team on medium-size analytical projects in own organization and effectively communicate across teams A day in the life 1) Solve ambiguous analyses with less well-defined inputs and outputs; drive to the heart of the problem and identify root causes 2) Have the capability to handle large data sets in analysis through the use of additional tools 3) Derive recommendations from analysis that significantly impact a department, create new processes, or change existing processes 4) Understand the basics of test and control comparison; may provide insights through basic statistical measures such as hypothesis testing 5) Identify and implement optimal communication mechanisms based on the data set and the stakeholders involved 6) Communicate complex analytical insights and business implications effectively About the team AOP (Analytics Operations and Programs) team is missioned to standardize BI and analytics capabilities, and reduce repeat analytics/reporting/BI workload for operations across IN, AU, BR, MX, SG, AE, EG, SA marketplace. AOP is responsible to provide visibility on operations performance and implement programs to improve network efficiency and defect reduction.
The team has a diverse mix of strong engineers, Analysts and Scientists who champion customer obsession. We enable operations to make data-driven decisions through developing near real-time dashboards, self-serve dive-deep capabilities and building advanced analytics capabilities. We identify and implement data-driven metric improvement programs in collaboration (co-owning) with Operations teams. BASIC QUALIFICATIONS 3+ years of analyzing and interpreting data with Redshift, Oracle, NoSQL etc. experience Experience with data visualization using Tableau, Quicksight, or similar tools Experience with data modeling, warehousing and building ETL pipelines Experience in Statistical Analysis packages such as R, SAS and Matlab Experience using SQL to pull data from a database or data warehouse and scripting experience (Python) to process data for modeling PREFERRED QUALIFICATIONS Experience with AWS solutions such as EC2, DynamoDB, S3, and Redshift Experience in data mining, ETL, etc. and using databases in a business environment with large-scale, complex datasets Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.
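As referenced in responsibility 7 above, efficient queries (for example, window functions) can return extracts that need no further post-processing; below is a minimal PySpark illustration of that pattern, with hypothetical shipment-event data and column names rather than any real Amazon dataset.

```python
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.window import Window

spark = SparkSession.builder.getOrCreate()

events = spark.createDataFrame(
    [("PKG1", "2024-05-01 08:00", "PICKED"),
     ("PKG1", "2024-05-02 10:00", "DELIVERED"),
     ("PKG2", "2024-05-01 09:30", "PICKED")],
    ["tracking_id", "event_ts", "status"],
)

# Latest status per package is computed inside the query with a window function,
# so the extract needs no second post-processing pass.
w = Window.partitionBy("tracking_id").orderBy(F.col("event_ts").desc())
latest = (events.withColumn("rn", F.row_number().over(w))
                .filter(F.col("rn") == 1)
                .drop("rn"))
latest.show()
```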
Posted 9 hours ago
5.0 years
0 Lacs
Hyderābād
On-site
Job Description Overview We are seeking a skilled Associate Manager – AIOps & MLOps Operations to support and enhance the automation, scalability, and reliability of AI/ML operations across the enterprise. This role requires a solid understanding of AI-driven observability, machine learning pipeline automation, cloud-based AI/ML platforms, and operational excellence. The ideal candidate will assist in deploying AI/ML models, ensuring continuous monitoring, and implementing self-healing automation to improve system performance, minimize downtime, and enhance decision-making with real-time AI-driven insights. Support and maintain AIOps and MLOps programs, ensuring alignment with business objectives, data governance standards, and enterprise data strategy. Assist in implementing real-time data observability, monitoring, and automation frameworks to enhance data reliability, quality, and operational efficiency. Contribute to developing governance models and execution roadmaps to drive efficiency across data platforms, including Azure, AWS, GCP, and on-prem environments. Ensure seamless integration of CI/CD pipelines, data pipeline automation, and self-healing capabilities across the enterprise. Collaborate with cross-functional teams to support the development and enhancement of next-generation Data & Analytics (D&A) platforms. Assist in managing the people, processes, and technology involved in sustaining Data & Analytics platforms, driving operational excellence and continuous improvement. Support Data & Analytics Technology Transformations by ensuring proactive issue identification and the automation of self-healing capabilities across the PepsiCo Data Estate. Responsibilities Support the implementation of AIOps strategies for automating IT operations using Azure Monitor, Azure Log Analytics, and AI-driven alerting. Assist in deploying Azure-based observability solutions (Azure Monitor, Application Insights, Azure Synapse for log analytics, and Azure Data Explorer) to enhance real-time system performance monitoring. Enable AI-driven anomaly detection and root cause analysis (RCA) by collaborating with data science teams using Azure Machine Learning (Azure ML) and AI-powered log analytics. Contribute to developing self-healing and auto-remediation mechanisms using Azure Logic Apps, Azure Functions, and Power Automate to proactively resolve system issues. Support ML lifecycle automation using Azure ML, Azure DevOps, and Azure Pipelines for CI/CD of ML models. Assist in deploying scalable ML models with Azure Kubernetes Service (AKS), Azure Machine Learning Compute, and Azure Container Instances. Automate feature engineering, model versioning, and drift detection using Azure ML Pipelines and MLflow. Optimize ML workflows with Azure Data Factory, Azure Databricks, and Azure Synapse Analytics for data preparation and ETL/ELT automation. Implement basic monitoring and explainability for ML models using Azure Responsible AI Dashboard and InterpretML. Collaborate with Data Science, DevOps, CloudOps, and SRE teams to align AIOps/MLOps strategies with enterprise IT goals. Work closely with business stakeholders and IT leadership to implement AI-driven insights and automation to enhance operational decision-making. Track and report AI/ML operational KPIs, such as model accuracy, latency, and infrastructure efficiency. Assist in coordinating with cross-functional teams to maintain system performance and ensure operational resilience. 
Support the implementation of AI ethics, bias mitigation, and responsible AI practices using Azure Responsible AI Toolkits. Ensure adherence to Azure Information Protection (AIP), Role-Based Access Control (RBAC), and data security policies. Assist in developing risk management strategies for AI-driven operational automation in Azure environments. Prepare and present program updates, risk assessments, and AIOps/MLOps maturity progress to stakeholders as needed. Support efforts to attract and build a diverse, high-performing team to meet current and future business objectives. Help remove barriers to agility and enable the team to adapt quickly to shifting priorities without losing productivity. Contribute to developing the appropriate organizational structure, resource plans, and culture to support business goals. Leverage technical and operational expertise in cloud and high-performance computing to understand business requirements and earn trust with stakeholders. Qualifications 5+ years of technology work experience in a global organization, preferably in CPG or a similar industry. 5+ years of experience in the Data & Analytics field, with exposure to AI/ML operations and cloud-based platforms. 5+ years of experience working within cross-functional IT or data operations teams. 2+ years of experience in a leadership or team coordination role within an operational or support environment. Experience in AI/ML pipeline operations, observability, and automation across platforms such as Azure, AWS, and GCP. Excellent Communication: Ability to convey technical concepts to diverse audiences and empathize with stakeholders while maintaining confidence. Customer-Centric Approach: Strong focus on delivering the right customer experience by advocating for customer needs and ensuring issue resolution. Problem Ownership & Accountability: Proactive mindset to take ownership, drive outcomes, and ensure customer satisfaction. Growth Mindset: Willingness and ability to adapt and learn new technologies and methodologies in a fast-paced, evolving environment. Operational Excellence: Experience in managing and improving large-scale operational services with a focus on scalability and reliability. Site Reliability & Automation: Understanding of SRE principles, automated remediation, and operational efficiencies. Cross-Functional Collaboration: Ability to build strong relationships with internal and external stakeholders through trust and collaboration. Familiarity with CI/CD processes, data pipeline management, and self-healing automation frameworks. Strong understanding of data acquisition, data catalogs, data standards, and data management tools. Knowledge of master data management concepts, data governance, and analytics.
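The responsibilities above call for ML lifecycle automation with MLflow (metric tracking, model versioning, and drift monitoring); the following is a minimal, hedged sketch of experiment tracking and model logging, using an invented experiment name and a toy scikit-learn model rather than any PepsiCo workload.

```python
import mlflow
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1000, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

mlflow.set_experiment("ops-anomaly-classifier")  # hypothetical experiment name
with mlflow.start_run():
    model = RandomForestClassifier(n_estimators=100, random_state=42)
    model.fit(X_train, y_train)
    acc = accuracy_score(y_test, model.predict(X_test))
    mlflow.log_param("n_estimators", 100)
    mlflow.log_metric("accuracy", acc)                      # the kind of KPI the role tracks
    mlflow.sklearn.log_model(model, artifact_path="model")  # versioned artifact for later deployment
```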
Posted 9 hours ago
9.0 years
7 - 9 Lacs
Hyderābād
Remote
Job Description Overview: The primary focus is to lead development work within the Azure Data Lake environment and other related ETL technologies, with responsibility for on-time and on-budget delivery, satisfying project requirements while adhering to enterprise architecture standards. The role will lead key data lake projects and resources, including innovation-related initiatives (e.g. adoption of technologies like Databricks, Presto, Denodo, Python, Azure Data Factory; database encryption; enabling rapid experimentation). This role will also have L3 and release management responsibilities for ETL processes. Responsibilities: Lead delivery of key Enterprise Data Warehouse and Azure Data Lake projects within time and budget. Drive solution design and build to ensure scalability, performance, and reuse of data and other components. Ensure on-time and on-budget delivery which satisfies project requirements, while adhering to enterprise architecture standards. Manage work intake, prioritization, and release timing, balancing demand and available resources. Ensure tactical initiatives are aligned with the strategic vision and business needs. Oversee coordination and partnerships with Business Relationship Managers, Architecture, and IT services teams to develop and maintain EDW and data lake best practices and standards, along with appropriate quality assurance policies and procedures. May lead a team of employee and contract resources to meet build requirements: set priorities for the team to ensure task completion, coordinate work activities with other IT services and business teams, and hold the team accountable for milestone deliverables. Provide L3 support for existing applications. Release management. Qualifications and Experience: Bachelor’s degree in Computer Science, MIS, Business Management, or related field. 9+ years’ experience in Information Technology or Business Relationship Management. 5+ years’ experience in Data Warehouse/Azure Data Lake. 3 years’ experience in Azure Data Lake. 2 years’ experience in project management. Technical Skills: Thorough knowledge of data warehousing / data lake concepts. Hands-on experience with tools like Azure Data Factory, Databricks, PySpark, and other data management tools on Azure. Proven experience in managing Data, BI, or Analytics projects. Solutions delivery experience: expertise in the system development lifecycle, integration, and sustainability. Experience in data modeling or databases. Non-Technical Skills: Excellent remote collaboration skills. Experience working in a matrix organization with diverse priorities. Experience dealing with and managing multiple vendors. Exceptional written and verbal communication skills along with collaboration and listening skills. Ability to work with agile delivery methodologies. Ability to ideate requirements and design iteratively with business partners without formal requirements documentation. Ability to budget resources and funding to meet project deliverables.
Posted 9 hours ago
12.0 - 16.0 years
2 - 9 Lacs
Hyderābād
On-site
Job description: Some careers shine brighter than others. If you’re looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further. HSBC is one of the largest banking and financial services organizations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realize their ambitions. We are currently seeking an experienced professional to join our team in the role of Consultant Specialist, with 12 - 16 years of experience and the below requirements and skills: Advanced SQL Development: Write complex SQL queries for data extraction, transformation, and analysis. Optimize SQL queries for performance and scalability. SQL Tuning and Joins: Analyze and improve query performance. Deep understanding of joins, indexing, and query execution plans. GCP BigQuery and GCS: Work with Google BigQuery for data warehousing and analytics. Manage and integrate data using Google Cloud Storage (GCS). Airflow DAG Development: Design, develop, and maintain workflows using Apache Airflow. Write custom DAGs to automate data pipelines and processes. Python Programming: Develop and maintain Python scripts for data processing and automation. Debug and optimize Python code for performance and reliability. Shell Scripting: Write and debug basic shell scripts for automation and system tasks. Continuous Learning: Stay updated with the latest tools and technologies in data engineering. Demonstrate a strong ability and attitude to learn and adapt quickly. Communication: Collaborate effectively with cross-functional teams. Clearly communicate technical concepts to both technical and non-technical stakeholders. Requirements: To be successful in this role, you should meet the following requirements: Advanced SQL writing and query optimization. Strong understanding of SQL tuning, joins, and indexing. Hands-on experience with GCP services, especially BigQuery and GCS. Proficiency in Python programming and debugging. Experience with Apache Airflow and DAG development. Basic knowledge of shell scripting. Excellent problem-solving skills and a growth mindset. Strong verbal and written communication skills. Experience with data pipeline orchestration and ETL processes. Familiarity with other GCP services like Dataflow or Pub/Sub. Knowledge of CI/CD pipelines and version control (e.g., Git). You’ll achieve more when you join HSBC. www.hsbc.com/careers HSBC is committed to building a culture where all employees are valued and respected and where opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working and opportunities to grow within an inclusive and diverse environment. Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website. Issued by – HSBC Software Development India
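As an illustration of the Airflow DAG development this role describes (not HSBC's actual pipeline), here is a minimal DAG skeleton; it assumes Airflow 2.4+, and the DAG id, schedule, and task callables are placeholders.

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_to_gcs(**context):
    # Placeholder: pull data from a source system and land it in GCS.
    pass


def load_to_bigquery(**context):
    # Placeholder: load the landed files into a BigQuery table.
    pass


with DAG(
    dag_id="daily_gcs_to_bigquery",   # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                # `schedule` supersedes schedule_interval in Airflow 2.4+
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
) as dag:
    extract = PythonOperator(task_id="extract_to_gcs", python_callable=extract_to_gcs)
    load = PythonOperator(task_id="load_to_bigquery", python_callable=load_to_bigquery)
    extract >> load
```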
Posted 9 hours ago
7.0 - 9.0 years
2 - 6 Lacs
Hyderābād
On-site
Job Description Overview This role is designed for an experienced Business Analyst who will play a pivotal part in driving data-driven decision-making and process optimization for the North America Data Product Management team. The ideal candidate will combine advanced analytics skills, deep SQL expertise, and practical data engineering knowledge with a strong understanding of the FMCG domain. You will work cross-functionally to transform business requirements into actionable insights and scalable solutions, supporting both strategic and operational objectives. Responsibilities Business Process Analysis & Optimization Analyze existing business processes, identify improvement opportunities, and recommend solutions that enhance efficiency, reduce costs, and drive growth within the beverages sector. Collaborate with stakeholders to map and document end-to-end business processes and data flows. Data Analysis & Reporting Design, write, and optimize complex SQL queries to extract, manipulate, and analyze large datasets from multiple sources. Develop and maintain dashboards, reports, and KPIs that provide actionable insights to business leaders and operational teams. Requirements Gathering & Solution Design Engage with business stakeholders to gather, document, and prioritize business and functional requirements for analytics, reporting, and data engineering projects. Translate business needs into technical specifications for development teams, ensuring alignment with business goals. Data Engineering Support Work closely with data engineering teams to support the design, development, and maintenance of robust data pipelines and data models. Participate in data migration, integration, and transformation projects, ensuring data quality and integrity throughout. Domain Expertise & Stakeholder Engagement Leverage deep domain knowledge of the beverages industry to provide context for data analysis, interpret trends, and recommend relevant business actions. Act as a trusted advisor to business partners, fostering strong relationships and ensuring solutions are tailored to sector needs. Continuous Improvement & Innovation Stay up to date with industry trends, best practices, and new technologies in analytics, data engineering, and the beverages sector. Proactively identify and champion opportunities for process automation, digitalization, and innovation. Qualifications Education: Bachelor’s or Master’s degree in Business, Computer Science, Engineering, Statistics, or a related field. Experience: 7–9 years in business analysis, data analytics, or a related field within the consumer goods, beverages, or FMCG industry. SQL Expertise: Advanced proficiency in SQL for data extraction, manipulation, and analysis. Data Engineering: Experience working with data pipelines, ETL processes, and data modeling (hands-on or in close partnership with data engineering teams). Domain Knowledge: Strong understanding of the beverages industry, including market dynamics, supply chain, sales, and marketing operations. Analytical Thinking: Ability to synthesize complex data from multiple sources, identify trends, and provide clear, actionable recommendations. Communication: Excellent written and verbal communication skills; able to translate technical concepts for non-technical stakeholders and vice versa. Stakeholder Management: Proven ability to work cross-functionally, manage multiple priorities, and build strong relationships with business and technical teams.
Problem-Solving: Solution-oriented mindset with a track record of driving process improvements and delivering business value. Preferred Qualifications Experience with data visualization tools (e.g., Power BI, Tableau). Familiarity with cloud data platforms (e.g., Azure, AWS, GCP). Knowledge of Python or R for data analysis (a plus). Previous experience in a data product or digital transformation environment.
Posted 9 hours ago
4.0 years
0 Lacs
Hyderābād
On-site
DESCRIPTION Amazon Transportation team is looking for an innovative, hands-on and customer-obsessed Business Analyst for Analytics team. Candidate must be detail oriented, have superior verbal and written communication skills, strong organizational skills, excellent technical skills and should be able to juggle multiple tasks at once. Ideal candidate must be able to identify problems before they happen and implement solutions that detect and prevent outages. The candidate must be able to accurately prioritize projects, make sound judgments, work to improve the customer experience and get the right things done. This job requires you to constantly hit the ground running and have the ability to learn quickly. Primary responsibilities include defining the problem and building analytical frameworks to help the operations to streamline the process, identifying gaps in the existing process by analyzing data and liaising with relevant team(s) to plug it and analyzing data and metrics and sharing update with the internal teams. Key job responsibilities 1) Apply multi-domain/process expertise in day to day activities and own end to end roadmap. 2) Translate complex or ambiguous business problem statements into analysis requirements and maintain high bar throughout the execution. 3) Define analytical approach; review and vet analytical approach with stakeholders. 4) Proactively and independently work with stakeholders to construct use cases and associated standardized outputs 5) Scale data processes and reports; write queries that clients can update themselves; lead work with data engineering for full-scale automation 6) Have a working knowledge of the data available or needed by the wider business for more complex or comparative analysis 7) Work with a variety of data sources and Pull data using efficient query development that requires less post processing (e.g., Window functions, virt usage) 8) When needed, pull data from multiple similar sources to triangulate on data fidelity 9) Actively manage the timeline and deliverables of projects, focusing on interactions in the team 10) Provide program communications to stakeholders 11) Communicate roadblocks to stakeholders and propose solutions 12) Represent team on medium-size analytical projects in own organization and effectively communicate across teams A day in the life 1) Solve ambiguous analyses with less well-defined inputs and outputs; drive to the heart of the problem and identify root causes 2) Have the capability to handle large data sets in analysis through the use of additional tools 3) Derive recommendations from analysis that significantly impact a department, create new processes, or change existing processes 4) Understand the basics of test and control comparison; may provide insights through basic statistical measures such as hypothesis testing 5) Identify and implement optimal communication mechanisms based on the data set and the stakeholders involved 6) Communicate complex analytical insights and business implications effectively About the team AOP (Analytics Operations and Programs) team is missioned to standardize BI and analytics capabilities, and reduce repeat analytics/reporting/BI workload for operations across IN, AU, BR, MX, SG, AE, EG, SA marketplace. AOP is responsible to provide visibility on operations performance and implement programs to improve network efficiency and defect reduction. The team has a diverse mix of strong engineers, Analysts and Scientists who champion customer obsession. 
We enable operations to make data-driven decisions through developing near real-time dashboards, self-serve dive-deep capabilities and building advanced analytics capabilities. We identify and implement data-driven metric improvement programs in collaboration (co-owning) with Operations teams. BASIC QUALIFICATIONS 4+ years of analyzing and interpreting data with Redshift, Oracle, NoSQL etc. experience Experience with data visualization using Tableau, Quicksight, or similar tools Experience with data modeling, warehousing and building ETL pipelines Experience with AWS solutions such as EC2, DynamoDB, S3, and Redshift Experience using SQL to pull data from a database or data warehouse and scripting experience (Python) to process data for modeling Experience developing and presenting recommendations of new metrics allowing better understanding of the performance of the business 4+ years of ecommerce, transportation, finance or related analytical field experience PREFERRED QUALIFICATIONS Experience in Statistical Analysis packages such as R, SAS and Matlab Experience in data mining, ETL, etc. and using databases in a business environment with large-scale, complex datasets Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.
Posted 9 hours ago
10.0 years
5 - 10 Lacs
Hyderābād
On-site
Job Description Overview: The role will be responsible for successfully distributing the master data across the landscape, including MDG, S/4HANA, DDH, and downstream applications. The role will be responsible for ensuring data consistency and seamless movement of data to avoid any adverse impact on business transactions. The data conversion and ETL expert will have a good understanding of data architecture, data solutions, and system capabilities based around SAP S/4HANA as the core platform, and should be able to understand and influence end-to-end business requirements so that a realistic and attainable solution is deployed. Responsibilities: Partner with multiple value streams to define the data design and data standards for the S/4 migration project. Partner with other sector data leads to integrate the data migration standards and activities. Ensure data consistency across the landscape. Develop standards and guidelines for master data interface modelling. Support onboarding and KT for project resources commencing S/4 migration/deployment projects. Develop processes, templates, and migration tools (ETL) for new objects in scope for S/4 deployment. Qualifications: Bachelor’s degree required. 10+ years of functional experience with data / conversions / interfaces. Demonstrated ability to effectively communicate with all levels of the organization. Ability to work flexible hours based on varying business requirements. Solves highly complex problems within their work team. Ability to quickly adapt to changes in timelines and sequences. Adaptability and flexibility, including the ability to manage deadline pressure, ambiguity, and change.
Posted 9 hours ago
5.0 years
2 - 3 Lacs
Hyderābād
On-site
Category: Business Consulting, Strategy and Digital Transformation Main location: India, Andhra Pradesh, Hyderabad Position ID: J0725-0862 Employment Type: Full Time Position Description: Job Title: Data Engineer. Experience Level: 5+ Years. Location: Hyderabad. Job Summary We are looking for a seasoned and innovative Senior Data Engineer to join our dynamic data team. This role is ideal for professionals with a strong foundation in data engineering, coupled with hands-on experience in machine learning workflows, statistical analysis, and big data technologies. You will play a critical role in building scalable data pipelines, enabling advanced analytics, and supporting data science initiatives. Proficiency in Python is essential, and experience with PySpark is a strong plus. Key Responsibilities Data Pipeline Development: Design and implement scalable, high-performance ETL/ELT pipelines using Python and PySpark. ML & Statistical Integration: Collaborate with data scientists to integrate machine learning models and statistical analysis into data workflows. Data Modeling: Create and optimize data models (relational, dimensional, and columnar) to support analytics and ML use cases. Big Data Infrastructure: Manage and optimize data platforms such as Snowflake, Redshift, BigQuery, and Databricks. Performance Tuning: Monitor and enhance the performance of data pipelines and queries. Data Governance: Ensure data quality, integrity, and compliance through robust governance practices. Cross-functional Collaboration: Partner with analysts, scientists, and product teams to translate business needs into technical solutions. Automation & Monitoring: Automate data workflows and implement monitoring and alerting systems (a brief data-quality check sketch follows this listing). Mentorship: Guide junior engineers and promote best practices in data engineering and ML integration. Innovation: Stay current with emerging technologies in data engineering, ML, and analytics. Required Qualifications Bachelor’s or Master’s degree in Computer Science, Data Science, Engineering, or a related field. 5+ years of experience in data engineering with a strong focus on Python and big data tools. Solid understanding of machine learning concepts and statistical analysis techniques. Proficiency in SQL and Python; experience with PySpark is highly desirable. Experience with cloud platforms (AWS, Azure, or GCP) and data tools (e.g., Glue, Data Factory, Dataflow). Familiarity with data warehousing and lakehouse architectures. Knowledge of data modeling techniques (e.g., star schema, snowflake schema). Experience with version control systems like Git. Strong problem-solving skills and ability to work in a fast-paced environment. Excellent communication and collaboration skills. Skills: English, Data Engineering, Python, SQLite, Statistical Analysis. What you can expect from us: Together, as owners, let’s turn meaningful insights into action. Life at CGI is rooted in ownership, teamwork, respect and belonging. Here, you’ll reach your full potential because… You are invited to be an owner from day 1 as we work together to bring our Dream to life. That’s why we call ourselves CGI Partners rather than employees. We benefit from our collective success and actively shape our company’s strategy and direction. Your work creates value. You’ll develop innovative solutions and build relationships with teammates and clients while accessing global capabilities to scale your ideas, embrace new opportunities, and benefit from expansive industry and technology expertise.
You’ll shape your career by joining a company built to grow and last. You’ll be supported by leaders who care about your health and well-being and provide you with opportunities to deepen your skills and broaden your horizons. Come join our team—one of the largest IT and business consulting services firms in the world.
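As referenced in the Automation & Monitoring responsibility above, here is a small, self-contained sketch of the kind of data-quality gate a pipeline might run before alerting; the checks, thresholds, and sample batch are invented for illustration only.

```python
import pandas as pd


def run_quality_checks(df: pd.DataFrame) -> list:
    """Return a list of human-readable data-quality failures (empty list = pass)."""
    failures = []
    if df["order_id"].duplicated().any():
        failures.append("duplicate order_id values found")
    if df["order_total"].lt(0).any():
        failures.append("negative order_total values found")
    null_rate = df["customer_id"].isna().mean()
    if null_rate > 0.01:
        failures.append(f"customer_id null rate {null_rate:.1%} exceeds 1% threshold")
    return failures


# Tiny illustrative batch; in a real pipeline this would be the load output.
batch = pd.DataFrame({
    "order_id": [1, 2, 2],
    "customer_id": [10, None, 12],
    "order_total": [99.0, -5.0, 40.0],
})
issues = run_quality_checks(batch)
if issues:
    # Stand-in for a real alert (email, Slack webhook, PagerDuty, etc.).
    print("DATA QUALITY ALERT:", "; ".join(issues))
```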
Posted 9 hours ago
5.0 - 10.0 years
0 Lacs
Hyderābād
On-site
Job Description Overview DataOps L3: The role will leverage and enhance existing technologies in the area of data and analytics solutions like Power BI, Azure data engineering technologies, ADLS, ADB, Synapse, and other Azure services. The role will be responsible for developing and supporting IT products and solutions using these technologies and deploying them for business users. Responsibilities: 5 to 10 years of IT and Azure data engineering technologies experience. Prior experience in ETL, data pipelines, and data flow techniques using Azure Data Services. Working experience in Python, PySpark, Azure Data Factory, Azure Data Lake Gen2, Databricks, Azure Synapse, and file formats like JSON and Parquet. Experience in creating ADF pipelines to source and process data sets. Experience in creating Databricks notebooks to cleanse, transform, and enrich data sets. Development experience in orchestration of pipelines. Good understanding of SQL, databases, and data warehouse systems, preferably Teradata. Experience in deployment and monitoring techniques. Working experience with Azure DevOps CI/CD pipelines to deploy Azure resources. Experience in handling operations/integration with the source repository. Must have good knowledge of data warehouse concepts and data warehouse modelling. Working knowledge of SNOW, including resolving incidents, handling change requests/service requests, and reporting on metrics to provide insights. Collaborate with the project team to understand tasks, model tables using data warehouse best practices, and develop data pipelines to ensure the efficient delivery of data. Strong expertise in performance tuning and optimization of data processing systems. Proficient in Azure Data Factory, Azure Databricks, Azure SQL Database, and other Azure data services. Develop and enforce best practices for data management, including data governance and security. Work closely with cross-functional teams to understand data requirements and deliver solutions that meet business needs. Proficient in implementing a DataOps framework. Qualifications: Azure Data Factory, Azure Databricks, Azure Synapse, PySpark/SQL, ADLS, Azure DevOps with CI/CD implementation. Nice-to-Have Skill Sets: Business Intelligence tools (preferred: Power BI). DP-203 certified.
Posted 9 hours ago
12.0 years
5 - 9 Lacs
Hyderābād
On-site
Job Description Overview PepsiCo Data BI & Integration Platforms is seeking an experienced Cloud Platform Databricks SME, responsible for overseeing platform administration, security, new NPI tools integration, migrations, platform maintenance, and other platform administration activities on Azure/AWS. The ideal candidate will have hands-on experience with Azure/AWS services – Infrastructure as Code (IaC), platform provisioning and administration, cloud network design, cloud security principles, and automation. Responsibilities: The Databricks Subject Matter Expert (SME) plays a pivotal role in administration, security best practices, platform sustain support, new tools adoption, cost optimization, and supporting new patterns/design solutions using the Databricks platform. Here’s a breakdown of typical responsibilities: Core Technical Responsibilities Architect and optimize big data pipelines using Apache Spark, Delta Lake, and Databricks-native tools. Design scalable data ingestion and transformation workflows, including batch and streaming (e.g., Kafka, Spark Structured Streaming). Create integration guidelines to configure and integrate Databricks with other existing security tools relevant to data access control. Implement data security and governance using Unity Catalog, access controls, and data classification techniques. Support migration of legacy systems to Databricks on cloud platforms like Azure, AWS, or GCP. Manage cloud platform operations with a focus on FinOps support, optimizing resource utilization, cost visibility, and governance across multi-cloud environments. Collaboration & Advisory Act as a technical advisor to data engineering and analytics teams, guiding best practices and performance tuning. Partner with architects and business stakeholders to align Databricks solutions with enterprise goals. Lead proof-of-concept (PoC) initiatives to demonstrate Databricks capabilities for specific use cases. Strategic & Leadership Contributions Mentor junior engineers and promote knowledge sharing across teams. Contribute to platform adoption strategies, including training, documentation, and internal evangelism. Stay current with Databricks innovations and recommend enhancements to existing architectures. Specialized Expertise (Optional but Valuable) Machine Learning & AI integration using MLflow, AutoML, or custom models. Cost optimization and workload sizing for large-scale data processing. Compliance and audit readiness for regulated industries. Qualifications: Bachelor’s degree in Computer Science. At least 12 years of experience in IT cloud infrastructure, architecture, and operations, including security, with at least 5 years in a platform admin role. Strong understanding of data security principles and best practices. Expertise in the Databricks platform, security features, Unity Catalog, and data access control mechanisms. Experience with data classification and masking techniques. Strong understanding of cloud cost management, with hands-on experience in usage analytics, budgeting, and cost optimization strategies across multi-cloud platforms. Strong knowledge of cloud architecture, design, and deployment principles and practices, including microservices, serverless, containers, and DevOps. Deep expertise in Azure/AWS big data and analytics technologies, including Databricks, real-time data ingestion, data warehouses, serverless ETL, NoSQL databases, DevOps, Kubernetes, virtual machines, web/function apps, and monitoring and security tools.
Deep expertise in Azure/AWS networking and security fundamentals, including network endpoints & network security groups, firewalls, external/internal DNS, load balancers, virtual networks and subnets. Proficient in scripting and automation tools, such as PowerShell, Python, Terraform, and Ansible. Excellent problem-solving, analytical, and communication skills, with the ability to explain complex technical concepts to non-technical audiences. Certifications in Azure/AWS/Databricks platform administration, networking and security are preferred. Strong self-organization, time management and prioritization skills A high level of attention to detail, excellent follow through, and reliability Strong collaboration, teamwork and relationship building skills across multiple levels and functions in the organization Ability to listen, establish rapport, and credibility as a strategic partner vertically within the business unit or function, as well as with leadership and functional teams Strategic thinker focused on business value results that utilize technical solutions Strong communication skills in writing, speaking, and presenting Capable to work effectively in a multi-tasking environment. Fluent in English language.
Posted 9 hours ago
10.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Acuity Knowledge Partners (Acuity) is a leading provider of bespoke research, analytics and technology solutions to the financial services sector, including asset managers, corporate and investment banks, private equity and venture capital firms, hedge funds and consulting firms. Its global network of over 6,000 analysts and industry experts, combined with proprietary technology, supports more than 600 financial institutions and consulting companies to operate more efficiently and unlock their human capital, driving revenue higher and transforming operations. Acuity is headquartered in London and operates from 10 locations worldwide. The company fosters a diverse, equitable and inclusive work environment, nurturing talent, regardless of race, gender, ethnicity or sexual orientation. Acuity was established as a separate business from Moody’s Corporation in 2019, following its acquisition by Equistone Partners Europe (Equistone). In January 2023, funds advised by global private equity firm Permira acquired a majority stake in the business from Equistone, which remains invested as a minority shareholder. For more information, visit www.acuitykp.com
Position Title: Associate Director (Senior Architect – Data)
Department: IT
Location: Gurgaon / Bangalore
Job Summary
The Enterprise Data Architect will enhance the company's strategic use of data by designing, developing, and implementing data models for enterprise applications and systems at conceptual, logical, business area, and application layers. This role advocates data modeling methodologies and best practices. We seek a skilled Data Architect with deep knowledge of data architecture principles, extensive data modeling experience, and the ability to create scalable data solutions. Responsibilities include developing and maintaining enterprise data architecture, ensuring data integrity, interoperability, security, and availability, with a focus on ongoing digital transformation projects.
Key Responsibilities
Strategy & Planning
Develop and deliver long-term strategic goals for data architecture vision and standards in conjunction with data users, department managers, clients, and other key stakeholders.
Create short-term tactical solutions to achieve long-term objectives and an overall data management roadmap.
Establish processes for governing the identification, collection, and use of corporate metadata; take steps to assure metadata accuracy and validity.
Establish methods and procedures for tracking data quality, completeness, redundancy, and improvement.
Conduct data capacity planning, life cycle, duration, usage requirements, feasibility studies, and other tasks.
Create strategies and plans for data security, backup, disaster recovery, business continuity, and archiving.
Ensure that data strategies and architectures are aligned with regulatory compliance.
Develop a comprehensive data strategy in collaboration with different stakeholders that aligns with the transformational projects' goals.
Ensure effective data management throughout the project lifecycle.
Acquisition & Deployment
Ensure the success of enterprise-level application rollouts (e.g. ERP, CRM, HCM, FP&A, etc.).
Liaise with vendors and service providers to select the products or services that best meet company goals.
Operational Management
Assess and determine governance, stewardship, and frameworks for managing data across the organization.
Develop and promote data management methodologies and standards.
Document information products from business processes and create data entities.
Create entity relationship diagrams to show the digital thread across the value streams and enterprise.
Create data normalization across all systems and databases to ensure there is a common definition of data entities across the enterprise.
Document enterprise reporting needs and develop the data strategy to enable a single source of truth for all reporting data.
Address the regulatory compliance requirements of each country and ensure our data is secure and compliant.
Select and implement the appropriate tools, software, applications, and systems to support data technology goals.
Oversee the mapping of data sources, data movement, interfaces, and analytics, with the goal of ensuring data quality.
Collaborate with project managers and business unit leaders for all projects involving enterprise data.
Address data-related problems regarding systems integration, compatibility, and multiple-platform integration.
Act as a leader and advocate of data management, including coaching, training, and career development for staff.
Develop and implement key components as needed to create testing criteria to guarantee the fidelity and performance of data architecture.
Document the data architecture and environment to maintain a current and accurate view of the larger data picture.
Identify and develop opportunities for data reuse, migration, or retirement.
Data Architecture Design: Develop and maintain the enterprise data architecture, including data models, databases, data warehouses, and data lakes. Design and implement scalable, high-performance data solutions that meet business requirements.
Data Governance: Establish and enforce data governance policies and procedures as agreed with stakeholders. Maintain data integrity, quality, and security within Finance, HR and other such enterprise systems.
Data Migration: Oversee the data migration process from legacy systems to the new systems being put in place. Define and manage data mappings, cleansing, transformation, and validation to ensure accuracy and completeness.
Master Data Management: Devise processes to manage master data (e.g., customer, vendor, product information) to ensure consistency and accuracy across enterprise systems and business processes. Provide data management (create, update and delimit) methods to ensure master data is governed.
Stakeholder Collaboration: Collaborate with various stakeholders, including business users and other system vendors, to understand data requirements. Ensure the enterprise system meets the organization's data needs.
Training and Support: Provide training and support to end-users on data entry, retrieval, and reporting within the candidate enterprise systems. Promote user adoption and proper use of data.
Data Quality Assurance: Implement data quality assurance measures to identify and correct data issues. Ensure the Oracle Fusion and other enterprise systems contain reliable and up-to-date information.
Reporting and Analytics: Facilitate the development of reporting and analytics capabilities within the Oracle Fusion and other systems. Enable data-driven decision-making through robust data analysis.
Continuous Improvement: Continuously monitor and improve data processes and the Oracle Fusion and other systems' data capabilities. Leverage new technologies for enhanced data management to support evolving business needs.
Technology and Tools:
Oracle Fusion Cloud
Data modeling tools (e.g., ER/Studio, ERwin)
ETL tools (e.g., Informatica, Talend, Azure Data Factory)
Data pipelines: understanding of data pipeline tools like Apache Airflow and AWS Glue.
Database management systems: Oracle Database, MySQL, SQL Server, PostgreSQL, MongoDB, Cassandra, Couchbase, Redis, Hadoop, Apache Spark, Amazon RDS, Google BigQuery, Microsoft Azure SQL Database, Neo4j, OrientDB, Memcached
Data governance tools (e.g., Collibra, Informatica Axon, Oracle EDM, Oracle MDM)
Reporting and analytics tools (e.g., Oracle Analytics Cloud, Power BI, Tableau, Oracle BIP)
Hyperscalers / cloud platforms (e.g., AWS, Azure)
Big data technologies such as Hadoop, HDFS, MapReduce, and Spark
Cloud platforms such as Amazon Web Services (including RDS, Redshift, and S3), Microsoft Azure services like Azure SQL Database and Cosmos DB, and experience in Google Cloud Platform services such as BigQuery and Cloud Storage.
Programming languages (e.g., Java, J2EE, EJB, .NET, WebSphere)
SQL: strong SQL skills for querying and managing databases.
Python: proficiency in Python for data manipulation and analysis.
Java: knowledge of Java for building data-driven applications.
Data security and protocols: understanding of data security protocols and compliance standards.
Key Competencies
Qualifications:
Education: Bachelor's degree in Computer Science, Information Technology, or a related field. Master's degree preferred.
Experience: 10+ years overall and at least 7 years of experience in data architecture, data modeling, and database design. Proven experience with data warehousing, data lakes, and big data technologies. Expertise in SQL and experience with NoSQL databases. Experience with cloud platforms (e.g., AWS, Azure) and related data services. Experience with Oracle Fusion or similar ERP systems is highly desirable.
Skills: Strong understanding of data governance and data security best practices. Excellent problem-solving and analytical skills. Strong communication and interpersonal skills. Ability to work effectively in a collaborative team environment. Leadership experience with a track record of mentoring and developing team members. Excellent documentation and presentation skills. Good knowledge of applicable data privacy practices and laws.
Certifications: Relevant certifications (e.g., Certified Data Management Professional, AWS Certified Big Data – Specialty) are a plus.
Behavioral
A self-starter, an excellent planner and executor and, above all, a good team player.
Excellent communication and interpersonal skills are a must.
Must possess organizational skills, including multi-tasking capability, priority setting and meeting deadlines.
Ability to build collaborative relationships and effectively leverage networks to mobilize resources.
Initiative to learn the business domain is highly desirable.
Enjoys a dynamic and constantly evolving environment and requirements.
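As a toy illustration of the dimensional modeling work described above (not Acuity's actual model), the sketch below builds one dimension and one fact table and runs a report-style join; table and column names are hypothetical, and sqlite3 is used only to keep the example self-contained and runnable.

```python
# Illustrative only: a tiny star-schema sketch (one dimension, one fact).
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_customer (
    customer_key  INTEGER PRIMARY KEY,  -- surrogate key
    customer_id   TEXT NOT NULL,        -- business/natural key from the source system
    customer_name TEXT,
    country       TEXT
);

CREATE TABLE fact_invoice (
    invoice_key  INTEGER PRIMARY KEY,
    customer_key INTEGER NOT NULL REFERENCES dim_customer(customer_key),
    invoice_date TEXT NOT NULL,
    amount_usd   REAL NOT NULL
);
""")

# A reporting query joins the fact to its dimension (single-source-of-truth idea).
rows = conn.execute("""
    SELECT d.country, SUM(f.amount_usd) AS total_usd
    FROM fact_invoice f
    JOIN dim_customer d ON d.customer_key = f.customer_key
    GROUP BY d.country
""").fetchall()
print(rows)
```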
Posted 9 hours ago
10.0 - 12.0 years
0 Lacs
Hyderābād
On-site
Job Description
Overview
PepsiCo Data BI & Integration Platforms is seeking an experienced Cloud Platform technology leader, responsible for overseeing the design, deployment, and maintenance of the Enterprise Data Foundation cloud infrastructure initiative on Azure/AWS. The ideal candidate will have hands-on experience with Azure/AWS services – Infrastructure as Code (IaC), platform provisioning & administration, cloud network design, cloud security principles and automation.
Responsibilities
Cloud Infrastructure & Automation
Manage and mentor a team of cloud platform infrastructure SMEs, providing technical leadership and direction.
Provide guidance and support for application migration, modernization, and transformation projects, leveraging cloud-native technologies and methodologies.
Implement cloud infrastructure policies, standards, and best practices, ensuring cloud environment adherence to security and regulatory requirements.
Design, deploy and optimize cloud-based infrastructure using Azure/AWS services that meets the performance, availability, scalability, and reliability needs of our applications and services.
Drive troubleshooting of cloud infrastructure issues, ensuring timely resolution and root cause analysis by partnering with the global cloud center of excellence & enterprise application teams, and PepsiCo premium cloud partners (Microsoft, AWS).
Establish and maintain effective communication and collaboration with internal and external stakeholders, including business leaders, developers, customers, and vendors.
Develop Infrastructure as Code (IaC) to automate provisioning and management of cloud resources.
Write and maintain scripts for automation and deployment using PowerShell, Python, or the Azure/AWS CLI.
Work with stakeholders to document architectures, configurations, and best practices.
Knowledge of cloud security principles around data protection, identity and access management (IAM), compliance and regulatory requirements, threat detection and prevention, disaster recovery and business continuity.
Qualifications
Bachelor's degree in computer science.
At least 10 to 12 years of experience in IT cloud infrastructure, architecture and operations, including security, with at least 8 years in a technical leadership role.
Strong knowledge of cloud architecture, design, and deployment principles and practices, including microservices, serverless, containers, and DevOps.
Deep expertise in Azure/AWS big data & analytics technologies, including Databricks, real-time data ingestion, data warehouses, serverless ETL, NoSQL databases, DevOps, Kubernetes, virtual machines, web/function apps, monitoring and security tools.
Deep expertise in Azure/AWS networking and security fundamentals, including network endpoints & network security groups, firewalls, external/internal DNS, load balancers, virtual networks and subnets.
Proficient in scripting and automation tools, such as PowerShell, Python, Terraform, and Ansible.
Excellent problem-solving, analytical, and communication skills, with the ability to explain complex technical concepts to non-technical audiences.
Certifications in Azure/AWS platform administration, networking and security are preferred.
Strong self-organization, time management and prioritization skills.
A high level of attention to detail, excellent follow-through, and reliability.
Strong collaboration, teamwork and relationship-building skills across multiple levels and functions in the organization.
Ability to listen and establish rapport and credibility as a strategic partner vertically within the business unit or function, as well as with leadership and functional teams.
Strategic thinker focused on business value results that utilize technical solutions.
Strong communication skills in writing, speaking, and presenting.
Able to work effectively in a multi-tasking environment.
Fluent in English.
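As a hedged example of the scripted cloud-governance automation this role calls for, the sketch below flags S3 buckets that are missing a cost-allocation tag; the tag key is an assumed standard rather than an actual PepsiCo policy, and it requires AWS credentials plus the boto3 package.

```python
# Sketch only: report S3 buckets that lack a required cost-allocation tag.
import boto3
from botocore.exceptions import ClientError

REQUIRED_TAG = "cost-center"  # assumed tagging standard, purely illustrative

s3 = boto3.client("s3")

for bucket in s3.list_buckets()["Buckets"]:
    name = bucket["Name"]
    try:
        tags = {t["Key"]: t["Value"]
                for t in s3.get_bucket_tagging(Bucket=name)["TagSet"]}
    except ClientError:
        tags = {}  # bucket has no tag set at all
    if REQUIRED_TAG not in tags:
        print(f"non-compliant bucket: {name}")
```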
Posted 9 hours ago
0 years
4 - 9 Lacs
Hyderābād
On-site
Job requisition ID :: 86795
Date: Aug 1, 2025
Location: Hyderabad
Designation: Consultant
Entity: Deloitte Touche Tohmatsu India LLP
Audit & Assurance / Audit (A&A) IT Audit / Analytics - Consultant
Your potential, unleashed.
India's impact on the global economy has increased at an exponential rate and Deloitte presents an opportunity to unleash and realize your potential amongst cutting-edge leaders, and organizations shaping the future of the region, and indeed, the world beyond. At Deloitte, you can bring your whole self to work, every day. Combine that with our drive to propel with purpose and you have the perfect playground to collaborate, innovate, grow, and make an impact that matters.
The team
Assurance is about much more than just the numbers. It's about attesting to accomplishments and challenges and helping to assure strong foundations for future aspirations. Deloitte exemplifies the what, how, and why of change so you're always ready to act ahead. Learn more about Audit & Assurance Practice.
Your work profile
In our Assurance (A&A) Team you'll build and nurture positive working relationships with teams and clients with the intention to exceed client expectations.
Job Description
Roles and Responsibilities
At Deloitte, you are expected to contribute to the firm's growth and development in a variety of ways. Eligibility criteria and requirements:
SAP Analytics Cloud (SAC): Skilled in designing and implementing comprehensive dashboards and planning models, with experience in SAC storyboards, data preparation, and integration with SAP ERP and BW systems.
Business Intelligence Tools: Proficient in Power BI or Tableau, capable of creating interactive reports, optimizing performance, and following visualization best practices.
Programming: Strong knowledge of Python, including libraries such as Pandas, NumPy, and Scikit-learn, for data manipulation, automation, and building advanced analytics solutions.
Machine Learning: Hands-on experience with developing and deploying machine learning models to solve real-world business challenges.
Finance Domain Expertise: In-depth understanding of finance processes, financial reporting, and key performance indicators relevant to the finance function. This domain knowledge is critical to effectively translate business requirements into technical solutions.
Data Engineering: Experience with data pipelines, ETL workflows, and cloud data platforms is preferred.
Everyone's welcome… entrust your happiness to us.
Our workspaces and initiatives are geared towards your 360-degree happiness.
This includes specific needs you may have in terms of accessibility, flexibility, safety and security, and caregiving. Here’s a glimpse of things that are in store for you. Interview tips We want job seekers exploring opportunities at Deloitte to feel prepared, confident and comfortable. To help you with your interview, we suggest that you do your research, know some background about the organisation and the business area you’re applying to. Check out recruiting tips from Deloitte professionals.
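To illustrate the Python/scikit-learn profile listed in the role above, here is a small, self-contained sketch on synthetic data; the features and the write-off label are invented for illustration and do not reflect any Deloitte methodology or client data.

```python
# Purely illustrative: pandas + scikit-learn on synthetic invoice data.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
df = pd.DataFrame({
    "invoice_amount": rng.gamma(2.0, 500.0, 1000),
    "days_overdue":   rng.integers(0, 120, 1000),
})
# Hypothetical label: whether an invoice was eventually written off.
df["written_off"] = ((df["days_overdue"] > 60) & (df["invoice_amount"] > 800)).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    df[["invoice_amount", "days_overdue"]], df["written_off"],
    test_size=0.25, random_state=0)

model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)
print("test AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))
```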
Posted 9 hours ago
0 years
0 Lacs
Gurugram, Haryana, India
Remote
Every day, tens of millions of people come to Roblox to explore, create, play, learn, and connect with friends in 3D immersive digital experiences– all created by our global community of developers and creators. At Roblox, we’re building the tools and platform that empower our community to bring any experience that they can imagine to life. Our vision is to reimagine the way people come together, from anywhere in the world, and on any device. We’re on a mission to connect a billion people with optimism and civility, and looking for amazing talent to help us get there. A career at Roblox means you’ll be working to shape the future of human interaction, solving unique technical challenges at scale, and helping to create safer, more civil shared experiences for everyone. Roblox Operating System (ROS) is our internal productivity platform that governs how Roblox operates as a company. Through an integrated suite of tools, ROS shapes how we make talent and personnel decisions, plan and organize work, discover knowledge, and scale efficiently. We are seeking a Senior Data Engineer to enhance our data posture and architecture, synchronizing data across vital third-party systems like Workday, Greenhouse, GSuite, and JIRA, as well as our internal Roblox OS application database. Our Roblox OS app suite encompasses internal tools and third-party applications for People Operations, Talent Acquisition, Budgeting, Roadmapping, and Business Analytics. We envision an integrated platform that streamlines processes while providing employees and leaders with the information they need to support the business. This is a new team in our Roblox India location, working closely with data scientists & analysts, product & engineering, and other stakeholders in India & US. You will report to the Engineering Manager of the Roblox OS Team in your local location and collaborate with Roblox internal teams globally. Work Model : This role is based in Gurugram and follows a hybrid structure — 3 days from the office (Tuesday, Wednesday & Thursday) and 2 days work from home. Shift Time : 2:00pm - 10:30pm IST (Cabs will be provided) You Will Design and Build Scalable Data Pipelines: Architect, develop, and maintain robust, scalable data pipelines using orchestration frameworks like Airflow to synchronize data between internal systems. Implement and Optimize ETL Processes: Apply strong understanding of ETL (Extract, Transform, Load) processes and best practices for seamless data integration and transformation. Develop Data Solutions with SQL: Utilize your proficiency in SQL and relational databases (e.g., PostgreSQL) for advanced querying, data modeling, and optimizing data solutions. Contribute to Data Architecture: Actively participate in data architecture and implementation discussions, ensuring data integrity and efficient data transposition. Manage and optimize data infrastructure, including database, cloud storage solutions, and API endpoints. Write High-Quality Code: Focus on developing clear, readable, testable, modular, and well-monitored code for data manipulation, automation, and software development with a strong emphasis on data integrity. Troubleshoot and Optimize Performance: Apply excellent analytical and problem-solving skills to diagnose data issues and optimize pipeline performance. Collaborate Cross-Functionally: Work effectively with cross-functional teams, including data scientists, analysts, and business stakeholders, to translate business needs into technical data solutions. 
Ensure Data Governance and Security: Implement data anonymization and pseudonymization techniques to protect sensitive data, and contribute to master data management (MDM) concepts including data quality, lineage, and governance frameworks. You Have Data Engineering Expertise: At least 6+ years of proven experience designing, building, and maintaining scalable data pipelines, coupled with a strong understanding of ETL processes and best practices for data integration. Database and Data Warehousing Proficiency: Deep proficiency in SQL and relational databases (e.g., PostgreSQL), and familiarity with at least one cloud-based data warehouse solution (e.g., Snowflake, Redshift, BigQuery). Technical Acumen: Strong scripting skills for data manipulation and automation. Familiarity with data streaming platforms (e.g., Kafka, Kinesis), and knowledge of containerization (e.g., Docker) and cloud infrastructure (e.g., AWS, Azure, GCP) for deploying and managing data solutions. Data & Cloud Infrastructure Management: Experience with managing and optimizing data infrastructure, including databases, cloud storage solutions, and API endpoint configuration. Software Development Experience: Experience in software development with a focus on data integrity and transposition, and a commitment to writing clear, readable, testable, modular, and well-monitored code. Problem-Solving & Collaboration Skills: Excellent analytical and problem-solving abilities to troubleshoot complex data issues, combined with strong communication and collaboration skills to work effectively across teams. Passion for Data: A genuine passion for working with large amounts of data from various sources, and an understanding of the critical impact of data quality on company strategy at an executive level. Adaptability: Ability to thrive and deliver results in a fast-paced environment with competing priorities. Roles that are based in an office are onsite Tuesday, Wednesday, and Thursday, with optional presence on Monday and Friday (unless otherwise noted). Roblox provides equal employment opportunities to all employees and applicants for employment and prohibits discrimination and harassment of any type without regard to race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state or local laws. Roblox also provides reasonable accommodations for all candidates during the interview process.
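A minimal sketch of the synchronization pattern this role describes, assuming a recent Airflow 2.x deployment with the Postgres provider installed; the API endpoint, connection ID, and table are hypothetical placeholders rather than Roblox's actual systems.

```python
# Illustrative Airflow DAG: pull records from a third-party HR API into Postgres.
from datetime import datetime

import requests
from airflow import DAG
from airflow.operators.python import PythonOperator
from airflow.providers.postgres.hooks.postgres import PostgresHook


def sync_employees():
    # Hypothetical endpoint and schema; real pipelines would add paging and retries.
    rows = requests.get("https://example.com/api/employees", timeout=30).json()
    hook = PostgresHook(postgres_conn_id="internal_app_db")
    for r in rows:
        hook.run(
            "INSERT INTO employees (id, name) VALUES (%s, %s) "
            "ON CONFLICT (id) DO UPDATE SET name = EXCLUDED.name",
            parameters=(r["id"], r["name"]),
        )


with DAG(
    dag_id="employee_sync",
    start_date=datetime(2024, 1, 1),
    schedule="@hourly",
    catchup=False,
) as dag:
    PythonOperator(task_id="sync_employees", python_callable=sync_employees)
```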
Posted 9 hours ago
3.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Job Purpose
Assist in building out the backlog of Power BI dashboards, ensuring they meet business requirements and provide actionable insights. Collect and maintain a firmwide inventory of existing reports, identifying those that need to be converted to Power BI. Collaborate with the team to contract and integrate Snowflake, ensuring seamless data flow and accessibility for reporting and analytics.
Desired Skills And Experience
Candidates should have a B.E./B.Tech/MCA/MBA in Information Systems, Computer Science or a related field. 3+ years' strong experience in developing and managing Power BI dashboards and reports, preferably within the financial services industry. Experience required in Data Warehousing, SQL, and hands-on expertise in ETL/ELT processes. Familiarity with Snowflake data warehousing solutions and integration. Proficiency in data integration from various sources including APIs and databases. Proficient in SQL for querying and manipulating data. Strong understanding of data warehousing concepts and practices. Experience with deploying and managing dashboards on a Power BI server to service a large number of users. Familiarity with other BI tools and platforms. Experience with financial datasets and understanding of private equity metrics. Knowledge of cloud platforms, particularly Azure, Snowflake, and Databricks. Excellent problem-solving skills and attention to detail. Strong communication skills, both written and oral, with a business and technical aptitude. Must possess good verbal and written communication and interpersonal skills.
Key Responsibilities
Create and maintain interactive and visually appealing Power BI dashboards to visualize data insights. Assist in building out the backlog of Power BI dashboards, ensuring they meet business requirements and provide actionable insights. Integrate data from various sources including APIs, databases, and cloud storage solutions such as Azure, Snowflake, and Databricks. Collect and maintain a firmwide inventory of existing reports, identifying those that need to be converted to Power BI. Collaborate with the team to contract and integrate Snowflake, ensuring seamless data flow and accessibility for reporting and analytics. Continuously refine and improve the user interface of dashboards based on ongoing input and feedback. Monitor and optimize the performance of dashboards to handle large volumes of data efficiently. Work closely with stakeholders to understand their reporting needs and translate them into effective Power BI solutions. Ensure the accuracy and reliability of data within Power BI dashboards and reports. Deploy dashboards onto a Power BI server to be serviced to a large number of users, ensuring high availability and performance. Ensure that dashboards provide self-service capabilities and are interactive for end-users. Create detailed documentation of BI processes and provide training to internal teams and clients on Power BI usage. Stay updated with the latest Power BI and Snowflake features and best practices to continuously improve reporting capabilities.
Behavioral Competencies
Effectively communicate with business and technology partners, peers and stakeholders.
Ability to deliver results on real-world business problems under demanding timelines.
Ability to work independently and multi-task effectively.
Identify and communicate areas for improvement.
Demonstrate high attention to detail, work in a dynamic environment whilst maintaining high quality standards, and show a natural aptitude for developing good internal working relationships and a flexible work ethic.
Responsible for quality checks and adhering to the agreed Service Level Agreement (SLA) / Turn Around Time (TAT).
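As a hedged illustration of the Snowflake-to-reporting integration mentioned above, the sketch below queries a table and writes a CSV extract that a Power BI dataset could consume; the account, credentials, and table names are placeholders, and it assumes the snowflake-connector-python package.

```python
# Illustrative only: query Snowflake and land a CSV extract for reporting.
import csv
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",        # hypothetical account identifier
    user="svc_reporting",
    password="********",
    warehouse="REPORTING_WH",
    database="ANALYTICS",
    schema="CURATED",
)
try:
    cur = conn.cursor()
    cur.execute(
        "SELECT fund_id, as_of_date, nav FROM fund_nav WHERE as_of_date >= %s",
        ("2024-01-01",),
    )
    with open("fund_nav_extract.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow([col[0] for col in cur.description])  # header row
        writer.writerows(cur.fetchall())
finally:
    conn.close()
```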
Posted 9 hours ago
0 years
0 Lacs
India
Remote
Company Description BI Hub Solution offers expert and impartial business advice and services at an honest price. We prioritize listening to our clients' requirements to select the right solutions that fit their needs. We genuinely care for your business and take a sincere interest in helping your company reach its potential. Role Description This is a full-time remote role for a Tableau BI Developer Intern. The intern will be responsible for developing data models, creating dashboards, and conducting data warehousing activities. The intern will also be involved in Extract, Transform, Load (ETL) processes and utilizing analytical skills to provide business insights. Qualifications Proficiency in Data Modeling and Developing Dashboards Experience in Data Warehousing and Extract, Transform, Load (ETL) processes Strong Analytical Skills for Business Insights Excellent problem-solving and critical-thinking abilities Ability to work independently and remotely Bachelor’s degree in Computer Science, Information Systems, or related field is a plus
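A simple, illustrative ETL sketch in the spirit of the internship responsibilities above: extract a raw CSV, apply light cleansing and a derived field, and load a tidy extract that a Tableau workbook could connect to; file and column names are hypothetical.

```python
# Illustrative extract-transform-load step with pandas.
import pandas as pd

# Extract
raw = pd.read_csv("raw_sales.csv", parse_dates=["order_date"])

# Transform: basic cleansing plus a derived reporting field
clean = (
    raw.dropna(subset=["order_id", "amount"])
       .assign(order_month=lambda d: d["order_date"].dt.to_period("M").astype(str),
               amount=lambda d: d["amount"].round(2))
)

# Load: one flat extract for dashboards
clean.to_csv("sales_extract.csv", index=False)
print(f"loaded {len(clean)} rows")
```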
Posted 9 hours ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
10–15 years of experience in Databricks and exposure to Data/AI platforms. Expertise in PySpark and Data Factory. Develop efficient Extract, Load and Transform (ELT/ETL) processes to facilitate seamless data integration, transformation, and loading from various sources into the data platform using Azure and Databricks; this includes inbound and outbound data processes. Conduct and support unit and system testing, SIT and UAT. Support platform deployment and provide post-go-live support.
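For illustration only, a minimal PySpark sketch of the inbound ELT pattern described above; the mount paths and columns are placeholders, and it assumes a Databricks-style cluster where Delta Lake is available.

```python
# Illustrative inbound ELT: load landing files, transform, write curated Delta output.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("inbound_orders_elt").getOrCreate()

orders = (
    spark.read.option("header", "true").csv("/mnt/landing/orders/")
         .withColumn("amount", F.col("amount").cast("double"))
         .withColumn("ingest_date", F.current_date())
         .dropDuplicates(["order_id"])
)

(orders.write
       .format("delta")              # assumes Delta Lake on the cluster
       .mode("append")
       .partitionBy("ingest_date")
       .save("/mnt/curated/orders/"))
```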
Posted 9 hours ago
8.0 - 12.0 years
2 - 9 Lacs
Hyderābād
On-site
Job description
Some careers shine brighter than others. If you're looking for a career that will help you stand out, join HSBC, and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further. HSBC is one of the largest banking and financial services organizations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realize their ambitions. We are currently seeking an experienced professional to join our team in the role of Consultant Specialist.
8 - 12 years of experience with the below requirements and skills:
Advanced SQL Development: Write complex SQL queries for data extraction, transformation, and analysis. Optimize SQL queries for performance and scalability. SQL Tuning and Joins: Analyze and improve query performance. Deep understanding of joins, indexing, and query execution plans. GCP BigQuery and GCS: Work with Google BigQuery for data warehousing and analytics. Manage and integrate data using Google Cloud Storage (GCS). Airflow DAG Development: Design, develop, and maintain workflows using Apache Airflow. Write custom DAGs to automate data pipelines and processes. Python Programming: Develop and maintain Python scripts for data processing and automation. Debug and optimize Python code for performance and reliability. Shell Scripting: Write and debug basic shell scripts for automation and system tasks. Continuous Learning: Stay updated with the latest tools and technologies in data engineering. Demonstrate a strong ability and attitude to learn and adapt quickly. Communication: Collaborate effectively with cross-functional teams. Clearly communicate technical concepts to both technical and non-technical stakeholders.
Requirements
To be successful in this role, you should meet the following requirements: Advanced SQL writing and query optimization. Strong understanding of SQL tuning, joins, and indexing. Hands-on experience with GCP services, especially BigQuery and GCS. Proficiency in Python programming and debugging. Experience with Apache Airflow and DAG development. Basic knowledge of shell scripting. Excellent problem-solving skills and a growth mindset. Strong verbal and written communication skills. Experience with data pipeline orchestration and ETL processes. Familiarity with other GCP services like Dataflow or Pub/Sub. Knowledge of CI/CD pipelines and version control (e.g., Git).
You'll achieve more when you join HSBC. www.hsbc.com/careers
HSBC is committed to building a culture where all employees are valued, respected and opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working and opportunities to grow within an inclusive and diverse environment. Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website. Issued by – HSBC Software Development India
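A hedged sketch of the BigQuery/GCS work listed above: run a query into a destination table, then extract that table to Cloud Storage. The project, dataset, and bucket names are placeholders, and it assumes the google-cloud-bigquery package with GCP credentials configured.

```python
# Illustrative only: query into a destination table, then export it to GCS as CSV.
from google.cloud import bigquery

client = bigquery.Client(project="my-analytics-project")

query = """
    SELECT account_id, SUM(amount) AS total_amount
    FROM `my-analytics-project.curated.transactions`
    WHERE txn_date >= '2024-01-01'
    GROUP BY account_id
"""
destination = bigquery.TableReference.from_string(
    "my-analytics-project.reporting.account_totals")

job_config = bigquery.QueryJobConfig(
    destination=destination,
    write_disposition="WRITE_TRUNCATE",
)
client.query(query, job_config=job_config).result()   # wait for the query job

client.extract_table(
    destination,
    "gs://my-analytics-bucket/exports/account_totals-*.csv",
).result()                                             # wait for the extract job
```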
Posted 9 hours ago
7.0 years
0 Lacs
Hyderābād
On-site
Job Description Overview As a key member of the team, you will be responsible for designing, building, and maintaining the data pipelines and platforms that support analytics, machine learning, and business intelligence. You will lead a team of data engineers and collaborate closely with cross-functional stakeholders to ensure that data is accessible, reliable, secure, and optimized for AI-driven applications Responsibilities Architect and implement scalable data solutions to support LLM training, fine-tuning, and inference workflows. Lead the development of ETL/ELT pipelines for structured and unstructured data across diverse sources. Ensure data quality, governance, and compliance with industry standards and regulations. Collaborate with Data Scientists, MLOps, and product teams to align data infrastructure with GenAI product goals. Mentor and guide a team of data engineers, promoting best practices in data engineering and DevOps. Optimize data workflows for performance, cost-efficiency, and scalability in cloud environments. Drive innovation by evaluating and integrating modern data tools and platforms (e.g., Databricks, Azure etc) Qualifications Bachelor’s or Master’s degree in Computer Science, Data Engineering, Information Systems, or a related technical field. 7+ years of experience in data engineering, with at least 2+ years in a leadership or senior role. Proven experience designing and managing data platforms and pipelines in cloud environments (Azure, AWS, or GCP). Experience supporting AI/ML workloads, especially involving Large Language Models (LLMs) Strong proficiency in SQL and Python Hands-on experience with data orchestration tools
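As an illustrative sketch (not PepsiCo's actual pipeline) of curating unstructured text for LLM fine-tuning, as mentioned above: normalize whitespace, drop near-empty files, deduplicate exact copies, and write JSONL; the paths and thresholds are hypothetical.

```python
# Illustrative pre-processing of raw text files into a JSONL training corpus.
import hashlib
import json
from pathlib import Path

seen = set()
records = []
for path in Path("raw_docs").glob("*.txt"):
    text = " ".join(path.read_text(encoding="utf-8").split())  # collapse whitespace
    if len(text) < 50:
        continue                      # drop near-empty documents
    digest = hashlib.sha256(text.encode("utf-8")).hexdigest()
    if digest in seen:
        continue                      # remove exact duplicates
    seen.add(digest)
    records.append({"source": path.name, "text": text})

with open("training_corpus.jsonl", "w", encoding="utf-8") as f:
    for rec in records:
        f.write(json.dumps(rec, ensure_ascii=False) + "\n")
print(f"kept {len(records)} documents")
```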
Posted 9 hours ago
The ETL (Extract, Transform, Load) job market in India is thriving with numerous opportunities for job seekers. ETL professionals play a crucial role in managing and analyzing data effectively for organizations across various industries. If you are considering a career in ETL, this article will provide you with valuable insights into the job market in India.
India's major technology hubs are known for their thriving tech industries and often have a high demand for ETL professionals.
The average salary range for ETL professionals in India varies based on experience levels. Entry-level positions typically start at around ₹3-5 lakhs per annum, while experienced professionals can earn upwards of ₹10-15 lakhs per annum.
In the ETL field, a typical career path may include roles such as: - Junior ETL Developer - ETL Developer - Senior ETL Developer - ETL Tech Lead - ETL Architect
As you gain experience and expertise, you can progress to higher-level roles within the ETL domain.
Alongside ETL, professionals in this field are often expected to have skills in: - SQL - Data Warehousing - Data Modeling - ETL Tools (e.g., Informatica, Talend) - Database Management Systems (e.g., Oracle, SQL Server)
Having a strong foundation in these related skills can enhance your capabilities as an ETL professional.
Here are 25 interview questions that you may encounter in ETL job interviews:
As you explore ETL jobs in India, remember to showcase your skills and expertise confidently during interviews. With the right preparation and a solid understanding of ETL concepts, you can embark on a rewarding career in this dynamic field. Good luck with your job search!