
24278 ETL Jobs - Page 22

Set up a job alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

0 years

0 Lacs

Hyderabad, Telangana, India

Remote

When you join Verizon

You want more out of a career. A place to share your ideas freely — even if they’re daring or different. Where the true you can learn, grow, and thrive. At Verizon, we power and empower how people live, work and play by connecting them to what brings them joy. We do what we love — driving innovation, creativity, and impact in the world. Our V Team is a community of people who anticipate, lead, and believe that listening is where learning begins. In crisis and in celebration, we come together — lifting our communities and building trust in how we show up, everywhere & always. Want in? Join the #VTeamLife.

What You’ll Be Doing...

As a Senior Data Engineer with ETL/ELT expertise for our growing data platform and analytics teams, you will understand and enable the required data sets from different sources. You will take both structured and unstructured data into our data warehouse and data lake with real-time streaming and/or batch processing to generate insights and perform analytics for business teams within Verizon. Understanding the business requirements. Providing transformed technical design. Working on Data Ingestion, Preparation and Transformation. Developing the scripts for Data Sourcing and Parsing. Developing data streaming applications. Debugging the production failures and identifying the solution. Working on ETL/ELT development.

What We’re Looking For...

You’re curious about new technologies and the game-changing possibilities they create. You like to stay up-to-date with the latest trends and apply your technical expertise to solving business problems.

You'll Need To Have

Bachelor's degree or four or more years of work experience. Four or more years of relevant experience required, demonstrated through work experience and/or military experience. Experience with Data Warehouse concepts and Data Management life cycle.

Even better if you have one or more of the following: Any related ETL/ELT developer certification.
Accuracy and attention to detail. Strong problem solving, analytical, and research capabilities. Strong verbal and written communication skills. Experience presenting to and influencing partners.

Why Verizon?

Verizon is committed to maintaining a Total Rewards package which is competitive, valued by our employees, and differentiates us as an Employer of Choice. We are a ‘pay for performance’ company and your contribution is rewarded through competitive salaries, performance-based incentives and an employee Stock Program. We create an opportunity for us all to share in the success of Verizon and the value we help to create through this broad-based discretionary equity award program. Your benefits are market competitive and delivered by some of the best providers. You are provided with a full spectrum of health and wellbeing resources, including a first-in-class Employee Assistance Program, to empower you to make positive health decisions. We offer generous paid time off benefits to help you manage your work-life balance and opportunities for flexible working arrangements*. Verizon provides training and development for all levels, to help you enhance your skills and develop your career, from funding towards education assistance, award-winning training, online development tools and access to industry research. You will be able to take part in volunteering opportunities as part of our environmental, community and sustainability commitment. Your benefits package will vary depending on the country in which you work. *Subject to business approval.

If Verizon and this role sound like a fit for you, we encourage you to apply even if you don’t meet every “even better” qualification listed above.

Where you’ll be working

In this hybrid role, you'll have a defined work location that includes work from home and assigned office days set by your manager.

Scheduled Weekly Hours
40

Equal Employment Opportunity

Verizon is an equal opportunity employer.
We evaluate qualified applicants without regard to race, gender, disability or any other legally protected characteristics.

Posted 4 days ago

Apply

6.0 years

0 Lacs

India

Remote

Experience: 6.00+ years Salary: Confidential (based on experience) Shift: (GMT+05:30) Asia/Kolkata (IST) Opportunity Type: Remote Placement Type: Full-time Permanent Position (*Note: This is a requirement for one of Uplers' clients - Netskope)

What do you need for this opportunity?

Must-have skills required: Airflow, LLMs, MLOps, Generative AI, Python

Netskope is looking for:

About The Role

Please note, this team is hiring across all levels and candidates are individually assessed and appropriately leveled based upon their skills and experience. The Data Engineering team builds and optimizes systems spanning data ingestion, processing, storage optimization and more. We work closely with engineers and the product team to build highly scalable systems that tackle real-world data problems and provide our customers with accurate, real-time, fault-tolerant solutions to their ever-growing data needs. We support various OLTP and analytics environments, including our Advanced Analytics and Digital Experience Management products. We are looking for skilled engineers experienced with building and optimizing cloud-scale distributed systems to develop our next-generation ingestion, processing and storage solutions. You will work closely with other engineers and the product team to build highly scalable systems that tackle real-world data problems. Our customers depend on us to provide accurate, real-time and fault-tolerant solutions to their ever-growing data needs. This is a hands-on, impactful role that will help lead development, validation, publishing and maintenance of logical and physical data models that support various OLTP and analytics environments.
What's In It For You

You will be part of a growing team of renowned industry experts in the exciting space of Data and Cloud Analytics. Your contributions will have a major impact on our global customer-base and across the industry through our market-leading products. You will solve complex, interesting challenges, and improve the depth and breadth of your technical and business skills.

What You Will Be Doing

Lead the design, development, and deployment of AI/ML models for threat detection, anomaly detection, and predictive analytics in cloud and network security. Architect and implement scalable data pipelines for processing large-scale datasets from logs, network traffic, and cloud environments. Apply MLOps best practices to deploy and monitor machine learning models in production. Collaborate with cloud architects and security analysts to develop cloud-native security solutions leveraging platforms like AWS, Azure, or GCP. Build and optimize Retrieval-Augmented Generation (RAG) systems by integrating large language models (LLMs) with vector databases for real-time, context-aware applications. Analyze network traffic, log data, and other telemetry to identify and mitigate cybersecurity threats. Ensure data quality, integrity, and compliance with GDPR, HIPAA, or SOC 2 standards. Drive innovation by integrating the latest AI/ML techniques into security products and services. Mentor junior engineers and provide technical leadership across projects.

Required Skills And Experience

AI/ML Expertise: Proficiency in advanced machine learning techniques, including neural networks (e.g., CNNs, Transformers) and anomaly detection. Experience with AI frameworks like TensorFlow, PyTorch, and Scikit-learn. Strong understanding of MLOps practices and tools (e.g., MLflow, Kubeflow). Experience building and deploying Retrieval-Augmented Generation (RAG) systems, including integration with LLMs and vector databases.
Data Engineering: Expertise in designing and optimizing ETL/ELT pipelines for large-scale data processing. Hands-on experience with big data technologies (e.g., Apache Spark, Kafka, Flink). Proficiency in working with relational and non-relational databases, including ClickHouse and BigQuery. Familiarity with vector databases such as Pinecone and PGVector and their application in RAG systems. Experience with cloud-native data tools like AWS Glue, BigQuery, or Snowflake.

Cloud and Security Knowledge: Strong understanding of cloud platforms (AWS, Azure, GCP) and their services. Experience with network security concepts, extended detection and response, and threat modeling.

Software Engineering: Proficiency in Python, Java, or Scala for data and ML solution development. Expertise in scalable system design and performance optimization for high-throughput applications.

Leadership and Collaboration: Proven ability to lead cross-functional teams and mentor engineers. Strong communication skills to present complex technical concepts to stakeholders.

Education: BSCS or equivalent required; MSCS or equivalent strongly preferred.

How to apply for this opportunity?

Step 1: Click on Apply and register or log in on our portal. Step 2: Complete the screening form and upload your updated resume. Step 3: Increase your chances of getting shortlisted and meet the client for the interview!

About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their career. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal. Depending on the assessments you clear, you can apply for them as well.) So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
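The RAG pattern this role centers on (retrieve the most relevant context, then augment the LLM prompt with it) can be sketched in a few lines. The Python below is a toy illustration only: a bag-of-words overlap stands in for a real embedding model, an in-memory list stands in for a vector database, and the sample documents are invented.

```python
from collections import Counter
from math import sqrt

def embed(text: str) -> Counter:
    """Toy bag-of-words 'embedding'; production systems use an LLM embedding model."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse term-count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = sqrt(sum(v * v for v in a.values()))
    nb = sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, docs: list[str], k: int = 2) -> list[str]:
    """Rank all documents by similarity to the query and keep the top k."""
    q = embed(query)
    ranked = sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]

def build_prompt(query: str, docs: list[str]) -> str:
    """Augment the LLM prompt with the retrieved context."""
    context = "\n".join(retrieve(query, docs))
    return f"Context:\n{context}\n\nQuestion: {query}"

# Invented sample corpus standing in for a vector-database collection.
docs = [
    "Anomalous login volume can indicate credential stuffing.",
    "Network flow logs record source and destination addresses.",
    "Quarterly revenue grew in the enterprise segment.",
]
print(build_prompt("What do flow logs record?", docs))
```

A real deployment would swap `embed` for a hosted embedding endpoint and `retrieve` for a vector-database query, but the control flow (embed, rank, stuff context into the prompt) is the same.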

Posted 4 days ago

Apply

5.0 years

6 - 8 Lacs

Jaipur

On-site

Role: Data Engineer - Python Employment: Full Time Experience: 5 to 10 Years Salary: Not Disclosed Location: Jaipur, India

Programmers.IO is currently looking to hire a Data Engineer - Python with skills in Python libraries like Pandas and PySpark, Data Ingestion, ETL Process, and Data Migration technology. If you think you are a good fit and are willing to work from our Jaipur, India location, please apply with your resume or share your resume at anjali.shah@programmers.io

Experience Required: 5 to 10 Years

Job Overview

The Data Engineer (Python) will support the RAN Cutover and Data Ingestion workstream by developing and maintaining data pipelines for migrating data from legacy OSS systems to a new platform. This role involves handling vendor-specific data formats and supporting market-by-market cutover processes.

Responsibilities

Develop and maintain Python-based data pipelines for extracting data from 360 OSS servers. Process and transform vendor-specific data formats (Ericsson & Nokia XML). Support market-by-market data migration and cutover processes. Collaborate with architects and SMEs to ensure data integrity. Optimize data pipelines for performance and scalability. Assist in decommissioning legacy systems.

Qualifications

Bachelor’s degree in Computer Science, Engineering, or a related field. 4+ years of experience in data engineering with a focus on Python. Proficiency in Python and related libraries (e.g., Pandas, PySpark). Experience with data ingestion and ETL processes. Familiarity with XML data formats. Must be located in India and eligible to work.

Preferred Skills

Experience in telecommunications or OSS migrations. Knowledge of Databricks or Snowflake. Familiarity with cloud-based data platforms. Experience with large-scale data migrations.

Skills and Knowledge: Python libraries like Pandas and PySpark, Data Ingestion, ETL Process, Data Migration
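Extracting rows from vendor OSS XML, as the responsibilities above describe, usually means walking the export and flattening each managed object into a record. The Python sketch below uses only the standard library; the element names (`managedObject`, `p`, `distName`) imitate the general shape of a Nokia-style CM export and are illustrative, not the real Ericsson or Nokia schema.

```python
import xml.etree.ElementTree as ET

# Hypothetical vendor export; real Ericsson/Nokia OSS schemas differ.
SAMPLE = """
<cmData>
  <managedObject class="CELL" distName="PLMN-1/BTS-7/CELL-21">
    <p name="cellId">21</p>
    <p name="earfcn">1850</p>
  </managedObject>
  <managedObject class="CELL" distName="PLMN-1/BTS-7/CELL-22">
    <p name="cellId">22</p>
    <p name="earfcn">3050</p>
  </managedObject>
</cmData>
"""

def extract_cells(xml_text: str) -> list[dict]:
    """Flatten each managedObject element into a row for downstream loading."""
    root = ET.fromstring(xml_text)
    rows = []
    for mo in root.iter("managedObject"):
        row = {"class": mo.get("class"), "distName": mo.get("distName")}
        for p in mo.iter("p"):          # parameter elements become columns
            row[p.get("name")] = p.text
        rows.append(row)
    return rows

rows = extract_cells(SAMPLE)
print(rows[0]["distName"], rows[0]["earfcn"])
```

In a pipeline, the resulting list of dicts would typically feed straight into a Pandas or PySpark DataFrame for transformation and loading.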

Posted 4 days ago

Apply

15.0 years

0 Lacs

Indore

On-site

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Apache Spark
Good to have skills: MySQL, Python (Programming Language), Google BigQuery
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will engage in the design, construction, and configuration of applications tailored to fulfill specific business processes and application requirements. Your typical day will involve collaborating with team members to understand project needs, developing innovative solutions, and ensuring that applications are optimized for performance and usability. You will also participate in testing and debugging processes to ensure the applications function as intended, while continuously seeking ways to enhance application efficiency and user experience.

Roles & Responsibilities: - Expected to perform independently and become an SME. - Required active participation/contribution in team discussions. - Contribute to providing solutions to work-related problems. - Assist in the documentation of application specifications and user guides. - Collaborate with cross-functional teams to gather requirements and provide technical support.

Professional & Technical Skills: - Must To Have Skills: Proficiency in Apache Spark. - Good To Have Skills: Experience with MySQL, Python (Programming Language), Google BigQuery. - Strong understanding of data processing frameworks and distributed computing. - Experience in developing and deploying applications in cloud environments. - Familiarity with data integration and ETL processes.

Additional Information: - The candidate should have a minimum of 3 years of experience in Apache Spark. - This position is based at our Indore office. - A 15 years full time education is required.

Posted 4 days ago

Apply

7.0 years

0 Lacs

Tirupati

On-site

ETL ODI Developer – Night Shift (open position). Experience: 7+ years. Shift: 9 PM IST to 6 AM IST

Requirements

We are seeking a skilled and experienced Data Integration Specialist with over 5 years of experience in designing and developing data solutions using Oracle Data Integrator (ODI). The ideal candidate will have strong expertise in data modeling, ETL/ELT processes, and SQL, along with exposure to Python scripting for API-based data ingestion.

Key Responsibilities:

Design, develop, and maintain functions and stored procedures using Oracle Data Integrator (ODI). Create and document data warehouse schemas, including fact and dimension tables, based on business requirements. Develop and execute SQL scripts for table creation and collaborate with Database Administrators (DBAs) for deployment. Analyze various data sources to identify relationships and align them with Business Requirements Documentation (BRD). Design and implement Extract, Load, Transform (ELT) processes to load data from source systems into staging and target environments. Validate and profile data using Structured Query Language (SQL) and other analytical tools to ensure data accuracy and completeness. Apply best practices in data governance, including query optimization, metadata management, and data quality monitoring. Demonstrate strong data modeling skills to support scalable and efficient data architecture. Utilize Python to automate data collection from APIs, enhancing integration workflows and enabling real-time data ingestion. Investigate and resolve data quality issues through detailed analysis and root cause identification. Communicate effectively with stakeholders through strong written, verbal, and analytical skills. Exhibit excellent problem-solving and research capabilities in a fast-paced, data-driven environment.

Job Type: Full-time Work Location: In person
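The "use Python to automate data collection from APIs" responsibility above usually breaks into a fetch step and a normalization step that reshapes the payload into rows for staging. The sketch below is a minimal, hypothetical Python example: the field names (`results`, `id`, `name`, `amount`) are invented, and a canned payload stands in for a live HTTP response.

```python
import json
from urllib.request import urlopen  # used only when pulling from a live endpoint

def fetch_json(url: str) -> dict:
    """Pull one page of JSON from a (hypothetical) REST endpoint."""
    with urlopen(url, timeout=30) as resp:
        return json.load(resp)

def normalize(payload: dict) -> list[tuple]:
    """Flatten an API payload into (id, name, amount) rows ready for staging."""
    return [
        (rec["id"], rec["name"], float(rec["amount"]))
        for rec in payload.get("results", [])
    ]

# Canned payload standing in for a live response.
payload = {"results": [
    {"id": 1, "name": "acme", "amount": "125.50"},
    {"id": 2, "name": "globex", "amount": "90"},
]}
print(normalize(payload))
```

Keeping the normalization pure (no I/O) makes it easy to test, and the resulting rows can be handed to an ODI staging load or bulk-inserted directly.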

Posted 4 days ago

Apply

3.0 years

0 Lacs

Bangalore Urban, Karnataka, India

On-site

We are looking for a detail-oriented QA Engineer to ensure the quality and accuracy of data migration projects. The ideal candidate will be responsible for validating data integrity, testing migration processes, and identifying discrepancies or issues. This role requires expertise in QA methodologies, strong analytical skills, and familiarity with data migration processes and tools.

Key Responsibilities

Data Validation and Testing: Develop and execute comprehensive test plans and test cases to validate data migration processes. Ensure data integrity, accuracy, and consistency across source and target systems. Perform pre- and post-migration data checks to verify successful migration.

Test Automation: Design and implement automated test scripts for data validation and reconciliation. Use appropriate tools to streamline testing processes and reduce manual effort.

Defect Identification and Resolution: Identify, document, and report issues or discrepancies in the data migration process. Collaborate with development teams to troubleshoot and resolve data-related defects.

Collaboration and Communication: Work closely with data engineers, business analysts, and stakeholders to understand migration requirements and objectives. Provide regular updates on testing progress, results, and identified risks.

Process Improvement: Recommend and implement best practices for data migration testing and validation. Continuously improve QA processes to enhance efficiency and effectiveness.

Documentation: Maintain clear and detailed documentation of test plans, test cases, and test results. Ensure proper tracking and reporting of issues using defect management tools.

Requirements

Bachelor's degree in Computer Science, Information Technology, or a related field. 3+ years of experience in quality assurance or data testing, preferably in data migration projects. Strong knowledge of SQL for querying and validating data.
Familiarity with data migration tools and ETL processes (e.g., Informatica, Talend, or similar). Hands-on experience with test automation tools (e.g., Selenium, TestNG, or similar). Understanding of data governance, privacy, and security principles. Strong analytical skills with attention to detail. Excellent communication and collaboration abilities.

Preferred Qualifications

Experience with cloud-based data migration (e.g., AWS, Azure, GCP). Familiarity with big data frameworks and tools (e.g., Hadoop, Spark). Knowledge of Agile methodologies and tools like Jira or Confluence.
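The pre- and post-migration checks described above often reduce to comparing row counts and content checksums between source and target. Below is a minimal Python sketch using in-memory SQLite as a stand-in for the real source and target systems; the table and data are invented.

```python
import hashlib
import sqlite3

def table_fingerprint(conn, table: str) -> tuple[int, str]:
    """Row count plus an order-independent checksum of the table contents."""
    rows = conn.execute(f"SELECT * FROM {table}").fetchall()  # sketch only; sanitize names in real use
    digest = hashlib.sha256()
    for row in sorted(rows):            # sort so insertion order doesn't matter
        digest.update(repr(row).encode())
    return len(rows), digest.hexdigest()

# Stand-ins for the real source and target databases.
src = sqlite3.connect(":memory:")
tgt = sqlite3.connect(":memory:")
for conn in (src, tgt):
    conn.execute("CREATE TABLE customers (id INTEGER, name TEXT)")
src.executemany("INSERT INTO customers VALUES (?, ?)", [(1, "a"), (2, "b")])
tgt.executemany("INSERT INTO customers VALUES (?, ?)", [(2, "b"), (1, "a")])

# Same rows in a different order still yield the same fingerprint.
assert table_fingerprint(src, "customers") == table_fingerprint(tgt, "customers")
print("migration check passed")
```

At real-world scale, one would push the aggregation into SQL (e.g., hashing per-row and summing server-side) rather than pulling every row, but the reconciliation logic is the same.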

Posted 4 days ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

Job Description

Common Skills: SQL, GCP BigQuery, ETL pipelines using Python/Airflow, experience with Spark/Hive/HDFS, data modeling for data conversion. Resources: 4. Prior experience working on an HR conversion/migration project is an additional skill needed along with the above-mentioned skills.

Data Engineer - requires HR domain knowledge; all other requirements from the functional area are given by the client.

Posted 4 days ago

Apply

0 years

0 Lacs

Kanayannur, Kerala, India

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

Fueled by strategic investment in technology and innovation, Client Technology seeks to drive growth opportunities and solve complex business problems for our clients through building a robust platform for business and powerful product engine that are integral to innovation at scale. You will work with technologists and business specialists, blending EY’s deep industry knowledge and innovative ideas with our platforms, capabilities, and technical expertise. As a champion for change and growth, you will be at the forefront of integrating emerging technologies from AI to Data Analytics into every corner of what we do at EY. That means more growth for you, exciting learning opportunities, career choices, and the chance to make a real impact.

The opportunity: We are looking for a highly experienced Power BI developer who will be part of the Data Engineering team within EY Client Technology’s Advanced Analytics Team. The candidate will be responsible for designing, developing and maintaining Power BI data models, reports and dashboards. If you are passionate about business intelligence, analytics and have a knack for turning complex data into actionable insights, we want to hear from you.
To qualify for the role, you must have: Strong proficiency in Power BI, including DAX and the Power Query formula language (M-language). Advanced understanding of data modeling, data warehousing and ETL techniques. Designed, developed and maintained Power BI reports (including paginated reports) and dashboards to support business decision-making processes. Designed, developed and implemented Power BI data models for complex and large-scale enterprise environments. Proven experience with deploying and optimizing large datasets. Proficiency in SQL and other data querying languages. Strong collaboration, analytical, interpersonal and communication abilities.

Ideally, you’ll also have: Bachelor's degree in Computer Science, Information Technology, Data Science, or a related field. Microsoft Power BI certification. Experience with other BI tools. Worked within large teams to successfully implement Power BI solutions. Sound knowledge of the software development lifecycle and experience with Git. Ability to propose solutions by recalling best practices learned from Microsoft documentation, whitepapers and community publications.

What we look for: We want people who are self-starters, who can take initiative and get things done. If you can think critically and creatively to solve problems, you will excel. You should be comfortable working with culturally diverse outsourced on/offshore team members, which means you may need to work outside of the normal working hours in your time zone to partner with other Client Technology staff globally. Some travel may also be required, both domestic and international.

What we offer: As part of this role, you'll work in a highly integrated, global team with the opportunity and tools to grow, develop and drive your career forward. Here, you can combine global opportunity with flexible working. The EY benefits package goes above and beyond too, focusing on your physical, emotional, financial and social well-being.
Your recruiter can talk to you about the benefits available in your country. Here’s a snapshot of what we offer: Continuous learning: You’ll develop the mindset and skills to navigate whatever comes next. Success as defined by you: We’ll provide the tools and flexibility, so you can make a meaningful impact, your way. Transformative leadership: We’ll give you the insights, coaching and confidence to be the leader the world needs. Diverse and inclusive culture: You’ll be embraced for who you are and empowered to use your voice to help others find theirs. EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 4 days ago

Apply

7.0 years

15 - 18 Lacs

Mumbai Metropolitan Region

On-site

Position: Business Intelligence Developer 27165 Location: India (Multiple Offices)

Overview

A leading global consulting and advisory firm is seeking a Business Intelligence Developer to join its expanding Technology Organization. This role will be part of the Information Solutions team and will report directly to the Head of Information Solutions. The successful candidate will play a pivotal role in building and operating modern data platforms, pipelines, and analytics solutions aligned with the enterprise’s data strategy. This position requires strong cross-functional collaboration, technical expertise, and a problem-solving mindset to translate business requirements into actionable intelligence.

Key Responsibilities

Design and build ETL processes to ingest and transform data from multiple source systems into integrated business intelligence environments. Develop reports and dashboards using tools such as Power BI, SSRS, and related BI technologies. Ensure data quality through automated processes and validation routines. Contribute to the creation and maintenance of data dictionaries and catalogs. Support the development of data marts and data lakes to empower strategic business initiatives. Translate business problems into analytics solutions and interpret findings into actionable business insights. Conduct requirement-gathering sessions and propose innovative, data-driven solutions. Lead or participate in the design, development, and maintenance of complex BI dashboards and integrated applications. Manage development resources when required to deliver BI products and services. Conduct in-depth analysis and support the interpretation and adoption of BI tools across stakeholders. Proactively identify opportunities for process optimization, risk mitigation, and revenue growth through data insights. Provide technical support for BI platforms and assist with troubleshooting and performance tuning.
Lead or support design sessions for end-to-end data integration solutions. Support the delivery of scalable, reusable, and sustainable BI architecture for the firm.

Required Qualifications

5–7+ years of experience in business intelligence using Microsoft technologies, including SQL Server, SSIS, Power BI, SSRS, SSAS, or cloud-based equivalents (e.g., Azure). Hands-on experience with large-scale ETL pipelines and data integration processes. In-depth experience working with data warehouses, dimensional modeling, and analytics architecture. Proficiency in developing paginated reports and dashboards using Power BI or comparable tools (Tableau, Qlik, etc.). Familiarity with Power BI Cloud Services and Power BI Report Server. Strong command of Excel for advanced data manipulation and reporting. Skilled in automation, performance tuning, and monitoring of data pipelines. Strong communication and documentation skills. Ability to operate independently and manage competing priorities in a dynamic environment.

Preferred Qualifications

Experience with advanced analytics using R, Python, Scala, or similar tools. Experience with cloud data platforms such as Azure, AWS, or Snowflake. Familiarity with DevOps practices and tools, including CI/CD pipelines. Experience working in or with data lake environments and reference data architectures. Experience setting up and maintaining Power BI Report Server is advantageous.

Skills: data warehousing, report development, Excel, Power BI, business intelligence, dimensional modeling, ETL processes, automation, data integration, Azure, communication, SSRS, SQL Server, SSIS, performance tuning, SSAS, analytics

Posted 4 days ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

Fueled by strategic investment in technology and innovation, Client Technology seeks to drive growth opportunities and solve complex business problems for our clients through building a robust platform for business and powerful product engine that are integral to innovation at scale. You will work with technologists and business specialists, blending EY’s deep industry knowledge and innovative ideas with our platforms, capabilities, and technical expertise. As a champion for change and growth, you will be at the forefront of integrating emerging technologies from AI to Data Analytics into every corner of what we do at EY. That means more growth for you, exciting learning opportunities, career choices, and the chance to make a real impact.

The opportunity: We are looking for a highly experienced Power BI developer who will be part of the Data Engineering team within EY Client Technology’s Advanced Analytics Team. The candidate will be responsible for designing, developing and maintaining Power BI data models, reports and dashboards. If you are passionate about business intelligence, analytics and have a knack for turning complex data into actionable insights, we want to hear from you.
To qualify for the role, you must have: Strong proficiency in Power BI, including DAX and the Power Query formula language (M-language). Advanced understanding of data modeling, data warehousing and ETL techniques. Designed, developed and maintained Power BI reports (including paginated reports) and dashboards to support business decision-making processes. Designed, developed and implemented Power BI data models for complex and large-scale enterprise environments. Proven experience with deploying and optimizing large datasets. Proficiency in SQL and other data querying languages. Strong collaboration, analytical, interpersonal and communication abilities.

Ideally, you’ll also have: Bachelor's degree in Computer Science, Information Technology, Data Science, or a related field. Microsoft Power BI certification. Experience with other BI tools. Worked within large teams to successfully implement Power BI solutions. Sound knowledge of the software development lifecycle and experience with Git. Ability to propose solutions by recalling best practices learned from Microsoft documentation, whitepapers and community publications.

What we look for: We want people who are self-starters, who can take initiative and get things done. If you can think critically and creatively to solve problems, you will excel. You should be comfortable working with culturally diverse outsourced on/offshore team members, which means you may need to work outside of the normal working hours in your time zone to partner with other Client Technology staff globally. Some travel may also be required, both domestic and international.

What we offer: As part of this role, you'll work in a highly integrated, global team with the opportunity and tools to grow, develop and drive your career forward. Here, you can combine global opportunity with flexible working. The EY benefits package goes above and beyond too, focusing on your physical, emotional, financial and social well-being.
Your recruiter can talk to you about the benefits available in your country. Here’s a snapshot of what we offer: Continuous learning: You’ll develop the mindset and skills to navigate whatever comes next. Success as defined by you: We’ll provide the tools and flexibility, so you can make a meaningful impact, your way. Transformative leadership: We’ll give you the insights, coaching and confidence to be the leader the world needs. Diverse and inclusive culture: You’ll be embraced for who you are and empowered to use your voice to help others find theirs. EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 4 days ago

Apply

5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Skill Name: Snowflake Developer with Python & Data Warehousing, with a mandatory certification (SnowPro Advanced: Architect or SnowPro Advanced: Administrator) Experience: 5 to 10 Years Mandatory Skills: Snowflake + Python + Data Warehousing + Certification (SnowPro Advanced: Architect or SnowPro Advanced: Administrator) Location: Preferably Hyderabad, or Pune/Bangalore NP: Immediate/Early Joiners only (Max 15 to 20 Days) JD: We are looking for a skilled Data Engineer with hands-on experience in Snowflake, Python, and Snowpark to join our data platform team. The ideal candidate will work on developing scalable data pipelines, optimizing data flows, managing semi-structured data, and building internal tools using Streamlit within Snowflake. A strong understanding of data modeling and SQL is essential. Key Responsibilities: Design and develop Python-based stored procedures using Snowpark for data ingestion, transformation, and automation tasks. Work with metadata-driven frameworks to dynamically create or recreate data tables and manage ETL flows. Build internal apps and tools using Streamlit in Snowflake to enable self-service and improve data accessibility. Work extensively with semi-structured data (e.g., JSON in VARIANT columns), including flattening, transformation, and enrichment. Write complex SQL queries involving CTEs, joins, window functions such as ROW_NUMBER(), and performance tuning. Contribute to data modeling activities, including star/snowflake schema designs, fact/dimension tables, and data lineage documentation. Collaborate closely with data analysts, business users, and product teams to understand data requirements and deliver data-driven solutions. Required Skills: Strong hands-on experience with Snowflake and its features (warehouse management, tasks, streams, etc.). Proficiency in Python, especially within Snowflake's Snowpark framework. Experience handling and transforming semi-structured data (JSON/VARIANT columns). 
Strong SQL skills including writing and optimizing queries with CTEs, JOINs, and window functions. Experience with data modeling concepts and best practices. Familiarity with metadata-driven ETL design patterns is a plus.
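The flattening and deduplication work this role describes (JSON in VARIANT columns, ROW_NUMBER()-style window logic) can be sketched in plain Python. The records and field names below are made up for illustration; in the job itself this would be done in Snowpark or SQL with LATERAL FLATTEN and QUALIFY:

```python
# Sketch: flatten semi-structured JSON records and keep the latest row per key,
# mirroring LATERAL FLATTEN plus ROW_NUMBER() OVER (PARTITION BY ... ORDER BY ts DESC).
# Sample data and field names are hypothetical.
import json

raw = '''
[{"order_id": 1, "items": [{"sku": "A", "qty": 2}, {"sku": "B", "qty": 1}], "ts": "2024-01-02"},
 {"order_id": 1, "items": [{"sku": "A", "qty": 3}], "ts": "2024-01-05"}]
'''

rows = []
for order in json.loads(raw):
    # Flatten the nested "items" array into one row per (order, item).
    for item in order["items"]:
        rows.append({"order_id": order["order_id"], "sku": item["sku"],
                     "qty": item["qty"], "ts": order["ts"]})

# Keep only the latest row per (order_id, sku) key, the way
# ROW_NUMBER() ... QUALIFY rn = 1 would in Snowflake SQL.
latest = {}
for r in sorted(rows, key=lambda r: r["ts"]):
    latest[(r["order_id"], r["sku"])] = r

result = sorted(latest.values(), key=lambda r: (r["order_id"], r["sku"]))
```

The dict overwrite after a sort by timestamp plays the role of the window function: the last write per partition key wins.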

Posted 4 days ago

Apply

0 years

0 Lacs

Trivandrum, Kerala, India

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. Fueled by strategic investment in technology and innovation, Client Technology seeks to drive growth opportunities and solve complex business problems for our clients by building a robust platform for business and a powerful product engine that are integral to innovation at scale. You will work with technologists and business specialists, blending EY’s deep industry knowledge and innovative ideas with our platforms, capabilities, and technical expertise. As a champion for change and growth, you will be at the forefront of integrating emerging technologies, from AI to Data Analytics, into every corner of what we do at EY. That means more growth for you, exciting learning opportunities, career choices, and the chance to make a real impact. The opportunity: We are looking for a highly experienced Power BI developer who will be part of the Data Engineering team within EY Client Technology’s Advanced Analytics Team. The candidate will be responsible for designing, developing and maintaining Power BI data models, reports and dashboards. If you are passionate about business intelligence and analytics and have a knack for turning complex data into actionable insights, we want to hear from you. 
To qualify for the role, you must have: Strong proficiency in Power BI, including DAX and the Power Query formula language (M-language). Advanced understanding of data modeling, data warehousing and ETL techniques. Designed, developed and maintained Power BI reports (including paginated reports) and dashboards to support business decision-making processes. Designed, developed and implemented Power BI data models for complex and large-scale enterprise environments. Proven experience with deploying and optimizing large datasets. Proficiency in SQL and other data querying languages. Strong collaboration, analytical, interpersonal and communication abilities. Ideally, you’ll also have: Bachelor's degree in Computer Science, Information Technology, Data Science, or a related field. Microsoft Power BI certification. Experience with other BI tools. Worked within large teams to successfully implement Power BI solutions. Sound knowledge of the software development lifecycle and experience with Git. Ability to propose solutions by recalling best practices learned from Microsoft documentation, whitepapers and community publications. What we look for: We want people who are self-starters, who can take initiative and get things done. If you can think critically and creatively to solve problems, you will excel. You should be comfortable working with culturally diverse outsourced on/offshore team members which means you may need to work outside of the normal working hours in your time zone to partner with other Client Technology staff globally. Some travel may also be required, both domestic and international. What we offer: As part of this role, you'll work in a highly integrated, global team with the opportunity and tools to grow, develop and drive your career forward. Here, you can combine global opportunity with flexible working. The EY benefits package goes above and beyond too, focusing on your physical, emotional, financial and social well-being. 
Your recruiter can talk to you about the benefits available in your country. Here’s a snapshot of what we offer: Continuous learning: You’ll develop the mindset and skills to navigate whatever comes next. Success as defined by you: We’ll provide the tools and flexibility, so you can make a meaningful impact, your way. Transformative leadership: We’ll give you the insights, coaching and confidence to be the leader the world needs. Diverse and inclusive culture: You’ll be embraced for who you are and empowered to use your voice to help others find theirs. EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 4 days ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Mediaocean is powering the future of the advertising ecosystem with technology that empowers brands and agencies to deliver impactful omnichannel marketing experiences. With over $200 billion in annualized ad spend running through its software products, Mediaocean deploys AI and automation to optimize investments and outcomes. The company’s advertising infrastructure and ad tech tools are used by more than 100,000 people across the globe. Mediaocean owns and operates Prisma, the industry’s trusted system of record for media management and finance, Innovid, the leading independent ad tech platform for creative, delivery, measurement, and optimization, as well as Protected by Mediaocean, an integrated solution for ad verification and brand safety. Visit www.mediaocean.com for more information. We’re seeking a skilled and proactive Software Developer to join our Business Intelligence team in a fast-paced Adtech environment. You’ll be responsible for building scalable data-driven solutions, optimizing pipelines, and enabling actionable insights across our advertising platforms. If you thrive on transforming raw data into meaningful intelligence and enjoy working with modern tech stacks, we’d love to meet you. What Will You Do Design, develop, and maintain microservices that support BI data pipelines and analytics platforms. Build and optimize ETL processes using Python and SQL to ingest, transform, and serve data. Collaborate with Data Engineers, Analysts, and Product teams to deliver high-impact BI solutions. Implement CI/CD pipelines using Jenkins and Git for seamless deployment and version control. Apply OOP principles to develop modular, reusable, and testable code. Monitor and troubleshoot production systems to ensure high availability and performance. Contribute to data modeling and architecture decisions for scalable BI infrastructure. 
Required Skills & Qualifications 3–5 years of hands-on experience in software development with a focus on BI or data-centric applications. Strong proficiency in Python, including OOP concepts and frameworks. Solid understanding of SQL and relational databases (e.g., Oracle, MySQL, Snowflake). Experience with Jenkins for CI/CD and Git for version control. Familiarity with microservices architecture and containerization tools (e.g., Docker, Kubernetes). Knowledge of RESTful APIs and integration with third-party services. Excellent problem-solving skills and attention to detail. Strong communication and collaboration abilities. Who You Are Experience in Adtech or digital marketing domains. Exposure to BI tools like Sisense, Jasper. Understanding of data warehousing. Familiarity with Agile methodologies and sprint-based development. We would hate to miss out on your application because you do not meet every requirement – transferable skills and education will also be considered, so please do not hesitate to apply! Mediaocean recognizes our true strength and value shine when all our team members feel there is space in the conversation for their voices, thoughts, ideas, perspectives, and concerns. Mediaocean is committed to being an equal opportunity employer, and we consider all applicants regardless of their age, race, color, gender, sexual orientation, ethnicity, religion, national origin, disability, or veteran status.
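The "build and optimize ETL processes using Python and SQL" responsibility above can be sketched as a minimal extract-transform-load flow. The campaign data, field names, and CTR metric here are hypothetical; a real pipeline would read from a database or API and write to a warehouse such as Snowflake or Oracle:

```python
# Minimal ETL sketch: parse raw rows, derive a metric, append to a sink.
# All data and names are illustrative, not a real Mediaocean pipeline.
from dataclasses import dataclass

@dataclass
class ImpressionRecord:
    campaign: str
    impressions: int
    clicks: int

def extract(raw_rows):
    """Parse raw dict rows into typed records, skipping malformed ones."""
    for row in raw_rows:
        try:
            yield ImpressionRecord(row["campaign"], int(row["impressions"]), int(row["clicks"]))
        except (KeyError, ValueError):
            continue  # in production, route bad rows to a dead-letter table instead

def transform(records):
    """Derive click-through rate per record."""
    return [{"campaign": r.campaign,
             "ctr": r.clicks / r.impressions if r.impressions else 0.0}
            for r in records]

def load(rows, sink):
    """Append transformed rows to a sink (a list here, a DB table in practice)."""
    sink.extend(rows)
    return len(rows)

raw = [{"campaign": "brand", "impressions": "1000", "clicks": "25"},
       {"campaign": "search", "impressions": "0", "clicks": "0"},
       {"campaign": "bad"}]  # malformed: dropped by extract
sink = []
loaded = load(transform(extract(raw)), sink)
```

Keeping each stage a small, independently testable function is the same modular, reusable, testable style the posting's OOP requirement points at.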

Posted 4 days ago

Apply

8.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Job Purpose: We are looking for a highly skilled and experienced Data Engineering professional to lead our data engineering team. The ideal candidate will possess a strong technical background, strong project management abilities, and excellent client handling/stakeholder management skills. This role requires a strategic thinker who can drive the design, development and implementation of data solutions that meet our clients’ needs while ensuring the highest standards of quality and efficiency. Job Responsibilities Technology Leadership – Lead and guide the team, independently or with little support, to design, implement and deliver complex cloud-based data engineering / data warehousing project assignments Managing projects in a fast-paced agile ecosystem and ensuring quality deliverables within stringent timelines Responsible for Risk Management, maintaining the Risk documentation and mitigation plans. Drive continuous improvement in a Lean/Agile environment, implementing DevOps delivery approaches encompassing CI/CD, build automation and deployments. Communication & Logical Thinking – Demonstrates strong analytical skills, employing a systematic and logical approach to data analysis, problem-solving, and situational assessment. Capable of effectively presenting and defending team viewpoints, while securing buy-in from both technical and client stakeholders. Handle Client Relationship – Manage client relationships and client expectations independently, and deliver results back to the client independently. Should have excellent communication skills. 
Work Experience Should have expertise and 8+ years of working experience in at least two ETL tools among Matillion, DBT, PySpark/Python, Informatica, and Talend Should have expertise and working experience in at least two databases among Databricks, Redshift, Snowflake, SQL Server, Oracle Should have strong Data Warehousing, Data Integration and Data Modeling fundamentals such as Star Schema, Snowflake Schema, Dimension Tables and Fact Tables. Strong experience with SQL building blocks, including creating complex SQL queries and procedures. Experience in AWS or Azure cloud and its service offerings Awareness of techniques such as Data Modeling, Performance Tuning and Regression Testing Willingness to learn and take ownership of tasks. Excellent written/verbal communication and problem-solving skills. Understanding of and working experience with Pharma commercial data sets such as IQVIA, Veeva, Symphony, Liquid Hub, Cegedim, etc. would be an advantage, as would experience working on pharma or life sciences domain projects Education BE/B.Tech, MCA, M.Sc., M.Tech with 60%+ Why Axtria: - Axtria is a global provider of cloud software and data analytics to the Life Sciences industry. We help Life Sciences companies transform the product commercialization journey to drive sales growth and improve healthcare outcomes for patients. We are acutely aware that our work impacts millions of patients, and we lead passionately to improve their lives. 
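The star-schema fundamentals the posting calls for (fact tables holding keys and measures, dimension tables holding attributes) can be illustrated with a toy lookup in plain Python. Table contents and names are made up; a warehouse would express this as a SQL join and GROUP BY:

```python
# Toy star schema: one fact table referencing two dimensions by surrogate key.
# All data is illustrative.
dim_product = {101: {"name": "Aspirin", "brand": "BrandX"},
               102: {"name": "Ibuprofen", "brand": "BrandY"}}
dim_region = {1: {"region": "North"}, 2: {"region": "South"}}

fact_sales = [  # (product_key, region_key, units)
    (101, 1, 30), (101, 2, 20), (102, 1, 50),
]

# Aggregate units by brand and region, the way a BI query would join the
# fact table to its dimensions and group by their attributes.
totals = {}
for product_key, region_key, units in fact_sales:
    key = (dim_product[product_key]["brand"], dim_region[region_key]["region"])
    totals[key] = totals.get(key, 0) + units
```

The point of the shape is that measures live once in the narrow fact table, while descriptive attributes are joined in from small dimensions at query time.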
We will provide– (Employee Value Proposition) Offer an inclusive environment that encourages diverse perspectives and ideas Deliver challenging and unique opportunities to contribute to the success of a transforming organization Opportunity to work on technical challenges that may impact across geographies Vast opportunities for self-development: online Axtria Institute, knowledge sharing opportunities globally, learning opportunities through external certifications Sponsored Tech Talks & Hackathons Possibility to relocate to any Axtria office for short and long-term projects Benefit package: Health benefits Retirement benefits Paid time off Flexible Benefits Hybrid /FT Office Axtria is an equal-opportunity employer that values diversity and inclusiveness in the workplace. A few more links are mentioned below, you may want to go through to know more about Axtria’s journey as an Organization, its culture, products and solutions offerings. For White papers: Research Hub: https://www.axtria.com/axtria-research-hub-pharmaceutical-industry/ For Axtria product and capability related content: 5 step guides: https://www.axtria.com/axtria-5-step-guides-sales-marketing-data-management-best-practices/ For recent marketing videos, including Jassi’s public discussions: Video Wall: https://www.axtria.com/video-wall/ Infographic Points of view on industry, Therapy areas etc.: https://www.axtria.com/video-wall/

Posted 4 days ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

🚨 Urgent Hiring: Python Developer - Full time 💼 Immediate Joiners Only (0-15 Days NP) 🚨 📍 Location: Pan India 💼 Experience: 6+ Yrs 💰 CTC: Up to ₹21 LPA 🕒 Join Within: Next 5 Days We're looking for Python Developers with strong experience in data warehousing applications, ideally with: ✅ 3-4 yrs Python (Pandas, Polars, etc.) ✅ 1-2 yrs Talend ETL tool (flexible) ✅ 1-2 yrs SQL/PLSQL development ⚡ Immediate joiners or candidates serving notice period (0–15 days) only! 📩 DM me or share profiles ASAP at rajesh@reveilletechnologies.com

Posted 4 days ago

Apply

6.0 years

0 Lacs

Ghaziabad, Uttar Pradesh, India

Remote

Experience: 6.00+ years Salary: Confidential (based on experience) Shift: (GMT+05:30) Asia/Kolkata (IST) Opportunity Type: Remote Placement Type: Full-time Permanent Position (*Note: This is a requirement for one of Uplers' clients - Netskope) What do you need for this opportunity? Must-have skills required: Airflow, LLMs, MLOps, Generative AI, Python Netskope is looking for: About The Role Please note, this team is hiring across all levels, and candidates are individually assessed and appropriately leveled based upon their skills and experience. The Data Engineering team builds and optimizes systems spanning data ingestion, processing, storage optimization and more. We work closely with engineers and the product team to build highly scalable systems that tackle real-world data problems and provide our customers with accurate, real-time, fault-tolerant solutions to their ever-growing data needs. We support various OLTP and analytics environments, including our Advanced Analytics and Digital Experience Management products. We are looking for skilled engineers experienced with building and optimizing cloud-scale distributed systems to develop our next-generation ingestion, processing and storage solutions. You will work closely with other engineers and the product team to build highly scalable systems that tackle real-world data problems. Our customers depend on us to provide accurate, real-time and fault-tolerant solutions to their ever-growing data needs. This is a hands-on, impactful role that will help lead development, validation, publishing and maintenance of logical and physical data models that support various OLTP and analytics environments. 
What's In It For You You will be part of a growing team of renowned industry experts in the exciting space of Data and Cloud Analytics Your contributions will have a major impact on our global customer-base and across the industry through our market-leading products You will solve complex, interesting challenges, and improve the depth and breadth of your technical and business skills. What You Will Be Doing Lead the design, development, and deployment of AI/ML models for threat detection, anomaly detection, and predictive analytics in cloud and network security. Architect and implement scalable data pipelines for processing large-scale datasets from logs, network traffic, and cloud environments. Apply MLOps best practices to deploy and monitor machine learning models in production. Collaborate with cloud architects and security analysts to develop cloud-native security solutions leveraging platforms like AWS, Azure, or GCP. Build and optimize Retrieval-Augmented Generation (RAG) systems by integrating large language models (LLMs) with vector databases for real-time, context-aware applications. Analyze network traffic, log data, and other telemetry to identify and mitigate cybersecurity threats. Ensure data quality, integrity, and compliance with GDPR, HIPAA, or SOC 2 standards. Drive innovation by integrating the latest AI/ML techniques into security products and services. Mentor junior engineers and provide technical leadership across projects. Required Skills And Experience AI/ML Expertise Proficiency in advanced machine learning techniques, including neural networks (e.g., CNNs, Transformers) and anomaly detection. Experience with AI frameworks like TensorFlow, PyTorch, and Scikit-learn. Strong understanding of MLOps practices and tools (e.g., MLflow, Kubeflow). Experience building and deploying Retrieval-Augmented Generation (RAG) systems, including integration with LLMs and vector databases. 
Data Engineering Expertise designing and optimizing ETL/ELT pipelines for large-scale data processing. Hands-on experience with big data technologies (e.g., Apache Spark, Kafka, Flink). Proficiency in working with relational and non-relational databases, including ClickHouse and BigQuery. Familiarity with vector databases such as Pinecone and PGVector and their application in RAG systems. Experience with cloud-native data tools like AWS Glue, BigQuery, or Snowflake. Cloud and Security Knowledge Strong understanding of cloud platforms (AWS, Azure, GCP) and their services. Experience with network security concepts, extended detection and response, and threat modeling. Software Engineering Proficiency in Python, Java, or Scala for data and ML solution development. Expertise in scalable system design and performance optimization for high-throughput applications. Leadership and Collaboration Proven ability to lead cross-functional teams and mentor engineers. Strong communication skills to present complex technical concepts to stakeholders. Education BSCS Or Equivalent Required, MSCS Or Equivalent Strongly Preferred How to apply for this opportunity? Step 1: Click On Apply! And Register or Login on our portal. Step 2: Complete the Screening Form & Upload updated Resume Step 3: Increase your chances to get shortlisted & meet the client for the Interview! About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their career. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal. Depending on the assessments you clear, you can apply for them as well). So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
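The Retrieval-Augmented Generation requirement above (LLMs plus a vector database) centers on one retrieval step, which can be sketched without any external service. The document vectors below are hand-made stand-ins; in practice they would come from an embedding model and live in a vector store such as Pinecone or a PGVector column:

```python
# Toy sketch of the retrieval step in a RAG system: score a query vector
# against stored document vectors by cosine similarity and return the top hit,
# which would then be passed to an LLM as grounding context.
# Document IDs and vectors are illustrative only.
import math

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

doc_store = {
    "phishing-playbook": [0.9, 0.1, 0.0],
    "vpn-setup-guide":   [0.1, 0.8, 0.3],
}

def retrieve(query_vec, k=1):
    """Rank stored docs by similarity to the query embedding; return top-k ids."""
    ranked = sorted(doc_store.items(),
                    key=lambda kv: cosine(query_vec, kv[1]), reverse=True)
    return [doc_id for doc_id, _ in ranked[:k]]

top = retrieve([0.85, 0.2, 0.05])  # a query embedding near the phishing doc
```

A production system swaps the dict for an approximate-nearest-neighbor index so retrieval stays fast over millions of vectors, but the ranking logic is the same.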

Posted 4 days ago

Apply

6.0 years

0 Lacs

Noida, Uttar Pradesh, India

Remote

Experience: 6.00+ years Salary: Confidential (based on experience) Shift: (GMT+05:30) Asia/Kolkata (IST) Opportunity Type: Remote Placement Type: Full-time Permanent Position (*Note: This is a requirement for one of Uplers' clients - Netskope) What do you need for this opportunity? Must-have skills required: Airflow, LLMs, MLOps, Generative AI, Python Netskope is looking for: About The Role Please note, this team is hiring across all levels, and candidates are individually assessed and appropriately leveled based upon their skills and experience. The Data Engineering team builds and optimizes systems spanning data ingestion, processing, storage optimization and more. We work closely with engineers and the product team to build highly scalable systems that tackle real-world data problems and provide our customers with accurate, real-time, fault-tolerant solutions to their ever-growing data needs. We support various OLTP and analytics environments, including our Advanced Analytics and Digital Experience Management products. We are looking for skilled engineers experienced with building and optimizing cloud-scale distributed systems to develop our next-generation ingestion, processing and storage solutions. You will work closely with other engineers and the product team to build highly scalable systems that tackle real-world data problems. Our customers depend on us to provide accurate, real-time and fault-tolerant solutions to their ever-growing data needs. This is a hands-on, impactful role that will help lead development, validation, publishing and maintenance of logical and physical data models that support various OLTP and analytics environments. 
What's In It For You You will be part of a growing team of renowned industry experts in the exciting space of Data and Cloud Analytics Your contributions will have a major impact on our global customer-base and across the industry through our market-leading products You will solve complex, interesting challenges, and improve the depth and breadth of your technical and business skills. What You Will Be Doing Lead the design, development, and deployment of AI/ML models for threat detection, anomaly detection, and predictive analytics in cloud and network security. Architect and implement scalable data pipelines for processing large-scale datasets from logs, network traffic, and cloud environments. Apply MLOps best practices to deploy and monitor machine learning models in production. Collaborate with cloud architects and security analysts to develop cloud-native security solutions leveraging platforms like AWS, Azure, or GCP. Build and optimize Retrieval-Augmented Generation (RAG) systems by integrating large language models (LLMs) with vector databases for real-time, context-aware applications. Analyze network traffic, log data, and other telemetry to identify and mitigate cybersecurity threats. Ensure data quality, integrity, and compliance with GDPR, HIPAA, or SOC 2 standards. Drive innovation by integrating the latest AI/ML techniques into security products and services. Mentor junior engineers and provide technical leadership across projects. Required Skills And Experience AI/ML Expertise Proficiency in advanced machine learning techniques, including neural networks (e.g., CNNs, Transformers) and anomaly detection. Experience with AI frameworks like TensorFlow, PyTorch, and Scikit-learn. Strong understanding of MLOps practices and tools (e.g., MLflow, Kubeflow). Experience building and deploying Retrieval-Augmented Generation (RAG) systems, including integration with LLMs and vector databases. 
Data Engineering Expertise designing and optimizing ETL/ELT pipelines for large-scale data processing. Hands-on experience with big data technologies (e.g., Apache Spark, Kafka, Flink). Proficiency in working with relational and non-relational databases, including ClickHouse and BigQuery. Familiarity with vector databases such as Pinecone and PGVector and their application in RAG systems. Experience with cloud-native data tools like AWS Glue, BigQuery, or Snowflake. Cloud and Security Knowledge Strong understanding of cloud platforms (AWS, Azure, GCP) and their services. Experience with network security concepts, extended detection and response, and threat modeling. Software Engineering Proficiency in Python, Java, or Scala for data and ML solution development. Expertise in scalable system design and performance optimization for high-throughput applications. Leadership and Collaboration Proven ability to lead cross-functional teams and mentor engineers. Strong communication skills to present complex technical concepts to stakeholders. Education BSCS Or Equivalent Required, MSCS Or Equivalent Strongly Preferred How to apply for this opportunity? Step 1: Click On Apply! And Register or Login on our portal. Step 2: Complete the Screening Form & Upload updated Resume Step 3: Increase your chances to get shortlisted & meet the client for the Interview! About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their career. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal. Depending on the assessments you clear, you can apply for them as well). So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!

Posted 4 days ago

Apply

6.0 years

0 Lacs

Agra, Uttar Pradesh, India

Remote

Experience: 6.00+ years Salary: Confidential (based on experience) Shift: (GMT+05:30) Asia/Kolkata (IST) Opportunity Type: Remote Placement Type: Full-time Permanent Position (*Note: This is a requirement for one of Uplers' clients - Netskope) What do you need for this opportunity? Must-have skills required: Airflow, LLMs, MLOps, Generative AI, Python Netskope is looking for: About The Role Please note, this team is hiring across all levels, and candidates are individually assessed and appropriately leveled based upon their skills and experience. The Data Engineering team builds and optimizes systems spanning data ingestion, processing, storage optimization and more. We work closely with engineers and the product team to build highly scalable systems that tackle real-world data problems and provide our customers with accurate, real-time, fault-tolerant solutions to their ever-growing data needs. We support various OLTP and analytics environments, including our Advanced Analytics and Digital Experience Management products. We are looking for skilled engineers experienced with building and optimizing cloud-scale distributed systems to develop our next-generation ingestion, processing and storage solutions. You will work closely with other engineers and the product team to build highly scalable systems that tackle real-world data problems. Our customers depend on us to provide accurate, real-time and fault-tolerant solutions to their ever-growing data needs. This is a hands-on, impactful role that will help lead development, validation, publishing and maintenance of logical and physical data models that support various OLTP and analytics environments. 
What's In It For You You will be part of a growing team of renowned industry experts in the exciting space of Data and Cloud Analytics Your contributions will have a major impact on our global customer-base and across the industry through our market-leading products You will solve complex, interesting challenges, and improve the depth and breadth of your technical and business skills. What You Will Be Doing Lead the design, development, and deployment of AI/ML models for threat detection, anomaly detection, and predictive analytics in cloud and network security. Architect and implement scalable data pipelines for processing large-scale datasets from logs, network traffic, and cloud environments. Apply MLOps best practices to deploy and monitor machine learning models in production. Collaborate with cloud architects and security analysts to develop cloud-native security solutions leveraging platforms like AWS, Azure, or GCP. Build and optimize Retrieval-Augmented Generation (RAG) systems by integrating large language models (LLMs) with vector databases for real-time, context-aware applications. Analyze network traffic, log data, and other telemetry to identify and mitigate cybersecurity threats. Ensure data quality, integrity, and compliance with GDPR, HIPAA, or SOC 2 standards. Drive innovation by integrating the latest AI/ML techniques into security products and services. Mentor junior engineers and provide technical leadership across projects. Required Skills And Experience AI/ML Expertise Proficiency in advanced machine learning techniques, including neural networks (e.g., CNNs, Transformers) and anomaly detection. Experience with AI frameworks like TensorFlow, PyTorch, and Scikit-learn. Strong understanding of MLOps practices and tools (e.g., MLflow, Kubeflow). Experience building and deploying Retrieval-Augmented Generation (RAG) systems, including integration with LLMs and vector databases. 
Data Engineering Expertise designing and optimizing ETL/ELT pipelines for large-scale data processing. Hands-on experience with big data technologies (e.g., Apache Spark, Kafka, Flink). Proficiency in working with relational and non-relational databases, including ClickHouse and BigQuery. Familiarity with vector databases such as Pinecone and PGVector and their application in RAG systems. Experience with cloud-native data tools like AWS Glue, BigQuery, or Snowflake. Cloud and Security Knowledge Strong understanding of cloud platforms (AWS, Azure, GCP) and their services. Experience with network security concepts, extended detection and response, and threat modeling. Software Engineering Proficiency in Python, Java, or Scala for data and ML solution development. Expertise in scalable system design and performance optimization for high-throughput applications. Leadership and Collaboration Proven ability to lead cross-functional teams and mentor engineers. Strong communication skills to present complex technical concepts to stakeholders. Education BSCS Or Equivalent Required, MSCS Or Equivalent Strongly Preferred How to apply for this opportunity? Step 1: Click On Apply! And Register or Login on our portal. Step 2: Complete the Screening Form & Upload updated Resume Step 3: Increase your chances to get shortlisted & meet the client for the Interview! About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their career. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal. Depending on the assessments you clear, you can apply for them as well). So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!

Posted 4 days ago

Apply

7.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Position Overview: As a Data Architect, you are responsible for designing and managing scalable, secure, and high-performance data architectures that support GEDU and customer needs. This role ensures that the GEDU’s data assets are structured and managed in a way that enables the business to generate insights, make data-driven decisions, and maintain data integrity across the GEDU and Customers. The Data Architect will work closely with business leaders, data engineers, data scientists, and IT teams to align the data architecture with the GEDU’s strategic goals. Key Responsibilities: Data Architecture Design: Design, develop, and maintain the enterprise data architecture, including data models, database schemas, and data flow diagrams. Develop a data strategy and roadmap that aligns with GEDU business objectives and ensures the scalability of data systems. Architect both transactional (OLTP) and analytical (OLAP) databases, ensuring optimal performance and data consistency. Data Integration & Management: Oversee the integration of disparate data sources into a unified data platform, leveraging ETL/ELT processes and data integration tools. Design and implement data warehousing solutions, data lakes, and/or data marts that enable efficient storage and retrieval of large datasets. Ensure proper data governance, including the definition of data ownership, security, and privacy controls in accordance with compliance standards (GDPR, HIPAA, etc.). Collaboration with Stakeholders: Work closely with business stakeholders, including analysts, developers, and executives, to understand data requirements and ensure that the architecture supports analytics and reporting needs. Collaborate with DevOps and engineering teams to optimize database performance and support large-scale data processing pipelines. 
Technology Leadership: Guide the selection of data technologies, including databases (SQL/NoSQL), data processing frameworks (Hadoop, Spark), cloud platforms (Azure is a must), and analytics tools. Stay updated on emerging data management technologies, trends, and best practices, and assess their potential application within the organization. Data Quality & Security: Define data quality standards and implement processes to ensure the accuracy, completeness, and consistency of data across all systems. Establish protocols for data security, encryption, and backup/recovery to protect data assets and ensure business continuity. Mentorship & Leadership: Lead and mentor data engineers, data modelers, and other technical staff in best practices for data architecture and management. Provide strategic guidance on data-related projects and initiatives, ensuring that all efforts are aligned with the enterprise data strategy. Extensive Data Architecture Expertise: Over 7 years of experience in data architecture, data modeling, and database management. Proficiency in designing and implementing relational (SQL) and non-relational (NoSQL) database solutions. Strong experience with data integration tools (Azure Tools are a must + any other third-party tools), ETL/ELT processes, and data pipelines. Advanced Knowledge of Data Platforms: Expertise in Azure cloud data platform is a must. Other platforms such as AWS (Redshift, S3), Azure (Data Lake, Synapse), and/or Google Cloud Platform (BigQuery, Dataproc) is a bonus. Experience with big data technologies (Hadoop, Spark) and distributed systems for large-scale data processing. Hands-on experience with data warehousing solutions and BI tools (e.g., Power BI, Tableau, Looker). Data Governance & Compliance: Strong understanding of data governance principles, data lineage, and data stewardship. 
Knowledge of industry standards and compliance requirements (e.g., GDPR, HIPAA, SOX) and the ability to architect solutions that meet these standards. Technical Leadership: Proven ability to lead data-driven projects, manage stakeholders, and drive data strategies across the enterprise. Strong programming skills in languages such as Python, SQL, R, or Scala. Pre-Sales Responsibilities: Stakeholder Engagement: Work with product stakeholders to analyze functional and non-functional requirements, ensuring alignment with business objectives. Solution Development: Develop end-to-end solutions involving multiple products, ensuring security and performance benchmarks are established, achieved, and maintained. Proof of Concepts (POCs): Develop POCs to demonstrate the feasibility and benefits of proposed solutions. Client Communication: Communicate system requirements and solution architecture to clients and stakeholders, providing technical assistance and guidance throughout the pre-sales process. Technical Presentations: Prepare and deliver technical presentations to prospective clients, demonstrating how proposed solutions meet their needs and requirements. To know our privacy policy, please click the link below: https://gedu.global/wp-content/uploads/2023/09/GEDU-Privacy-Policy-22092023-V2.0-1.pdf
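The data-quality responsibilities above (accuracy, completeness, consistency) are typically enforced as rule checks applied before records land in the warehouse. The sketch below is a hedged illustration only; the field names and rules are invented and not taken from any GEDU system.

```python
# Required fields for a hypothetical student record (illustrative only).
REQUIRED_FIELDS = {"student_id", "email", "enrolment_date"}

def check_record(record):
    """Return a list of data-quality rule violations for one record."""
    issues = []
    # Completeness: every required field must be present and non-empty.
    populated = {k for k, v in record.items() if v not in (None, "")}
    issues += [f"missing:{f}" for f in sorted(REQUIRED_FIELDS - populated)]
    # Consistency: a populated email must at least contain an '@'.
    email = record.get("email") or ""
    if email and "@" not in email:
        issues.append("invalid:email")
    return issues

good = {"student_id": "S1", "email": "a@b.com", "enrolment_date": "2024-01-10"}
bad = {"student_id": "S2", "email": "not-an-email", "enrolment_date": ""}
```

Records failing checks would be routed to a quarantine area for stewardship review rather than loaded, which is one common way to operationalise the governance standards the role mentions.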

Posted 4 days ago

Apply

12.0 years

0 Lacs

Gurugram, Haryana, India

On-site

About The Role OSTTRA India The Role: Technical Architect The Team: The OSTTRA Technology team is composed of Capital Markets Technology professionals, who build, support and protect the applications that operate our network. The technology landscape includes high-performance, high-volume applications as well as compute-intensive applications, leveraging contemporary microservices and cloud-based architectures. The Impact: Together, we build, support, protect and manage high-performance, resilient platforms that process more than 100 million messages a day. Our services are vital to automated trade processing around the globe, managing peak volumes and working with our customers and regulators to ensure the efficient settlement of trades and effective operation of global capital markets. What’s in it for you: The current objective is to identify individuals with 12+ years of experience and deep expertise to join an existing team of experts who are spread across the world. This is your opportunity to start at the beginning and get the advantages of rapid early growth. This role is based in Gurgaon and is expected to work with different teams and colleagues across the globe. Responsibilities The role shall be responsible for establishing, maintaining, socialising, and realising the target state of Product Architecture for the post-trade businesses of OSTTRA. This shall encompass all services that OSTTRA offers for these businesses and all the systems which enable those services. We are looking for a person who is high on energy and motivation and who feels challenged by difficult problems. The role shall partner with portfolio delivery leads, programme managers, portfolio business leads and horizontal technical architects to frame the strategy, to provide solutions for planned programmes and to guide the roadmaps. 
He/she shall be able to build high-level designs and low-level technical solutions, considering factors such as scalability, performance, security, maintainability and cost-effectiveness. The role shall own the technical and architectural decisions for the projects and products. He/she shall review the designs and own the design quality. They will ensure that there is a robust code/implementation review practice in the product. Likewise, they shall be responsible for the robust CI/CD and DevSecOps engineering pipelines being used in the projects. He/she shall provide ongoing support on design and architecture problems to the delivery teams. The role shall manage the tech debt log and plan for its remediation across deliveries and roadmaps. The role shall maintain the living Architecture Reference Documents for the products. They shall actively partner with horizontal technical architects to factor tech constructs within their portfolios and to ensure vibrant feedback to the technical strategies. They shall be responsible for guiding the L3/L2 teams when needed in the resolution of production situations and incidents. They shall be responsible for defining guidelines and system designs for DR strategies and BCP plans for the products. They shall be responsible for architecting key mission-critical system components, reviewing designs and helping uplift them. He/she should perform critical technical reviews of changes to applications or infrastructure. The role shall enable an ecosystem such that the functional API, message, data and flow models within the products of the portfolio are well documented. 
And shall also provide strong governance and oversight of the same. What We’re Looking For Rich domain experience in the financial services industry, preferably with financial markets within pre/post-trade life cycles or large-scale buy/sell/brokerage organisations. Should have experience in architecture design for multiple products and for large-scale change programmes. Should be adept with application development and engineering methods and tools. Should have robust experience with microservices applications and services development and integration. Should be adept with development tools, contemporary runtimes, and observability stacks for microservices. Should have experience of modelling for APIs, messages and possibly data. Should have experience of complex migrations, including data migration. Should have experience in architecture and design of highly resilient, high-availability, high-volume applications. Should be able to initiate or contribute to initiatives around reliability and resilience of applications. Rich experience of architectural patterns like MVC-based front-end applications, API and event-driven architectures, event streaming, message processing/orchestration, CQRS and possibly event sourcing etc. Experience of protocols or integration technologies like HTTP, MQ, FTP, REST/API and possibly FIX/SWIFT etc. Experience of messaging formats and paradigms like XSD, XML, XSLT, JSON, REST and possibly gRPC, GraphQL etc. Experience of technologies like Kafka, Spark Streams, Kubernetes/EKS, API gateways, web and application servers, message queuing infrastructure, and data transformation/ETL tools. Experience of languages like Java and Python; application development frameworks like Spring Boot/family, Apache family and commonplace AWS or other cloud provider services. Experience of engineering methods like CI/CD, build/deploy automation, infra as code and unit/integration testing methods and tools. 
Should have an appetite to review and code for complex problems and should find interest and energy in doing design discussions and reviews. Experience of development with NoSQL and relational databases is required. Should have active or prior experience with MVC web development or with contemporary React/Angular frameworks. Should have experience of migrating monolithic applications to a cloud-based solution, with an understanding of defining domain-based service responsibilities. Should have rich experience of designing cloud-native architectures, including microservices, serverless computing, containerization (Docker, Kubernetes) on relevant platforms (GCP/AWS), and monitoring aspects. The Location: Gurgaon, India About Company Statement OSTTRA is a market leader in derivatives post-trade processing, bringing innovation, expertise, processes and networks together to solve the post-trade challenges of global financial markets. OSTTRA operates cross-asset post-trade processing networks, providing a proven suite of Credit Risk, Trade Workflow and Optimisation services. Together these solutions streamline post-trade workflows, enabling firms to connect to counterparties and utilities, manage credit risk, reduce operational risk and optimise processing to drive post-trade efficiencies. OSTTRA was formed in 2021 through the combination of four businesses that have been at the heart of post trade evolution and innovation for the last 20+ years: MarkitServ, Traiana, TriOptima and Reset. These businesses have an exemplary track record of developing and supporting critical market infrastructure and bring together an established community of market participants comprising all trading relationships and paradigms, connected using powerful integration and transformation capabilities. About OSTTRA Candidates should note that OSTTRA is an independent firm, jointly owned by S&P Global and CME Group. 
As part of the joint venture, S&P Global provides recruitment services to OSTTRA - however, successful candidates will be interviewed and directly employed by OSTTRA, joining our global team of more than 1,200 post trade experts. OSTTRA is a joint venture, owned 50/50 by S&P Global and CME Group. With an outstanding track record of developing and supporting critical market infrastructure, our combined network connects thousands of market participants to streamline end to end workflows - from trade capture at the point of execution, through portfolio optimization, to clearing and settlement. Joining the OSTTRA team is a unique opportunity to help build a bold new business with an outstanding heritage in financial technology, playing a central role in supporting global financial markets. Learn more at www.osttra.com. What’s In It For You? Benefits We take care of you, so you can take care of business. We care about our people. That’s why we provide everything you—and your career—need to thrive at S&P Global. Our Benefits Include Health & Wellness: Health care coverage designed for the mind and body. Flexible Downtime: Generous time off helps keep you energized for your time on. Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills. Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs. Family Friendly Perks: It’s not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families. Beyond the Basics: From retail discounts to referral incentive awards—small perks can make a big difference. 
For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries Recruitment Fraud Alert If you receive an email from a spglobalind.com domain or any other regionally based domains, it is a scam and should be reported to reportfraud@spglobal.com. S&P Global never requires any candidate to pay money for job applications, interviews, offer letters, “pre-employment training” or for equipment/delivery of equipment. Stay informed and protect yourself from recruitment fraud by reviewing our guidelines, fraudulent domains, and how to report suspicious activity here. Equal Opportunity Employer S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to: EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person. US Candidates Only: The EEO is the Law Poster http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision - https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf 20 - Professional (EEO-2 Job Categories-United States of America), BSMGMT203 - Entry Professional (EEO Job Group) Job ID: 315820 Posted On: 2025-07-10 Location: Gurgaon, Haryana, India
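Platforms processing 100+ million messages a day, as described above, commonly rely on idempotent consumers so that duplicate deliveries do not double-process a trade. A minimal sketch of that pattern follows; the message shape and deduplication rule are assumptions for illustration, not OSTTRA's actual design.

```python
import queue

def process_messages(inbox, seen_ids):
    """Consume trade messages, skipping duplicate deliveries by message id."""
    settled = []
    while not inbox.empty():
        msg = inbox.get()
        if msg["id"] in seen_ids:  # duplicate delivery: ignore
            continue
        seen_ids.add(msg["id"])
        settled.append(msg["trade"])  # stand-in for real settlement work
    return settled

inbox = queue.Queue()
for m in [{"id": "T1", "trade": "buy 100 XYZ"},
          {"id": "T2", "trade": "sell 50 ABC"},
          {"id": "T1", "trade": "buy 100 XYZ"}]:  # duplicate of T1
    inbox.put(m)

result = process_messages(inbox, set())
```

In a real deployment the inbox would be a Kafka topic or MQ queue and the `seen_ids` set a durable store, but the idempotency check sits in the same place.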

Posted 4 days ago

Apply

0 years

0 Lacs

Gurugram, Haryana, India

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. Fueled by strategic investment in technology and innovation, Client Technology seeks to drive growth opportunities and solve complex business problems for our clients through building a robust platform for business and a powerful product engine that are integral to innovation at scale. You will work with technologists and business specialists, blending EY’s deep industry knowledge and innovative ideas with our platforms, capabilities, and technical expertise. As a champion for change and growth, you will be at the forefront of integrating emerging technologies from AI to Data Analytics into every corner of what we do at EY. That means more growth for you, exciting learning opportunities, career choices, and the chance to make a real impact. The opportunity: We are looking for a highly experienced Power BI developer who will be part of the Data Engineering team within EY Client Technology’s Advanced Analytics Team. The candidate will be responsible for designing, developing and maintaining Power BI data models, reports and dashboards. If you are passionate about business intelligence and analytics and have a knack for turning complex data into actionable insights, we want to hear from you. 
To qualify for the role, you must have: Strong proficiency in Power BI, including DAX and the Power Query formula language (M-language). Advanced understanding of data modeling, data warehousing and ETL techniques. Designed, developed and maintained Power BI reports (including paginated reports) and dashboards to support business decision-making processes. Designed, developed and implemented Power BI data models for complex and large-scale enterprise environments. Proven experience with deploying and optimizing large datasets. Proficiency in SQL and other data querying languages. Strong collaboration, analytical, interpersonal and communication abilities. Ideally, you’ll also have: Bachelor's degree in Computer Science, Information Technology, Data Science, or a related field. Microsoft Power BI certification. Experience with other BI tools. Worked within large teams to successfully implement Power BI solutions. Sound knowledge of the software development lifecycle and experience with Git. Ability to propose solutions by recalling best practices learned from Microsoft documentation, whitepapers and community publications. What we look for: We want people who are self-starters, who can take initiative and get things done. If you can think critically and creatively to solve problems, you will excel. You should be comfortable working with culturally diverse outsourced on/offshore team members which means you may need to work outside of the normal working hours in your time zone to partner with other Client Technology staff globally. Some travel may also be required, both domestic and international. What we offer: As part of this role, you'll work in a highly integrated, global team with the opportunity and tools to grow, develop and drive your career forward. Here, you can combine global opportunity with flexible working. The EY benefits package goes above and beyond too, focusing on your physical, emotional, financial and social well-being. 
Your recruiter can talk to you about the benefits available in your country. Here’s a snapshot of what we offer: Continuous learning: You’ll develop the mindset and skills to navigate whatever comes next. Success as defined by you: We’ll provide the tools and flexibility, so you can make a meaningful impact, your way. Transformative leadership: We’ll give you the insights, coaching and confidence to be the leader the world needs. Diverse and inclusive culture: You’ll be embraced for who you are and empowered to use your voice to help others find theirs. EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 4 days ago

Apply

6.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Your responsibilities include, but are not limited to: Participate in overall architecture, capacity planning, development, and implementation of Master Data Management (MDM) solutions. Using MDM technologies and tools across an enterprise to enable the management and integration of master data. Understand the current technical landscape as well as the desired future state. Assess the current state architecture and understand current business processes for managing Master Data Management solutions. Assess the functional and non-functional requirements of the desired future state MDM solution. Prepare the to-be architecture including data ingestion, data quality rules, data model, match/merge, workflows, UI, batch integration and real-time services. Extensive hands-on experience in installation and configuration of core Informatica MDM Hub components such as Hub Console, Hub Store, Hub Server, Cleanse/Match Server and Cleanse Adaptor. Ability to deliver full lifecycle MDM projects for clients including data modeling, metadata management, design and configuration of matching and merging rules, and design and configuration of standardizing, cleansing and deduplication rules. Ability to fine-tune and optimize MDM Hub performance. Hands-on experience with ActiveVOS, Informatica MDM Service Integration Framework (SIF) and Informatica Business Entity Services (BES). Create design documents and data models addressing business needs for the client MDM environment. Contribute to creating reusable assets and accelerators for MDM platforms. Will also be involved in integration/transfer of data across multiple systems, streamlining data processes and providing access to MDM data across the enterprise. Make technology decisions related to the client MDM environment, and interpret requirements and architect MDM solutions. Provide subject matter expertise on data architecture and data integration implementations across various downstream systems. 
Coordinate with Project Managers and participate in project planning and recurring meetings. Collaborate with other team members to review prototypes and develop iterative revisions. Must-have skills: 6+ years of experience, with hands-on experience working on MDM projects and in one or more MDM tools like Informatica or Reltio, and expertise in defining matching/merging and survivorship rules. Hands-on experience in industry data quality tools like Informatica IDQ, IBM Data Quality. Must be proficient in reading and understanding data models, with experience working with data and databases. Must be comfortable with the concept of services (important for integrating with an ESB for operational MDM). Strong technical experience in the areas of Master Data Management, metadata management, Data Quality, Data Governance, Data Integration (ETL) and Data Security. Experience with all stages of the MDM SDLC: planning, designing, building, deploying and maintaining scalable, highly available, mission-critical enterprise-wide applications for large enterprises. Should have experience in integrating MDM with Data Warehouses and Data Lakes. Excellent query-writing skills with working knowledge of Oracle, SQL Server, and other major databases. Good knowledge of SOA/real-time integration, the pub-sub model and data integration with various CRM systems like Veeva and Siebel. Should have strong commercial knowledge of key business processes and compliance requirements within the pharma industry across multiple master data domains like Physician and Product. Should have experience working with third-party data providers like IMS, LASH, HMS etc. Expertise in engaging with business users to understand the business requirements and articulate the value proposition. Why consider Axtria? Axtria is a data analytics and software technology company – we focus heavily on sales and marketing functions in the life sciences. 
We provide cloud-based solutions to help life science clients with digital transformation of their commercial operations. Axtria combines strong process knowledge of pharma commercial operations, data analytics and software. Our cloud platforms SalesIQ™, MarketingIQ™ and DataMAx™ are the most advanced and built specifically for the life-science industry. A strong track record in delivering customer value by being innovative, flexible and transparent has enabled Axtria to become the fastest-growing and now one of the largest providers of pharma commercial operations solutions. In just eight years, the company has grown to over 950 employees. We work with over 75 life-science customers, including many large and specialty companies. Axtria is looking for exceptional talent to join our rapidly growing global team. People are our biggest perk! Our transparent and collaborative culture offers a chance to work with some of the brightest minds in the industry. Axtria Institute, our in-house university, offers the best training in the industry and an opportunity to learn in a structured environment. A customized career progression plan ensures every associate is set up for success and able to do meaningful work in a fun environment. We want our legacy to be the leaders we produce for the industry. Will you be next?
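The matching, merging and survivorship rules the MDM role above asks for can be sketched at a toy scale: duplicate records are grouped by a normalized match key, and the surviving value for each attribute is taken from the most trusted source. The source names, trust order, and record shape below are illustrative assumptions, not Informatica or Reltio configuration.

```python
# Illustrative source trust ranking (lower = more trusted).
SOURCE_TRUST = {"CRM": 1, "IMS": 2, "WEB": 3}

def match_key(record):
    """Simple deterministic match rule: normalized name + postcode."""
    return (record["name"].strip().lower(), record["postcode"])

def merge_group(records):
    """Survivorship: take each attribute from the most trusted source that supplies it."""
    golden = {}
    for rec in sorted(records, key=lambda r: SOURCE_TRUST[r["source"]]):
        for field, value in rec.items():
            if field != "source" and value and field not in golden:
                golden[field] = value
    return golden

records = [
    {"source": "WEB", "name": "Dr A. Rao ", "postcode": "560001", "phone": "999"},
    {"source": "CRM", "name": "dr a. rao", "postcode": "560001", "phone": ""},
]
groups = {}
for r in records:
    groups.setdefault(match_key(r), []).append(r)
golden_records = [merge_group(g) for g in groups.values()]
```

Real MDM hubs add fuzzy matching, match-score thresholds and per-attribute survivorship strategies, but the group-then-pick structure is the same.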

Posted 4 days ago

Apply

0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. Fueled by strategic investment in technology and innovation, Client Technology seeks to drive growth opportunities and solve complex business problems for our clients through building a robust platform for business and a powerful product engine that are integral to innovation at scale. You will work with technologists and business specialists, blending EY’s deep industry knowledge and innovative ideas with our platforms, capabilities, and technical expertise. As a champion for change and growth, you will be at the forefront of integrating emerging technologies from AI to Data Analytics into every corner of what we do at EY. That means more growth for you, exciting learning opportunities, career choices, and the chance to make a real impact. The opportunity: We are looking for a highly experienced Power BI developer who will be part of the Data Engineering team within EY Client Technology’s Advanced Analytics Team. The candidate will be responsible for designing, developing and maintaining Power BI data models, reports and dashboards. If you are passionate about business intelligence and analytics and have a knack for turning complex data into actionable insights, we want to hear from you. 
To qualify for the role, you must have: Strong proficiency in Power BI, including DAX and the Power Query formula language (M-language). Advanced understanding of data modeling, data warehousing and ETL techniques. Designed, developed and maintained Power BI reports (including paginated reports) and dashboards to support business decision-making processes. Designed, developed and implemented Power BI data models for complex and large-scale enterprise environments. Proven experience with deploying and optimizing large datasets. Proficiency in SQL and other data querying languages. Strong collaboration, analytical, interpersonal and communication abilities. Ideally, you’ll also have: Bachelor's degree in Computer Science, Information Technology, Data Science, or a related field. Microsoft Power BI certification. Experience with other BI tools. Worked within large teams to successfully implement Power BI solutions. Sound knowledge of the software development lifecycle and experience with Git. Ability to propose solutions by recalling best practices learned from Microsoft documentation, whitepapers and community publications. What we look for: We want people who are self-starters, who can take initiative and get things done. If you can think critically and creatively to solve problems, you will excel. You should be comfortable working with culturally diverse outsourced on/offshore team members which means you may need to work outside of the normal working hours in your time zone to partner with other Client Technology staff globally. Some travel may also be required, both domestic and international. What we offer: As part of this role, you'll work in a highly integrated, global team with the opportunity and tools to grow, develop and drive your career forward. Here, you can combine global opportunity with flexible working. The EY benefits package goes above and beyond too, focusing on your physical, emotional, financial and social well-being. 
Your recruiter can talk to you about the benefits available in your country. Here’s a snapshot of what we offer: Continuous learning: You’ll develop the mindset and skills to navigate whatever comes next. Success as defined by you: We’ll provide the tools and flexibility, so you can make a meaningful impact, your way. Transformative leadership: We’ll give you the insights, coaching and confidence to be the leader the world needs. Diverse and inclusive culture: You’ll be embraced for who you are and empowered to use your voice to help others find theirs. EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 4 days ago

Apply

0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. Fueled by strategic investment in technology and innovation, Client Technology seeks to drive growth opportunities and solve complex business problems for our clients through building a robust platform for business and a powerful product engine that are integral to innovation at scale. You will work with technologists and business specialists, blending EY’s deep industry knowledge and innovative ideas with our platforms, capabilities, and technical expertise. As a champion for change and growth, you will be at the forefront of integrating emerging technologies from AI to Data Analytics into every corner of what we do at EY. That means more growth for you, exciting learning opportunities, career choices, and the chance to make a real impact. The opportunity: We are looking for a highly experienced Power BI developer who will be part of the Data Engineering team within EY Client Technology’s Advanced Analytics Team. The candidate will be responsible for designing, developing and maintaining Power BI data models, reports and dashboards. If you are passionate about business intelligence and analytics and have a knack for turning complex data into actionable insights, we want to hear from you. 
To qualify for the role, you must have:
Strong proficiency in Power BI, including DAX and the Power Query formula language (M language).
Advanced understanding of data modeling, data warehousing and ETL techniques.
Experience designing, developing and maintaining Power BI reports (including paginated reports) and dashboards to support business decision-making.
Experience designing, developing and implementing Power BI data models for complex, large-scale enterprise environments.
Proven experience deploying and optimizing large datasets.
Proficiency in SQL and other data querying languages.
Strong collaboration, analytical, interpersonal and communication abilities.
Ideally, you’ll also have:
Bachelor's degree in Computer Science, Information Technology, Data Science, or a related field.
Microsoft Power BI certification.
Experience with other BI tools.
Experience working within large teams to successfully implement Power BI solutions.
Sound knowledge of the software development lifecycle and experience with Git.
Ability to propose solutions by recalling best practices learned from Microsoft documentation, whitepapers and community publications.
What we look for: We want people who are self-starters, who can take initiative and get things done. If you can think critically and creatively to solve problems, you will excel. You should be comfortable working with culturally diverse outsourced on/offshore team members, which means you may need to work outside the normal working hours in your time zone to partner with Client Technology staff globally. Some travel, both domestic and international, may also be required.
What we offer: As part of this role, you'll work in a highly integrated, global team with the opportunity and tools to grow, develop and drive your career forward. Here, you can combine global opportunity with flexible working. The EY benefits package goes above and beyond too, focusing on your physical, emotional, financial and social well-being.
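In a BI context, the data-modeling requirement usually means a star schema: a fact table of measures keyed into dimension tables. A minimal sketch in plain Python to illustrate the idea (table names and values are invented; in Power BI this is expressed as model relationships and DAX measures):

```python
# Toy star schema: one fact table whose keys resolve through dimension tables.
# All table and column names here are hypothetical illustrations.
dim_date = {1: "2024-01-01", 2: "2024-01-02"}
dim_product = {10: "Widget", 20: "Gadget"}
fact_sales = [
    {"date_key": 1, "product_key": 10, "amount": 100.0},
    {"date_key": 1, "product_key": 20, "amount": 250.0},
    {"date_key": 2, "product_key": 10, "amount": 75.0},
]

def sales_by_product(facts, products):
    """Aggregate the fact table, resolving keys through a dimension."""
    totals = {}
    for row in facts:
        name = products[row["product_key"]]
        totals[name] = totals.get(name, 0.0) + row["amount"]
    return totals

print(sales_by_product(fact_sales, dim_product))
# {'Widget': 175.0, 'Gadget': 250.0}
```

The separation matters because the fact table stays narrow (keys and measures only) while descriptive attributes live once in the dimensions, which is what keeps large Power BI models compact and fast.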

Posted 4 days ago

0 years

0 Lacs

Mumbai Metropolitan Region

On-site

Key Responsibilities
Design and develop scalable ETL pipelines using Cloud Functions, Cloud Dataproc (Spark), and BigQuery as the central data warehouse for large-scale batch and transformation workloads.
Implement efficient data modeling techniques in BigQuery (including star/snowflake schemas, partitioning, and clustering) to support high-performance analytics and reduce query costs.
Build end-to-end ingestion frameworks leveraging Cloud Pub/Sub and Cloud Functions for real-time and event-driven data capture.
Use Apache Airflow (Cloud Composer) to orchestrate complex data workflows and manage dependencies.
Apply Cloud Data Fusion and Datastream selectively to integrate specific sources (e.g., databases and legacy systems) into the pipeline.
Develop strong backtracking and troubleshooting workflows to quickly identify data issues, job failures, and pipeline bottlenecks, ensuring consistent data delivery and SLA compliance.
Integrate robust monitoring, alerting, and logging to ensure data quality, integrity, and observability.
Tech stack
GCP: BigQuery, Cloud Functions, Cloud Dataproc (Spark), Pub/Sub, Data Fusion, Datastream
Orchestration: Apache Airflow (Cloud Composer)
Languages: Python, SQL, PySpark
Concepts: Data Modeling, ETL/ELT, Streaming & Batch Processing, Schema Management, Monitoring & Logging
Key data sources (candidates must know the ingestion technique for each): CRM systems (cloud-based and internal), Salesforce, Teradata, MySQL, APIs, and other third-party and internal operational systems.
Skills: ETL/ELT, Cloud Data Fusion, Schema Management, SQL, PySpark, Cloud Dataproc (Spark), Monitoring & Logging, Data Modeling, BigQuery, Cloud Pub/Sub, Python, GCP, Streaming & Batch Processing, Datastream, Cloud Functions, Spark, Apache Airflow (Cloud Composer)
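The partitioning point in this listing is about limiting how much data a query scans: BigQuery prunes partitions that a date filter rules out, and billing follows bytes scanned. A toy model of that pruning in plain Python (not the BigQuery API; dates and row counts are invented):

```python
# Miniature model of partition pruning: rows are grouped by date partition,
# and a filtered query only "scans" the matching partition.
from collections import defaultdict

def build_partitions(rows):
    parts = defaultdict(list)
    for row in rows:
        parts[row["event_date"]].append(row)
    return parts

def scanned_rows(partitions, date_filter=None):
    """Without a filter every partition is read; with one, only its partition."""
    if date_filter is None:
        return sum(len(v) for v in partitions.values())
    return len(partitions.get(date_filter, []))

rows = [{"event_date": f"2024-01-{d:02d}", "value": i}
        for i, d in enumerate([1, 1, 2, 2, 2, 3])]
parts = build_partitions(rows)
print(scanned_rows(parts))                # 6 rows: full-table scan
print(scanned_rows(parts, "2024-01-02"))  # 3 rows: pruned scan
```

Clustering works the same way one level down, ordering rows inside each partition so filters on the clustered columns skip blocks as well.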

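Pub/Sub delivers messages at least once, so an ingestion framework like the one described in this listing typically dedupes on message ID to stay idempotent under redelivery. A minimal sketch of that pattern (message shapes and field names are hypothetical, not the google-cloud-pubsub API):

```python
# At-least-once delivery means the same message can arrive more than once;
# tracking seen message IDs keeps the downstream load idempotent.
def ingest(messages, seen_ids, sink):
    for msg in messages:
        if msg["message_id"] in seen_ids:
            continue  # duplicate redelivery: skip, do not load twice
        seen_ids.add(msg["message_id"])
        sink.append(msg["data"])

seen, sink = set(), []
batch = [
    {"message_id": "m1", "data": {"order": 1}},
    {"message_id": "m2", "data": {"order": 2}},
    {"message_id": "m1", "data": {"order": 1}},  # redelivered duplicate
]
ingest(batch, seen, sink)
print(len(sink))  # 2 unique records loaded
```

In production the seen-ID set would live in durable storage (or be replaced by a MERGE on a natural key in BigQuery), since an in-memory set does not survive function restarts.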
Posted 4 days ago