8.0 years
0 Lacs
Greater Kolkata Area
On-site
Location: PAN India | Duration: 6 Months | Experience Required: 7–8 years

Job Summary
We are looking for an experienced SSAS Developer with strong expertise in developing both OLAP and Tabular models using SQL Server Analysis Services (SSAS), alongside advanced ETL development skills using tools like SSIS, Informatica, or Azure Data Factory. The ideal candidate will be well versed in T-SQL, dimensional modeling, and building high-performance, scalable data solutions.

Key Responsibilities
- Design, build, and maintain SSAS OLAP cubes and Tabular models
- Create complex DAX and MDX queries for analytical use cases
- Develop robust ETL workflows and pipelines using SSIS, Informatica, or ADF
- Collaborate with cross-functional teams to translate business requirements into BI solutions
- Optimize SSAS models for scalability and performance
- Implement best practices in data modeling, version control, and deployment automation
- Support dashboarding and reporting needs via Power BI, Excel, or Tableau
- Maintain and troubleshoot data quality, performance, and integration issues

Must-Have Skills
- Hands-on experience with SSAS (Tabular & Multidimensional)
- Proficiency in DAX, MDX, and T-SQL
- Advanced ETL skills using SSIS, Informatica, or Azure Data Factory
- Knowledge of dimensional modeling (star & snowflake schemas)
- Experience with Azure SQL / MS SQL Server
- Familiarity with Git and CI/CD pipelines

Nice to Have
- Exposure to cloud data platforms (Azure Synapse, Snowflake, AWS Redshift)
- Working knowledge of Power BI or similar BI tools
- Understanding of Agile/Scrum methodology
- Bachelor's degree in Computer Science, Information Systems, or equivalent
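The dimensional-modeling skill this posting names (star schema, analytical queries over a fact table) can be illustrated with a minimal sketch. All table and column names here are hypothetical, and SQLite stands in for SQL Server so the example is self-contained; it is an illustration of the pattern, not this employer's schema.

```python
import sqlite3

# Minimal star schema: one fact table keyed to two dimension tables.
# Names are hypothetical; SQLite stands in for SQL Server.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

cur.executescript("""
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, product_name TEXT, category TEXT);
CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY, calendar_date TEXT, year INTEGER);
CREATE TABLE fact_sales  (
    product_key INTEGER REFERENCES dim_product(product_key),
    date_key    INTEGER REFERENCES dim_date(date_key),
    quantity    INTEGER,
    amount      REAL
);
""")

cur.executemany("INSERT INTO dim_product VALUES (?, ?, ?)",
                [(1, "Widget", "Hardware"), (2, "Gadget", "Hardware")])
cur.executemany("INSERT INTO dim_date VALUES (?, ?, ?)",
                [(20240101, "2024-01-01", 2024), (20240102, "2024-01-02", 2024)])
cur.executemany("INSERT INTO fact_sales VALUES (?, ?, ?, ?)",
                [(1, 20240101, 3, 30.0), (2, 20240101, 1, 25.0), (1, 20240102, 2, 20.0)])

# Typical analytical query: aggregate the fact table by dimension attributes.
cur.execute("""
SELECT p.category, d.year, SUM(f.amount) AS total_amount
FROM fact_sales f
JOIN dim_product p ON p.product_key = f.product_key
JOIN dim_date d    ON d.date_key = f.date_key
GROUP BY p.category, d.year
""")
print(cur.fetchall())  # [('Hardware', 2024, 75.0)]
```

In an SSAS Tabular or Multidimensional model, these same dimension/fact relationships become the model's relationships, and the GROUP BY aggregate is what a DAX or MDX measure computes at query time.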
Posted 1 week ago
10.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Description
SSIS, ADF, ETL

Requirements
- Bachelor's degree in computer science or a related field, or equivalent work experience, with 10+ years in SQL Server and relational databases
- Strong expertise in data engineering, including querying and optimizing complex data sets, and experience with ETL tools like SSIS and ADF
- Proficiency in Azure cloud technologies and data visualization and reporting tools (e.g., SSRS, Power BI), and a solid understanding of OLAP structures and relational database design
- Excellent communication skills and the ability to work independently or collaboratively in a team, with preferred experience in data warehousing, .NET, Azure, AWS, and agile environments

Job responsibilities
- Bachelor's degree in computer science or a related field, or equivalent work experience, with 6 years in SQL Server and relational databases
- Strong expertise in data engineering, including querying and optimizing complex data sets, and experience with ETL tools like SSIS and ADF
- Proficiency in Azure cloud technologies and data visualization and reporting tools (e.g., SSRS, Power BI), and a solid understanding of OLAP structures and relational database design
- Excellent communication skills and the ability to work independently or collaboratively in a team, with preferred experience in data warehousing, .NET, Azure, AWS, and agile environments

What we offer
Culture of caring. At GlobalLogic, we prioritize a culture of caring. Across every region and department, at every level, we consistently put people first. From day one, you'll experience an inclusive culture of acceptance and belonging, where you'll have the chance to build meaningful connections with collaborative teammates, supportive managers, and compassionate leaders.

Learning and development. We are committed to your continuous learning and development. You'll learn and grow daily in an environment with many opportunities to try new things, sharpen your skills, and advance your career at GlobalLogic. With our Career Navigator tool as just one example, GlobalLogic offers a rich array of programs, training curricula, and hands-on opportunities to grow personally and professionally.

Interesting & meaningful work. GlobalLogic is known for engineering impact for and with clients around the world. As part of our team, you'll have the chance to work on projects that matter. Each is a unique opportunity to engage your curiosity and creative problem-solving skills as you help clients reimagine what's possible and bring new solutions to market. In the process, you'll have the privilege of working on some of the most cutting-edge and impactful solutions shaping the world today.

Balance and flexibility. We believe in the importance of balance and flexibility. With many functional career areas, roles, and work arrangements, you can explore ways of achieving the perfect balance between your work and life. Your life extends beyond the office, and we always do our best to help you integrate and balance the best of work and life, having fun along the way!

High-trust organization. We are a high-trust organization where integrity is key. By joining GlobalLogic, you're placing your trust in a safe, reliable, and ethical global company. Integrity and trust are a cornerstone of our value proposition to our employees and clients. You will find truthfulness, candor, and integrity in everything we do.

About GlobalLogic
GlobalLogic, a Hitachi Group Company, is a trusted digital engineering partner to the world's largest and most forward-thinking companies. Since 2000, we've been at the forefront of the digital revolution, helping create some of the most innovative and widely used digital products and experiences. Today we continue to collaborate with clients in transforming businesses and redefining industries through intelligent products, platforms, and services.
Posted 1 week ago
9.0 - 14.0 years
22 - 27 Lacs
Hyderabad
Hybrid
Job Description

Job Title: SQL Database Architect (SQL Server, Azure SQL Server, SSIS, SSRS, Data Migration, Azure Data Factory, Power BI)

Job Summary:
We are seeking a highly skilled SQL Database Architect with expertise in SQL Server, Azure SQL Server, SSIS, SSRS, data migration, and Power BI to design, develop, and maintain scalable database solutions. The ideal candidate will have experience in database architecture, data integration, ETL processes, cloud-based solutions, and business intelligence reporting. Excellent communication and documentation skills are essential for collaborating with cross-functional teams and maintaining structured database records.

Key Responsibilities:
- Database Design & Architecture: Develop highly available, scalable, and secure database solutions using Azure SQL Server.
- ETL & Data Integration: Design and implement SSIS packages for data movement, transformations, and automation.
- Data Migration: Oversee database migration projects, including on-premises-to-cloud and cloud-to-cloud transitions, data conversion, and validation processes.
- Azure Data Factory (ADF): Build and manage data pipelines, integrating various data sources and orchestrating ETL workflows in cloud environments.
- Reporting & Business Intelligence: Develop SSRS reports and leverage Power BI for creating interactive dashboards and data visualizations.
- Performance Optimization: Analyze and optimize query performance, indexing strategies, and database configurations.
- Cloud Integration: Architect Azure-based database solutions, including Azure SQL Database, Managed Instances, and Synapse Analytics.
- Security & Compliance: Ensure data security, encryption, and compliance with industry standards.
- Backup & Disaster Recovery: Design and implement backup strategies, high availability, and disaster recovery solutions.
- Automation & Monitoring: Utilize Azure Monitor, SQL Profiler, and other tools to automate and monitor database performance.
- Collaboration & Communication: Work closely with developers, BI teams, DevOps, and business stakeholders, explaining complex database concepts in a clear and concise manner.
- Documentation & Best Practices: Maintain comprehensive database documentation, including design specifications, technical workflows, and troubleshooting guides.

Required Skills & Qualifications:
- Expertise in SQL Server and Azure SQL Database
- Experience with SSIS for ETL processes, data transformations, and automation
- Proficiency in SSRS for creating, deploying, and managing reports
- Strong expertise in data migration, including cloud and on-premises database transitions
- Power BI skills for developing dashboards, reports, and data visualizations
- Database modeling, indexing, and query optimization expertise
- Knowledge of cloud-based architecture, including Azure SQL Managed Instance
- Proficiency in T-SQL, stored procedures, triggers, and database scripting
- Understanding of security best practices, including role-based access control (RBAC)
- Excellent communication skills to explain database solutions to technical and non-technical stakeholders
- Strong documentation skills to create and maintain database design specs, process documents, and reports

Preferred Qualifications:
- Knowledge of CI/CD pipelines for database deployments
- Familiarity with Power BI and other data visualization tools
- Experience with Azure Data Factory and Synapse Analytics for advanced data engineering workflows

Qualifications
Bachelor's or Master's degree in Business, Computer Science, Engineering, or a related field.
Posted 1 week ago
10.0 - 15.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
About the company:
With over 2.5 crore customers, over 5,000 distribution points, and nearly 2,000 branches, IndusInd Bank is a universal bank with a widespread banking footprint across the country. IndusInd offers a wide array of products and services for individuals and corporates, including microfinance, personal loans, personal and commercial vehicle loans, credit cards, and SME loans. Over the years, IndusInd has grown ceaselessly and dynamically, driven by its zeal to offer customers banking services at par with the highest quality standards in the industry. IndusInd is a pioneer in digital-first solutions, bringing together the power of a next-gen digital product stack, customer excellence, and the trust of an established bank.

Job Purpose:
- Implement data modeling solutions
- Design data flows and structures to reduce data redundancy and improve data movement among systems, defining data lineage
- Work in the Azure Data Warehouse
- Work with large volumes of data integration

Experience
With overall experience between 10 and 15 years, the applicant must have a minimum of 8 to 11 years of core professional experience in data modeling for a large data warehouse with multiple sources.

Technical Skills
- Expertise in core data modeling principles/methods, including conceptual, logical, and physical data models
- Ability to utilize BI tools like Power BI, Tableau, etc. to represent insights
- Experience in translating/mapping relational data models into XML and schemas
- Expert knowledge of metadata management and relational & data modeling tools like ER Studio, Erwin, or others
- Hands-on relational, dimensional, and/or analytical experience (using RDBMS, dimensional, NoSQL, ETL, and data ingestion protocols)
- Very strong in SQL queries
- Expertise in performance tuning of SQL queries
- Ability to analyse source systems and create source-to-target mappings
- Ability to understand the business use case and create data models or joined data in the data warehouse
- Preferred: experience in the banking domain and in building data models/marts for various banking functions

Good to have knowledge of:
- Azure PowerShell scripting or Python scripting for data transformation in ADF
- SSIS, SSAS, and BI tools like Power BI
- Azure PaaS components like Azure Data Factory, Azure Databricks, Azure Data Lake, Azure Synapse (DWH), PolyBase, ExpressRoute tunneling, etc.
- API integration

Responsibility
- Understand the existing data model, existing data warehouse design, and functional domain subject areas of data, documenting the as-is architecture and the proposed one
- Understand the existing ETL process and various sources, analyzing and documenting the best approach to design the logical data model where required
- Work with the development team to implement the proposed data model into a physical data model and build data flows
- Work with the development team to optimize the database structure with best practices, applying optimization methods
- Analyze, document, and implement re-use of the data model for new initiatives
- Interact with stakeholders, users, and other IT teams to understand the ecosystem and analyze for solutions
- Work on user requirements and create queries for building consumption views for users from the existing DW data
- Train and lead a small team of data engineers

Qualifications
- Bachelor of Computer Science or equivalent
- Certification in data modeling and data analysis
- Good to have: certification in Azure fundamentals and Azure engineer courses (AZ-900 or DP-200/201)

Behavioral Competencies
- Excellent problem-solving and time management skills
- Strong analytical thinking skills
- Excellent communication skills; process-oriented with a flexible execution mindset
- Strategic thinking with a research and development mindset
- Clear and demonstrative communication
- Efficiently identifies and solves issues
- Identifies, tracks, and escalates risks in a timely manner

Selection Process:
Interested candidates are mandatorily required to apply through the listing on Jigya. Only applications received through Jigya will be evaluated further. Shortlisted candidates may need to appear for an online assessment and/or a technical screening interview administered by Jigya on behalf of IndusInd Bank. Candidates selected after the screening rounds will be processed further by IndusInd Bank.
Posted 1 week ago
7.0 - 8.0 years
4 - 6 Lacs
Mumbai, Hyderabad, Chennai
Work from Office
Data Engineer (Contract | 6 Months)

We are seeking an experienced Data Engineer to join our team for a 6-month contract assignment. The ideal candidate will work on data warehouse development, ETL pipelines, and analytics enablement using Snowflake, Azure Data Factory (ADF), dbt, and other tools. This role requires strong hands-on experience with data integration platforms, documentation, and pipeline optimization, especially in cloud environments such as Azure and AWS.

Key Responsibilities
- Build and maintain ETL pipelines using Fivetran, dbt, and Azure Data Factory
- Monitor and support production ETL jobs
- Develop and maintain data lineage documentation for all systems
- Design data mapping and documentation to aid QA/UAT testing
- Evaluate and recommend modern data integration tools
- Optimize shared data workflows and batch schedules
- Collaborate with data quality analysts to ensure accuracy and integrity of data flows
- Participate in performance tuning and improvement recommendations
- Support BI/MDM initiatives, including Data Vault and data lakes

Required Skills
- 7+ years of experience in data engineering roles
- Strong command of SQL, with 5+ years of hands-on development
- Deep experience with Snowflake, Azure Data Factory, and dbt
- Strong background with ETL tools (Informatica, Talend, ADF, dbt, etc.)
- Bachelor's in CS, Engineering, Math, or a related field
- Experience in the healthcare domain (working with PHI/PII data)
- Familiarity with scripting/programming (Python, Perl, Java, Linux-based environments)
- Excellent communication and documentation skills
- Experience with BI tools like Power BI, Cognos, etc.
- Organized self-starter with strong time-management and critical-thinking abilities

Nice to Have
- Experience with data lakes and Data Vaults
- QA & UAT alignment with clear development documentation
- Multi-cloud experience (especially Azure and AWS)
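The production-monitoring and pipeline-optimization duties in this posting typically rest on idempotent, watermark-driven incremental loads: a job that can be re-run safely without duplicating data. Below is a minimal sketch of that pattern; the function, table, and column names are hypothetical, and SQLite stands in for ADF/dbt plus a real warehouse.

```python
import sqlite3

# Watermark-driven incremental load: copy only rows newer than the last
# processed timestamp, so re-running the job never duplicates data.
# Names are hypothetical; SQLite stands in for a real warehouse.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE source_orders (order_id INTEGER PRIMARY KEY, updated_at TEXT, amount REAL);
CREATE TABLE target_orders (order_id INTEGER PRIMARY KEY, updated_at TEXT, amount REAL);
CREATE TABLE etl_watermark (table_name TEXT PRIMARY KEY, last_loaded_at TEXT);
INSERT INTO etl_watermark VALUES ('target_orders', '1970-01-01');
""")
conn.executemany("INSERT INTO source_orders VALUES (?, ?, ?)",
                 [(1, "2024-05-01", 10.0), (2, "2024-05-02", 20.0)])

def incremental_load(conn):
    """Copy source rows newer than the stored watermark, then advance it."""
    (wm,) = conn.execute(
        "SELECT last_loaded_at FROM etl_watermark WHERE table_name = 'target_orders'"
    ).fetchone()
    rows = conn.execute(
        "SELECT order_id, updated_at, amount FROM source_orders WHERE updated_at > ?",
        (wm,)).fetchall()
    # INSERT OR REPLACE makes the load tolerant of overlapping re-deliveries.
    conn.executemany("INSERT OR REPLACE INTO target_orders VALUES (?, ?, ?)", rows)
    if rows:
        conn.execute(
            "UPDATE etl_watermark SET last_loaded_at = ? WHERE table_name = 'target_orders'",
            (max(r[1] for r in rows),))
    return len(rows)

print(incremental_load(conn))  # 2 rows on the first run
print(incremental_load(conn))  # 0 rows on an immediate re-run (idempotent)
```

The same watermark idea appears under different names in the tools listed above (e.g., incremental models in dbt, tumbling-window triggers in ADF); this sketch only shows the underlying mechanism.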
Posted 1 week ago
7.0 - 12.0 years
20 - 35 Lacs
Pune
Work from Office
Job Title: Data Engineer – Informatica IDMC
Location: Remote/Contract
Experience Level: Mid to Senior (7+ years)

Job Summary:
We are seeking a highly skilled Data Engineer with a minimum of 5 years of hands-on experience in Informatica Intelligent Data Management Cloud (IDMC). The successful candidate will design, implement, and maintain scalable data integration solutions using Informatica Cloud services. Experience with CI/CD pipelines is required to ensure efficient development and deployment cycles. Familiarity with Informatica Catalog, Data Governance, and Data Quality or Azure Data Factory is considered a strong advantage.

Key Responsibilities:
- Design, build, and optimize end-to-end data pipelines using Informatica IDMC, including Cloud Data Integration and Cloud Application Integration.
- Implement ETL/ELT processes to support data lakehouse and EDW use cases.
- Develop and maintain CI/CD pipelines to support automated deployment and version control.
- Work closely with data architects, analysts, and business stakeholders to translate data requirements into scalable solutions.
- Monitor job performance, troubleshoot issues, and ensure compliance with SLAs and data quality standards.
- Document technical designs, workflows, and integration processes following best practices.
- Collaborate with DevOps and cloud engineering teams to ensure seamless integration within the cloud ecosystem.

Required Qualifications:
- Bachelor's degree in Computer Science, Information Systems, Engineering, or a related field.
- Minimum 5 years of hands-on experience with Informatica IDMC.
- Experience building and deploying CI/CD pipelines using tools such as Git or Azure DevOps.
- Proficiency in SQL, data modeling, and transformation logic.
- Experience with cloud platforms (Azure or OCI).
- Strong problem-solving skills in data operations and pipeline performance.

Preferred / Nice-to-Have Skills:
- Experience with Informatica Data Catalog for metadata and lineage tracking.
- Familiarity with Informatica Data Governance tools such as Axon and Business Glossary.
- Hands-on experience with Informatica Data Quality (IDQ) for data profiling and cleansing.
- Experience developing data pipelines using Azure Data Factory.
Posted 1 week ago
0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

Reporting Data Engineer

Join EY as a MARS Data Engineer and be at the forefront of providing and implementing innovative data insights, data products, and data services. MARS is a data platform providing custom data insights, DaaS, and DaaP for a variety of EY departments and staff. We leverage software development practices to develop intricate data insights and data products.

Your Key Responsibilities
As a member of the MARS team, you will play a critical role in our mission of providing innovative data insights and in the operations and support of the MARS data platform. This includes supporting customers, internal team members, and management. Operations and support include estimating, designing, developing, and delivering data products and services. You will contribute your creative solutions and knowledge to our data platform, which ingests 2TB of mobile device data daily (300K+ devices). Our platform empowers our product managers and helps enable our teams to build a better working world.

As a reporting engineer on the MARS team, the following activities are expected:
- Collaborate closely with the product manager to align activities to timelines and deadlines
- Proactively suggest new ideas and solutions, driving them to implementation with minimal guidance on technical delivery
- Provide input to the MARS roadmap and actively participate in bringing it to life
- Collaborate with the Intune engineering team to get a clear understanding of the mobile device lifecycle and its relationship to Intune data and reporting
- Serve as the last level of support for all MARS data reporting questions and issues

Participate and contribute in the following activities:
- Customer discussions and requirement gathering sessions
- Application reports (daily, weekly, monthly, quarterly, annually)
- Custom reporting for manual reports, dashboards, exports, APIs, and semantic models
- Customer service engagements
- Daily team meetings
- Work estimates and daily status
- Data & dashboard monitoring & troubleshooting
- Automation
- Data management and classification
- Maintaining design documentation for data schemas, data models, the data catalogue, and related products/services
- Monitoring and integrating a variety of data sources
- Maintaining and developing custom data quality tools

Skills and Attributes for Success

General Skills
- Analytical ability: Strong analytical skills in supporting core technologies, particularly in managing large user bases, to effectively troubleshoot and optimize data solutions.
- Communication skills: Excellent written and verbal communication skills, with the ability to articulate complex technical concepts clearly to both technical and non-technical stakeholders. Proficiency in English is required, with additional languages being a plus.
- Interpersonal skills: Strong interpersonal skills, sound judgment, and tact to foster collaboration with colleagues and customers across diverse cultural backgrounds.
- Creative problem-solving: Ability to conceptualize innovative solutions that add value to end users, particularly in the context of mobile applications and services.
- Self-starter mentality: A proactive and self-motivated approach to work, with the ability to take initiative and drive projects forward independently.
- Documentation skills: Clear and concise documentation skills, ensuring that all processes, solutions, and communications are well documented for future reference.
- Organizational skills: The ability to define project plans, execute them, and manage ongoing risks and communications throughout the project lifecycle.
- Cross-cultural awareness: Awareness of and sensitivity to cross-cultural dynamics, enabling effective collaboration with global teams and clients.
- User experience focus: Passionate about improving user experience, with an understanding of how to measure, monitor, and enhance user satisfaction through feedback and analytics.

To qualify for the role, you must have the following qualifications:
- At least three years of experience in the following technologies and methodologies
- Hands-on experience with Microsoft Intune data and mobile device and application management data (MSFT APIs, Graph, and IDW)
- Proven experience in mobile platform engineering or a related field
- Strong understanding of mobile technologies and security protocols, particularly within an Intune-based environment
- Experience with Microsoft Intune, including mobile device and application management
- Proficiency in supporting Modern Workplace tools and resources
- Experience with iOS and Android operating systems
- Proficiency in PowerShell scripting for automation and management tasks
- Ability to operate proactively and independently in a fast-paced environment
- Solution-oriented mindset with the capability to design and implement creative mobile solutions, and the ability to suggest and implement solutions that meet EY's requirements
- Ability to work UK working hours

Specific technology skills include the following:

Technical Skills
- Power BI: semantic models, advanced dashboards, Power BI templates
- Intune reporting and Intune data: compliance, device, policy management, metrics, monitoring
- SPLUNK data and reporting
- Sentinel data and reporting
- HR data and reporting
- Mobile Defender data and reporting
- AAD (Azure Active Directory)
- Data quality & data assurance
- Databricks
- Web analytics and mobile analytics
- Azure Data Factory, Azure pipelines/Synapse, Azure SQL DB/Server, ADF automation
- Azure Kubernetes Service (AKS)
- Key Vault management
- Azure Monitoring
- App Proxy & Azure Front Door data exports
- API development
- Python, SQL, KQL, Power Apps
- MSFT Intune APIs (Export, App Install)
- Virtual machines
- SharePoint: general operations
- Data modeling, ETL, and related technologies

Ideally, you'll also have the following:
- Strong communication skills to effectively liaise with various stakeholders
- A proactive approach to suggesting and implementing new ideas
- Familiarity with the latest trends in mobile technology
- Ability to explain very technical topics to non-technical stakeholders
- Experience in managing and supporting large mobile environments
- Testing and quality assurance: ensuring our mobile platform meets quality, performance, and security standards
- Implementation of new products and/or service offerings
- Experience working in a large global environment
- XML data formats
- Agile delivery
- Object-oriented design and programming
- Software development
- Mobile

What we look for: A person who demonstrates a commitment to integrity, initiative, collaboration, and efficiency, with three or more years in the field of data analytics and Intune data reporting.

What We Offer
EY Global Delivery Services (GDS) is a dynamic and truly global delivery network. We work across six locations – Argentina, China, India, the Philippines, Poland and the UK – and with teams from all EY service lines, geographies and sectors, playing a vital role in the delivery of the EY growth strategy. From accountants to coders to advisory consultants, we offer a wide variety of fulfilling career opportunities that span all business disciplines. In GDS, you will collaborate with EY teams on exciting projects and work with well-known brands from across the globe. We'll introduce you to an ever-expanding ecosystem of people, learning, skills and insights that will stay with you throughout your career.

Continuous learning: You'll develop the mindset and skills to navigate whatever comes next.
Success as defined by you: We'll provide the tools and flexibility, so you can make a meaningful impact, your way.
Transformative leadership: We'll give you the insights, coaching and confidence to be the leader the world needs.
Diverse and inclusive culture: You'll be embraced for who you are and empowered to use your voice to help others find theirs.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 1 week ago
0 years
0 Lacs
Kochi, Kerala, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

Reporting Data Engineer

Join EY as a MARS Data Engineer and be at the forefront of providing and implementing innovative data insights, data products, and data services. MARS is a data platform providing custom data insights, DaaS, and DaaP for a variety of EY departments and staff. We leverage software development practices to develop intricate data insights and data products.

Your Key Responsibilities

As a member of the MARS team, you will play a critical role in our mission of providing innovative data insights and in the operation and support of the MARS data platform. This includes supporting customers, internal team members, and management. Operations and support include estimating, designing, developing, and delivering data products and services. You will contribute your creative solutions and knowledge to our data platform, which handles 2 TB of mobile device data daily (300K+ devices). Our platform empowers our product managers and helps our teams build a better working world.

As a reporting engineer on the MARS team, you are expected to:
- Collaborate closely with the product manager to align activities to timelines and deadlines
- Proactively suggest new ideas and solutions, driving them to implementation with minimal guidance on technical delivery
- Provide input to the MARS roadmap and actively participate in bringing it to life
- Collaborate with the Intune engineering team to develop a clear understanding of the mobile device lifecycle and its relationship to Intune data and reporting
- Serve as the last level of support for all MARS data reporting questions and issues.
Participate and contribute to the following activities:
- Customer discussions and requirement-gathering sessions
- Application reports (daily, weekly, monthly, quarterly, annually)
- Custom reporting for manual reports, dashboards, exports, APIs, and semantic models
- Customer service engagements
- Daily team meetings
- Work estimates and daily status
- Data and dashboard monitoring and troubleshooting
- Automation
- Data management and classification
- Maintaining design documentation for data schemas, data models, the data catalogue, and related products/services
- Monitoring and integrating a variety of data sources
- Maintaining and developing custom data quality tools

Skills And Attributes For Success

General Skills
- Analytical Ability: Strong analytical skills in supporting core technologies, particularly in managing large user bases, to effectively troubleshoot and optimize data solutions.
- Communication Skills: Excellent written and verbal communication skills, with the ability to articulate complex technical concepts clearly to both technical and non-technical stakeholders. Proficiency in English is required, with additional languages being a plus.
- Interpersonal Skills: Strong interpersonal skills, sound judgment, and tact to foster collaboration with colleagues and customers across diverse cultural backgrounds.
- Creative Problem-Solving: Ability to conceptualize innovative solutions that add value to end users, particularly in the context of mobile applications and services.
- Self-Starter Mentality: A proactive and self-motivated approach to work, with the ability to take initiative and drive projects forward independently.
- Documentation Skills: Clear and concise documentation skills, ensuring that all processes, solutions, and communications are well documented for future reference.
- Organizational Skills: The ability to define project plans, execute them, and manage ongoing risks and communications throughout the project lifecycle.
- Cross-Cultural Awareness: Awareness of and sensitivity to cross-cultural dynamics, enabling effective collaboration with global teams and clients.
- User Experience Focus: Passionate about improving user experience, with an understanding of how to measure, monitor, and enhance user satisfaction through feedback and analytics.

To qualify for the role, you must have the following qualifications:
- At least three years of experience in the following technologies and methodologies
- Hands-on experience with Microsoft Intune data and Mobile Device and Application Management data (MSFT APIs, Graph, and IDW)
- Proven experience in mobile platform engineering or a related field
- Strong understanding of mobile technologies and security protocols, particularly within an Intune-based environment
- Experience with Microsoft Intune, including mobile device and application management
- Proficiency in supporting Modern Workplace tools and resources
- Experience with iOS and Android operating systems
- Proficiency in PowerShell scripting for automation and management tasks
- Ability to operate proactively and independently in a fast-paced environment
- Solution-oriented mindset, with the capability to design and implement creative mobile solutions and the ability to suggest and implement solutions that meet EY’s requirements
- Ability to work UK working hours

Specific technology skills include the following:

Technical Skills
- Power BI: semantic models, advanced dashboards, Power BI templates
- Intune reporting and Intune data: compliance, device, policy management, metrics, monitoring
- Splunk data and reporting
- Sentinel data and reporting
- HR data and reporting
- Mobile Defender data and reporting
- Azure Active Directory (AAD)
- Data quality and data assurance
- Databricks
- Web analytics and mobile analytics
- Azure Data Factory (ADF) and ADF automation
- Azure Pipelines / Azure Synapse
- Azure SQL DB / SQL Server
- Azure Kubernetes Service (AKS)
- Key Vault management
- Azure Monitoring
- App Proxy and Azure Front Door data exports
- API development
- Python, SQL, KQL, Power Apps
- MSFT Intune APIs (Export, App Install)
- Virtual machines
- SharePoint: general operations
- Data modeling
- ETL and related technologies

Ideally, you’ll also have the following:
- Strong communication skills to effectively liaise with various stakeholders
- A proactive approach to suggesting and implementing new ideas
- Familiarity with the latest trends in mobile technology
- Ability to explain very technical topics to non-technical stakeholders
- Experience in managing and supporting large mobile environments
- Testing and quality assurance: ensuring our mobile platform meets quality, performance, and security standards
- Implementation of new products and/or service offerings
- Experience working in a large, global environment
- XML data formats
- Agile delivery
- Object-oriented design and programming
- Software development
- Mobile

What we look for: A person who demonstrates a commitment to integrity, initiative, collaboration, and efficiency, with three or more years of experience in data analytics and Intune data reporting.
What We Offer

EY Global Delivery Services (GDS) is a dynamic and truly global delivery network. We work across six locations – Argentina, China, India, the Philippines, Poland and the UK – and with teams from all EY service lines, geographies and sectors, playing a vital role in the delivery of the EY growth strategy. From accountants to coders to advisory consultants, we offer a wide variety of fulfilling career opportunities that span all business disciplines. In GDS, you will collaborate with EY teams on exciting projects and work with well-known brands from across the globe. We’ll introduce you to an ever-expanding ecosystem of people, learning, skills and insights that will stay with you throughout your career.

- Continuous learning: You’ll develop the mindset and skills to navigate whatever comes next.
- Success as defined by you: We’ll provide the tools and flexibility, so you can make a meaningful impact, your way.
- Transformative leadership: We’ll give you the insights, coaching and confidence to be the leader the world needs.
- Diverse and inclusive culture: You’ll be embraced for who you are and empowered to use your voice to help others find theirs.

EY | Building a better working world

EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
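One of the responsibilities in this posting is maintaining custom data quality tools. As a hedged illustration only (the record fields `device_id` and `age_days`, and the acceptance thresholds, are hypothetical and not taken from the MARS platform), a minimal check of that kind might look like:

```python
from dataclasses import dataclass

# Illustrative sketch of a toy "data quality tool" of the kind the posting
# mentions. Field names and thresholds are hypothetical assumptions.

@dataclass
class QualityReport:
    total: int
    missing_device_id: int
    stale: int

    @property
    def ok(self) -> bool:
        # Hypothetical rule: fewer than 5% missing IDs and 10% stale rows.
        return (self.missing_device_id < 0.05 * self.total
                and self.stale < 0.10 * self.total)

def check(rows, max_age_days=7):
    """Scan device records and count rows that fail basic quality rules."""
    missing = sum(1 for r in rows if not r.get("device_id"))
    stale = sum(1 for r in rows if r.get("age_days", 0) > max_age_days)
    return QualityReport(total=len(rows), missing_device_id=missing, stale=stale)

report = check([{"device_id": "d1", "age_days": 2},
                {"device_id": "", "age_days": 12}])
print(report.ok)  # → False: one of the two rows is missing an ID and stale
```

A production version would read from the platform's actual stores and surface results through its dashboards; the point here is only the shape of such a check (count failures, compare against thresholds).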
Posted 1 week ago
3.0 years
0 Lacs
Bengaluru, Karnataka, India
Remote
Data Engineer – Azure

This is a hands-on data platform engineering role that places significant emphasis on consultative data engineering engagements with a wide range of customer stakeholders: Business Owners, Business Analytics, Data Engineering teams, Application Development, End Users, and Management teams.

You Will:
- Design and build resilient and efficient data pipelines for batch and real-time streaming
- Collaborate with product managers, software engineers, data analysts, and data scientists to build scalable and data-driven platforms and tools
- Provide technical product expertise, advise on deployment architectures, and handle in-depth technical questions around data infrastructure, PaaS services, design patterns, and implementation approaches
- Collaborate with enterprise architects, data architects, ETL developers and engineers, data scientists, and information designers to lead the identification and definition of required data structures, formats, pipelines, metadata, and workload orchestration capabilities
- Address aspects such as data privacy and security, data ingestion and processing, data storage and compute, analytical and operational consumption, data modeling, data virtualization, self-service data preparation and analytics, AI enablement, and API integrations
- Execute projects with an Agile mindset
- Build software frameworks to solve data problems at scale

Technical Requirements:
- 3+ years of data engineering experience leading implementations of large-scale lakehouses on Databricks, Snowflake, or Synapse; prior experience using dbt and Power BI is a plus
- Extensive experience with Azure data services (Databricks, Synapse, ADF) and related Azure infrastructure services (firewall, storage, Key Vault, etc.) is required
- Strong programming/scripting experience using SQL, Python, and Spark
- Knowledge of software configuration management environments and tools such as JIRA, Git, Jenkins, TFS, Shell, PowerShell, and Bitbucket
- Experience with Agile development methods in data-oriented projects

Other Requirements:
- Highly motivated self-starter and team player with demonstrated success in prior roles
- Track record of success working through technical challenges within enterprise organizations
- Ability to prioritize deals, training, and initiatives through highly effective time management
- Excellent problem-solving, analytical, presentation, and whiteboarding skills
- Track record of success dealing with ambiguity (internal and external) and working collaboratively with other departments and organizations to solve challenging problems
- Strong knowledge of technology and industry trends that affect data analytics decisions for enterprise organizations
- Certifications in Azure Data Engineering and related technologies

"Remote postings are limited to candidates residing within the country specified in the posting location"

About Rackspace Technology

We are the multicloud solutions experts. We combine our expertise with the world’s leading technologies — across applications, data and security — to deliver end-to-end solutions. We have a proven record of advising customers based on their business challenges, designing solutions that scale, building and managing those solutions, and optimizing returns into the future. Named a best place to work, year after year, according to Fortune, Forbes and Glassdoor, we attract and develop world-class talent. Join us on our mission to embrace technology, empower customers and deliver the future.

More on Rackspace Technology

Though we’re all different, Rackers thrive through our connection to a central goal: to be a valued member of a winning team on an inspiring mission. We bring our whole selves to work every day.
And we embrace the notion that unique perspectives fuel innovation and enable us to best serve our customers and communities around the globe. We welcome you to apply today and want you to know that we are committed to offering equal employment opportunity without regard to age, color, disability, gender reassignment or identity or expression, genetic information, marital or civil partner status, pregnancy or maternity status, military or veteran status, nationality, ethnic or national origin, race, religion or belief, sexual orientation, or any legally protected characteristic. If you have a disability or special need that requires accommodation, please let us know.
Posted 1 week ago
7.0 years
0 Lacs
Bengaluru, Karnataka, India
Remote
Senior Data Engineer – Azure

This is a hands-on data platform engineering role that places significant emphasis on consultative data engineering engagements with a wide range of customer stakeholders: Business Owners, Business Analytics, Data Engineering teams, Application Development, End Users, and Management teams.

You Will:
- Design and build resilient and efficient data pipelines for batch and real-time streaming
- Architect and design data infrastructure in the cloud using Infrastructure-as-Code tools
- Collaborate with product managers, software engineers, data analysts, and data scientists to build scalable and data-driven platforms and tools
- Provide technical product expertise, advise on deployment architectures, and handle in-depth technical questions around data infrastructure, PaaS services, design patterns, and implementation approaches
- Collaborate with enterprise architects, data architects, ETL developers and engineers, data scientists, and information designers to lead the identification and definition of required data structures, formats, pipelines, metadata, and workload orchestration capabilities
- Address aspects such as data privacy and security, data ingestion and processing, data storage and compute, analytical and operational consumption, data modeling, data virtualization, self-service data preparation and analytics, AI enablement, and API integrations
- Lead a team of engineers to deliver impactful results at scale
- Execute projects with an Agile mindset
- Build software frameworks to solve data problems at scale

Technical Requirements:
- 7+ years of data engineering experience leading implementations of large-scale lakehouses on Databricks, Snowflake, or Synapse; prior experience using dbt and Power BI is a plus
- 3+ years' experience architecting solutions for developing data pipelines from structured and unstructured sources for batch and real-time workloads
- Extensive experience with Azure data services (Databricks, Synapse, ADF) and related Azure infrastructure services (firewall, storage, Key Vault, etc.) is required
- Strong programming/scripting experience using SQL, Python, and Spark
- Strong data modeling and data lakehouse concepts
- Knowledge of software configuration management environments and tools such as JIRA, Git, Jenkins, TFS, Shell, PowerShell, and Bitbucket
- Experience with Agile development methods in data-oriented projects

Other Requirements:
- Highly motivated self-starter and team player with demonstrated success in prior roles
- Track record of success working through technical challenges within enterprise organizations
- Ability to prioritize deals, training, and initiatives through highly effective time management
- Excellent problem-solving, analytical, presentation, and whiteboarding skills
- Track record of success dealing with ambiguity (internal and external) and working collaboratively with other departments and organizations to solve challenging problems
- Strong knowledge of technology and industry trends that affect data analytics decisions for enterprise organizations
- Certifications in Azure Data Engineering and related technologies

"Remote postings are limited to candidates residing within the country specified in the posting location"

About Rackspace Technology

We are the multicloud solutions experts. We combine our expertise with the world’s leading technologies — across applications, data and security — to deliver end-to-end solutions. We have a proven record of advising customers based on their business challenges, designing solutions that scale, building and managing those solutions, and optimizing returns into the future. Named a best place to work, year after year, according to Fortune, Forbes and Glassdoor, we attract and develop world-class talent. Join us on our mission to embrace technology, empower customers and deliver the future.
More on Rackspace Technology Though we’re all different, Rackers thrive through our connection to a central goal: to be a valued member of a winning team on an inspiring mission. We bring our whole selves to work every day. And we embrace the notion that unique perspectives fuel innovation and enable us to best serve our customers and communities around the globe. We welcome you to apply today and want you to know that we are committed to offering equal employment opportunity without regard to age, color, disability, gender reassignment or identity or expression, genetic information, marital or civil partner status, pregnancy or maternity status, military or veteran status, nationality, ethnic or national origin, race, religion or belief, sexual orientation, or any legally protected characteristic. If you have a disability or special need that requires accommodation, please let us know.
Posted 1 week ago
3.0 years
0 Lacs
Gurgaon, Haryana, India
Remote
Data Engineer – Azure

This is a hands-on data platform engineering role that places significant emphasis on consultative data engineering engagements with a wide range of customer stakeholders: Business Owners, Business Analytics, Data Engineering teams, Application Development, End Users, and Management teams.

You Will:
- Design and build resilient and efficient data pipelines for batch and real-time streaming
- Collaborate with product managers, software engineers, data analysts, and data scientists to build scalable and data-driven platforms and tools
- Provide technical product expertise, advise on deployment architectures, and handle in-depth technical questions around data infrastructure, PaaS services, design patterns, and implementation approaches
- Collaborate with enterprise architects, data architects, ETL developers and engineers, data scientists, and information designers to lead the identification and definition of required data structures, formats, pipelines, metadata, and workload orchestration capabilities
- Address aspects such as data privacy and security, data ingestion and processing, data storage and compute, analytical and operational consumption, data modeling, data virtualization, self-service data preparation and analytics, AI enablement, and API integrations
- Execute projects with an Agile mindset
- Build software frameworks to solve data problems at scale

Technical Requirements:
- 3+ years of data engineering experience leading implementations of large-scale lakehouses on Databricks, Snowflake, or Synapse; prior experience using dbt and Power BI is a plus
- Extensive experience with Azure data services (Databricks, Synapse, ADF) and related Azure infrastructure services (firewall, storage, Key Vault, etc.) is required
- Strong programming/scripting experience using SQL, Python, and Spark
- Knowledge of software configuration management environments and tools such as JIRA, Git, Jenkins, TFS, Shell, PowerShell, and Bitbucket
- Experience with Agile development methods in data-oriented projects

Other Requirements:
- Highly motivated self-starter and team player with demonstrated success in prior roles
- Track record of success working through technical challenges within enterprise organizations
- Ability to prioritize deals, training, and initiatives through highly effective time management
- Excellent problem-solving, analytical, presentation, and whiteboarding skills
- Track record of success dealing with ambiguity (internal and external) and working collaboratively with other departments and organizations to solve challenging problems
- Strong knowledge of technology and industry trends that affect data analytics decisions for enterprise organizations
- Certifications in Azure Data Engineering and related technologies

"Remote postings are limited to candidates residing within the country specified in the posting location"

About Rackspace Technology

We are the multicloud solutions experts. We combine our expertise with the world’s leading technologies — across applications, data and security — to deliver end-to-end solutions. We have a proven record of advising customers based on their business challenges, designing solutions that scale, building and managing those solutions, and optimizing returns into the future. Named a best place to work, year after year, according to Fortune, Forbes and Glassdoor, we attract and develop world-class talent. Join us on our mission to embrace technology, empower customers and deliver the future.

More on Rackspace Technology

Though we’re all different, Rackers thrive through our connection to a central goal: to be a valued member of a winning team on an inspiring mission. We bring our whole selves to work every day.
And we embrace the notion that unique perspectives fuel innovation and enable us to best serve our customers and communities around the globe. We welcome you to apply today and want you to know that we are committed to offering equal employment opportunity without regard to age, color, disability, gender reassignment or identity or expression, genetic information, marital or civil partner status, pregnancy or maternity status, military or veteran status, nationality, ethnic or national origin, race, religion or belief, sexual orientation, or any legally protected characteristic. If you have a disability or special need that requires accommodation, please let us know.
Posted 1 week ago
7.0 years
0 Lacs
Gurgaon, Haryana, India
Remote
Senior Data Engineer – Azure

This is a hands-on data platform engineering role that places significant emphasis on consultative data engineering engagements with a wide range of customer stakeholders: Business Owners, Business Analytics, Data Engineering teams, Application Development, End Users, and Management teams.

You Will:
- Design and build resilient and efficient data pipelines for batch and real-time streaming
- Architect and design data infrastructure in the cloud using Infrastructure-as-Code tools
- Collaborate with product managers, software engineers, data analysts, and data scientists to build scalable and data-driven platforms and tools
- Provide technical product expertise, advise on deployment architectures, and handle in-depth technical questions around data infrastructure, PaaS services, design patterns, and implementation approaches
- Collaborate with enterprise architects, data architects, ETL developers and engineers, data scientists, and information designers to lead the identification and definition of required data structures, formats, pipelines, metadata, and workload orchestration capabilities
- Address aspects such as data privacy and security, data ingestion and processing, data storage and compute, analytical and operational consumption, data modeling, data virtualization, self-service data preparation and analytics, AI enablement, and API integrations
- Lead a team of engineers to deliver impactful results at scale
- Execute projects with an Agile mindset
- Build software frameworks to solve data problems at scale

Technical Requirements:
- 7+ years of data engineering experience leading implementations of large-scale lakehouses on Databricks, Snowflake, or Synapse; prior experience using dbt and Power BI is a plus
- 3+ years' experience architecting solutions for developing data pipelines from structured and unstructured sources for batch and real-time workloads
- Extensive experience with Azure data services (Databricks, Synapse, ADF) and related Azure infrastructure services (firewall, storage, Key Vault, etc.) is required
- Strong programming/scripting experience using SQL, Python, and Spark
- Strong data modeling and data lakehouse concepts
- Knowledge of software configuration management environments and tools such as JIRA, Git, Jenkins, TFS, Shell, PowerShell, and Bitbucket
- Experience with Agile development methods in data-oriented projects

Other Requirements:
- Highly motivated self-starter and team player with demonstrated success in prior roles
- Track record of success working through technical challenges within enterprise organizations
- Ability to prioritize deals, training, and initiatives through highly effective time management
- Excellent problem-solving, analytical, presentation, and whiteboarding skills
- Track record of success dealing with ambiguity (internal and external) and working collaboratively with other departments and organizations to solve challenging problems
- Strong knowledge of technology and industry trends that affect data analytics decisions for enterprise organizations
- Certifications in Azure Data Engineering and related technologies

"Remote postings are limited to candidates residing within the country specified in the posting location"

About Rackspace Technology

We are the multicloud solutions experts. We combine our expertise with the world’s leading technologies — across applications, data and security — to deliver end-to-end solutions. We have a proven record of advising customers based on their business challenges, designing solutions that scale, building and managing those solutions, and optimizing returns into the future. Named a best place to work, year after year, according to Fortune, Forbes and Glassdoor, we attract and develop world-class talent. Join us on our mission to embrace technology, empower customers and deliver the future.
More on Rackspace Technology Though we’re all different, Rackers thrive through our connection to a central goal: to be a valued member of a winning team on an inspiring mission. We bring our whole selves to work every day. And we embrace the notion that unique perspectives fuel innovation and enable us to best serve our customers and communities around the globe. We welcome you to apply today and want you to know that we are committed to offering equal employment opportunity without regard to age, color, disability, gender reassignment or identity or expression, genetic information, marital or civil partner status, pregnancy or maternity status, military or veteran status, nationality, ethnic or national origin, race, religion or belief, sexual orientation, or any legally protected characteristic. If you have a disability or special need that requires accommodation, please let us know.
Posted 1 week ago
0 years
0 Lacs
Mysore, Karnataka, India
On-site
Introduction

A career in IBM Consulting is rooted in long-term relationships and close collaboration with clients across the globe. You'll work with visionaries across multiple industries to improve the hybrid cloud and AI journey for the most innovative and valuable companies in the world. Your ability to accelerate impact and make meaningful change for your clients is enabled by our strategic partner ecosystem and our robust technology platforms across the IBM portfolio.

Your Role And Responsibilities
- Ability to take full ownership of and deliver a component or functionality
- Support the team in delivering project features with high quality, and provide technical guidance
- Work effectively both individually and with team members toward customer satisfaction and success

Preferred Education: Master's Degree

Required Technical And Professional Expertise
- SQL
- ADF
- Azure Databricks

Preferred Technical And Professional Experience
- PostgreSQL, MSSQL
- Eureka, Hystrix, Zuul/API Gateway
- In-memory storage
Posted 1 week ago
4.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Job Description

Responsibilities:
- Develop and maintain data pipelines using Snowflake and Azure Data Factory
- Collaborate with data architects, data engineers, and business analysts to understand data requirements and translate them into technical solutions
- Design and implement data transformation logic using SQL, Python, or other programming languages
- Optimize data pipelines for performance and scalability
- Implement data security and governance best practices
- Troubleshoot and resolve issues related to data pipelines and processing
- Document data pipelines, processes, and best practices

Qualifications:
- Strong experience with Snowflake's cloud data platform, including data modeling, performance tuning, and security
- Proficiency in Azure Data Factory for data integration and orchestration
- Experience with SQL and other programming languages for data manipulation and analysis
- Familiarity with cloud computing concepts and services, particularly Azure
- Strong problem-solving and analytical skills
- Excellent communication and collaboration skills

Experience:
- Typically 4+ years of experience in data engineering, ETL development, or a related field
- Experience working with data warehousing and data integration projects
- Experience designing and implementing solutions using Snowflake and ADF in a cloud environment

Other Skills:
- Strong attention to detail and ability to deliver high-quality work
- Ability to work independently and as part of a team
- Strong organizational and time management skills
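The "data transformation logic using SQL, Python, or other programming languages" this posting describes can be sketched in miniature. This is an illustration under assumed inputs only — the CSV layout and the column names (`order_id`, `amount`, `region`) are hypothetical, not part of the role:

```python
import csv
import io

# Hypothetical raw extract: duplicate order rows and a missing amount —
# the kind of input a cleansing step in a pipeline has to handle.
RAW = """order_id,amount,region
1,10.5,north
1,10.5,north
2,20.0,south
3,,north
"""

def transform(text):
    """Dedupe on the business key, drop rows with no amount, sum by region."""
    seen, totals = set(), {}
    for row in csv.DictReader(io.StringIO(text)):
        if row["order_id"] in seen or not row["amount"]:
            continue  # skip duplicates and rows failing the quality rule
        seen.add(row["order_id"])
        totals[row["region"]] = totals.get(row["region"], 0.0) + float(row["amount"])
    return totals

print(transform(RAW))  # → {'north': 10.5, 'south': 20.0}
```

In a real pipeline, equivalent logic would typically live in an ADF data flow, a Snowflake SQL transform, or a Spark job; the Python version just makes the dedupe, cast, and aggregate steps explicit.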
Posted 1 week ago
5.0 years
0 Lacs
Greater Kolkata Area
On-site
Line of Service Advisory Industry/Sector Not Applicable Specialism Microsoft Management Level Senior Associate Job Description & Summary At PwC, our people in software and product innovation focus on developing cutting-edge software solutions and driving product innovation to meet the evolving needs of clients. These individuals combine technical experience with creative thinking to deliver innovative software products and solutions. Those in software engineering at PwC will focus on developing innovative software solutions to drive digital transformation and enhance business performance. In this field, you will use your knowledge to design, code, and test cutting-edge applications that revolutionise industries and deliver exceptional user experiences. Why PWC At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us. At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations. 
" Responsibilities We are seeking a developer to design, develop, and maintain data ingestion processes to a data platform built using Microsoft Technologies, ensuring data quality and integrity. The role involves collaborating with data architects and business analysts to implement solutions using tools like ADF, Azure Databricks, and requires strong SQL skills. Key responsibilities include developing, testing, and optimizing ETL workflows and maintaining documentation. B.Tech degree and 5+ years of ETL development experience in Microsoft data track are required. Demonstrated expertise in Agile methodologies, including Scrum, Kanban, or SAFe. Mandatory Skill Sets ETL Development Preferred Skill Sets Microsoft Stack Years Of Experience Required 4+ Education Qualification B.Tech/B.E./MCA Education (if blank, degree and/or field of study not specified) Degrees/Field of Study required: Bachelor of Engineering Degrees/Field Of Study Preferred Certifications (if blank, certifications not specified) Required Skills ETL Development Optional Skills Acceptance Test Driven Development (ATDD), Acceptance Test Driven Development (ATDD), Accepting Feedback, Active Listening, Analytical Thinking, API Management, Application Development, Application Frameworks, Application Lifecycle Management, Application Software, Business Process Improvement, Business Process Management (BPM), Business Requirements Analysis, C++ Programming Language, Client Management, Code Review, Coding Standards, Communication, Computer Engineering, Computer Science, Continuous Integration/Continuous Delivery (CI/CD), Creativity, Debugging, Embracing Change, Emotional Regulation {+ 30 more} Desired Languages (If blank, desired languages not specified) Travel Requirements Available for Work Visa Sponsorship? Government Clearance Required? Job Posting End Date Show more Show less
Posted 1 week ago
8.0 years
25 Lacs
Cochin
On-site
Job Role: Senior .NET Developer
Experience: 8+ years
Notice Period: Immediate
Location: Trivandrum / Kochi

Introduction
Candidates should have 8+ years of experience in the IT industry, with strong .NET/.NET Core, Azure Cloud Services, and Azure DevOps skills. This is a client-facing role for a US client, so strong communication skills are essential, and the resource should be hands-on, with experience in coding and Azure Cloud. Working hours are 8 hours, with a mandatory 4-hour overlap with the EST time zone (12 PM to 9 PM), as meetings happen during these overlap hours.

Responsibilities include:
Design, develop, enhance, document, and maintain robust applications using .NET Core 6/8+, C#, REST APIs, T-SQL, and modern JavaScript/jQuery
Integrate and support third-party APIs and external services
Collaborate across cross-functional teams to deliver scalable solutions across the full technology stack
Identify, prioritize, and execute tasks throughout the Software Development Life Cycle (SDLC)
Participate in Agile/Scrum ceremonies and manage tasks using Jira
Understand technical priorities, architectural dependencies, risks, and implementation challenges
Troubleshoot, debug, and optimize existing solutions with a strong focus on performance and reliability

Certifications:
Microsoft Certified: Azure Fundamentals
Microsoft Certified: Azure Developer Associate
Other relevant certifications in Azure, .NET, or Cloud technologies

Primary Skills:
8+ years of hands-on development experience with C#, .NET Core 6/8+, Entity Framework / EF Core, JavaScript, jQuery, and REST APIs
Expertise in MS SQL Server, including complex SQL queries, stored procedures, views, functions, packages, cursors, tables, and object types
Skilled in unit testing with XUnit and MSTest
Strong in software design patterns, system architecture, and scalable solution design
Ability to lead and inspire teams through clear communication, technical mentorship, and ownership
Strong problem-solving and debugging capabilities
Ability to write reusable, testable, and efficient code
Develop and maintain frameworks and shared libraries to support large-scale applications
Excellent technical documentation, communication, and leadership skills
Microservices and Service-Oriented Architecture (SOA)
Experience in API integrations
2+ years of hands-on experience with Azure Cloud Services, including Azure Functions, Azure Durable Functions, Azure Service Bus, Event Grid, Storage Queues, Blob Storage, Azure Key Vault, SQL Azure, Application Insights, and Azure Monitoring

Secondary Skills:
Familiarity with AngularJS, ReactJS, and other front-end frameworks
Experience with Azure API Management (APIM)
Knowledge of Azure containerization and orchestration (e.g., AKS/Kubernetes)
Experience with Azure Data Factory (ADF) and Logic Apps
Exposure to application support and operational monitoring
Azure DevOps CI/CD pipelines (Classic / YAML)

Job Types: Full-time, Permanent
Pay: From ₹2,500,000.00 per year
Benefits: Paid time off, Provident Fund
Location Type: In-person
Schedule: UK shift
Ability to commute/relocate: Kochi, Kerala: Reliably commute or plan to relocate before starting work (Preferred)
Experience: .NET: 8 years (Preferred); Azure: 2 years (Preferred)
Work Location: In person
Speak with the employer: +91 9932724170
Posted 1 week ago
3.0 years
0 Lacs
India
On-site
At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

Oracle Technical Consultant (Fusion Middleware) – Staff

Job Summary:
We are seeking a proactive and experienced Oracle Technical Consultant (OCI) – Staff Level to join our team. The ideal candidate will have expertise in Oracle EBS/Cloud, a strong technical background, and experience in executing client transformation projects in ERP, Supply Chain, Procurement, and IT support.

Primary Responsibilities and Accountabilities:
Assist in executing client transformation-related engagements in areas of Supply Chain, Procurement, Risk & Compliance, and ERP/IT support
Ensure high-quality work delivery
Identify engagement-related risks and escalate issues as appropriate
Establish and maintain strong relationships with clients (process owners/functional heads) and internal teams
Support Managers in meeting both client and internal KPIs

Experience:
Core experience in Oracle technical activities
Up to 3 years of relevant experience working in Oracle Fusion HCM (EBS / Fusion)
Experience in at least one full life-cycle implementation
Experience with Oracle Fusion Middleware components such as Oracle SOA Suite, Oracle Service Bus (OSB), and Oracle BPM
Experience in major industry sectors like Retail, Government, Energy, Real Estate, Oil & Gas, and Power & Utilities

Competencies / Skills:
Proficiency in Java, J2EE, and WebLogic Server
Experience with Oracle WebCenter and Oracle Identity Management
Knowledge of Oracle ADF (Application Development Framework)
Experience with Oracle API Gateway and Oracle API Management
Familiarity with Oracle GoldenGate for data integration and replication
Experience with cloud-native development and microservices architecture
Knowledge of containerization technologies like Docker and Kubernetes
Experience with CI/CD pipelines and tools like Jenkins, Git, and Maven
Understanding of security best practices and compliance standards in middleware environments

Education:
Bachelor's degree in IT, Computer Science, or a related technical discipline; a B.Tech or MBA is expected, and Oracle certification is preferred.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 1 week ago
3.0 years
3 - 7 Lacs
Hyderābād
On-site
Requisition Id: 1610855

As a global leader in assurance, tax, transaction and advisory services, we hire and develop the most passionate people in their field to help build a better working world. This starts with a culture that believes in giving you the training, opportunities and creative freedom. At EY, we don't just focus on who you are now, but who you can become. We believe that it's your career and 'It's yours to build', which means potential here is limitless, and we'll provide you with motivating and fulfilling experiences throughout your career to help you on the path to becoming your best professional self.

The opportunity: Consultant – National – Forensics – ASU – Forensics – Discovery – Hyderabad

Your key responsibilities

Technical Excellence
Data sits at the heart of every organization and is relied upon for all aspects of operations, analysis, decision-making, reporting and planning. Fraud Analytics is an integral part of EY Assurance & FIS, and our 'One' strategy enables individuals to work with our specialists in different areas across the business. The role involves supporting EY Assurance services in using a range of analytical tools and technologies to deal with industry-specific data issues, working with leading companies across all industries locally, and assisting the EY Global Assurance team to deliver value to their clients. This ranges from designing specific checks that give company auditors comfort on possible fraud scenarios before financial reporting (data quality and governance) to proactive diagnostic analysis (e.g., fraud risk assessment) and value-added analysis (e.g., fraud monitoring).
These also include procurement-to-pay, order-to-cash, and travel & entertainment expense analysis, plus predictive analytics on known fraud scenarios in areas relating to customers, workforce, finance and risk.

The responsibilities of this role:
Deliver on time and to a high standard
Become a crucial part of the technical team supporting the development of new propositions to solve our client issues
Work directly with investigation teams assisting with client engagements
Have strong quantitative and analytical skills
Have excellent verbal and written communication skills
Have an ability to build relationships and liaise with clients
Have proven problem-solving skills
Have experience of delivering projects as part of a team
Have flexibility on working hours and a willingness to work on projects in India and abroad
Handle client calls and meetings with clients
Willingness to travel within India as well as overseas, both short and long term

Required Technical Skills (Data Analytics):
Microsoft SQL Server (T-SQL, stored procedures, query optimization, database design and implementation)
Microsoft Azure ADF pipelines and Azure Databricks
Power BI or Tableau as a visualization tool is a must
Add-on: Python, Power Automate

Experience
People with 3+ years of experience, with the ability to work in a collaborative way to provide services across multiple client departments while adhering to commercial and legal requirements. You will need a practical approach to solving issues and complex problems, with the ability to deliver insightful and practical solutions.
Data analysis experience in a forensic role
Experience of working with financial data sets
Experience/understanding of electronic disclosure technologies and computer forensics

What we look for
People with the ability to work in a collaborative manner to provide services across multiple client departments while following the commercial and legal requirements, who are agile, curious, mindful and able to sustain positive energy, while being adaptable and creative in their approach.

What we offer
With more than 200,000 clients, 300,000 people globally and 33,000 people in India, EY has become the strongest brand and the most attractive employer in our field, with market-leading growth over competitors. Our people work side-by-side with market-leading entrepreneurs, game-changers, disruptors and visionaries. As an organisation, we are investing more time, technology and money than ever before in skills and learning for our people. At EY, you will have a personalized Career Journey and also the chance to tap into the resources of our career frameworks to better understand your roles, skills and opportunities. EY is equally committed to being an inclusive employer, and we strive to achieve the right balance for our people, enabling us to deliver excellent client service whilst allowing our people to build their career as well as focus on their wellbeing.

If you can confidently demonstrate that you meet the criteria above, please contact us as soon as possible. Join us in building a better working world. Apply now.
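As an illustration of the proactive diagnostic checks this role describes, a simple duplicate-payment test in the procurement-to-pay cycle can be sketched in plain Python. This is a minimal sketch, not EY's actual methodology; the field names and grouping rule are assumptions for illustration only.

```python
from collections import defaultdict

def flag_duplicate_payments(invoices):
    """Group invoices by (vendor, amount) and flag any group with
    more than one invoice as a potential duplicate payment."""
    groups = defaultdict(list)
    for inv in invoices:
        groups[(inv["vendor"], inv["amount"])].append(inv["invoice_id"])
    # Keep only groups containing more than one invoice
    return {key: ids for key, ids in groups.items() if len(ids) > 1}

# Hypothetical invoice records for demonstration
invoices = [
    {"invoice_id": "INV-001", "vendor": "Acme", "amount": 5000},
    {"invoice_id": "INV-002", "vendor": "Acme", "amount": 5000},
    {"invoice_id": "INV-003", "vendor": "Beta", "amount": 1200},
]
flagged = flag_duplicate_payments(invoices)
```

In practice such checks would run as SQL against the ledger (per the T-SQL skills listed above), often with fuzzier matching on dates and near-identical amounts; the grouping logic, however, is the same.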
Posted 1 week ago
0 years
6 - 12 Lacs
Thanjāvūr
On-site
Job Description

Company Description
AGF Tecnik, a division of AGF Group Inc., was established in India in the spring of 2018. Our team excels in technical drawing (reinforcement steel drawing) and provides estimation and detailing services for diverse construction projects. AGF Group specializes in reinforcing steel, post-tensioning, and scaffolding & access solutions, with over 40 business units in multiple countries including Canada, the United States, and the UAE. AGF Tecnik's headquarters is based in Longueuil. Follow us on LinkedIn to stay updated with AGF Group's latest news.

Role Description
An Oracle Technical Consultant designs, develops, implements, and supports Oracle applications, focusing on technical aspects like database management, integrations, and customizations to meet business needs. Here's a more detailed breakdown of the role:

Key Responsibilities:

Technical Expertise:
Proficiency in Oracle database technologies (e.g., SQL, PL/SQL)
Experience with Oracle applications (e.g., Oracle Fusion, Oracle E-Business Suite)
Knowledge of Oracle Cloud solutions (e.g., PaaS, IaaS)
Understanding of data modeling, data administration, and software development principles

Implementation and Support:
Participate in the entire project lifecycle, from requirements gathering to deployment and post-implementation support
Develop and customize Oracle applications to meet specific business requirements
Design and implement interfaces, integrations, and workflows
Troubleshoot and resolve technical issues
Conduct testing and quality assurance

Consulting and Collaboration:
Work closely with functional consultants, business users, and other stakeholders
Provide technical guidance and expertise to clients and internal teams
Document technical solutions and processes
Stay updated with the latest Oracle technologies and best practices

Specific Skills:
Experience with Oracle Forms & Reports, XML / BI Publisher reporting tools, interfaces (inbound/outbound) and Workflow
Knowledge of Oracle Fusion modules (Finance, SCM, Manufacturing) and tools
Familiarity with APEX development processes
Experience with FRICEW components
Knowledge of OIC, ADF, ODI, Informatica, OBIEE, SOA or Oracle Retail Cloud Infrastructure will be an added advantage

Soft Skills:
Strong communication and interpersonal skills
Problem-solving and analytical skills
Ability to work independently and as part of a team
Ability to manage multiple projects and priorities

Job Types: Full-time, Permanent
Pay: ₹600,000.00 - ₹1,200,000.00 per year
Benefits: Health insurance, Leave encashment, Provident Fund
Schedule: Day shift, Monday to Friday
Supplemental Pay: Performance bonus
Work Location: In person
Posted 1 week ago
3.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Primary Roles And Responsibilities
Developing modern data warehouse solutions using Snowflake, Databricks and ADF
Ability to provide forward-thinking solutions in the data engineering and analytics space
Collaborate with DW/BI leads to understand new ETL pipeline development requirements
Triage issues to find gaps in existing pipelines and fix them
Work with the business to understand reporting-layer needs and develop a data model to fulfill them
Help junior team members resolve issues and technical challenges
Drive technical discussions with client architects and team members
Orchestrate the data pipelines in the scheduler via Airflow

Skills And Qualifications
Skills: SQL, PL/SQL, Spark, star and snowflake dimensional modeling, Databricks, Snowsight, Terraform, Git, Unix shell scripting, SnowSQL, Cassandra, CircleCI, Azure, PySpark, Snowpipe, MongoDB, Neo4j, Azure Data Factory, Snowflake, Python
Bachelor's and/or master's degree in computer science or equivalent experience
Must have a total of 6+ years of IT experience, with 3+ years of experience in data warehouse/ETL projects
Expertise in Snowflake security, Snowflake SQL, and designing/implementing other Snowflake objects
Hands-on experience with Snowflake utilities: SnowSQL, Snowpipe, Snowsight and Snowflake connectors
Deep understanding of star and snowflake dimensional modeling
Strong knowledge of data management principles
Good understanding of the Databricks Data & AI platform and Databricks Delta Lake architecture
Should have hands-on experience in SQL and Spark (PySpark)
Experience in building ETL / data warehouse transformation processes
Experience with open-source non-relational / NoSQL data repositories (incl. MongoDB, Cassandra, Neo4j)
Experience working with structured and unstructured data, including imaging & geospatial data
Experience working in a DevOps environment with tools such as Terraform, CircleCI and Git
Proficiency in RDBMS, complex SQL, PL/SQL, Unix shell scripting, performance tuning, troubleshooting and query optimization
Databricks Certified Data Engineer Associate/Professional certification (desirable)
Comfortable working in a dynamic, fast-paced, innovative environment with several ongoing concurrent projects
Should have experience working in Agile methodology
Strong verbal and written communication skills
Strong analytical and problem-solving skills with high attention to detail

Mandatory Skills: Snowflake / Azure Data Factory / PySpark / Databricks
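The "ETL / data warehouse transformation processes" mentioned above typically involve deduplicating source rows on a business key and conforming types before a warehouse load. A minimal, library-free Python sketch of that step follows; the column names and the last-record-wins upsert rule are hypothetical, and in a real pipeline the same logic would usually be expressed in PySpark or Snowflake SQL.

```python
def transform(rows):
    """Deduplicate on a business key and conform types: cast amount
    to float and normalize region codes, keeping the last-seen record
    per key (a common upsert rule for incremental loads)."""
    latest = {}
    for row in rows:
        latest[row["order_id"]] = {
            "order_id": row["order_id"],
            "amount": float(row["amount"]),
            "region": row["region"].strip().upper(),
        }
    return list(latest.values())

# Hypothetical raw extract with a duplicate key and messy strings
raw = [
    {"order_id": "A1", "amount": "10.5", "region": " south "},
    {"order_id": "A1", "amount": "11.0", "region": "south"},
    {"order_id": "B2", "amount": "3.0", "region": "north"},
]
clean = transform(raw)
```

The last-record-wins rule mirrors what a `MERGE` statement or a Delta Lake upsert does at scale: late-arriving corrections for the same key replace earlier rows rather than duplicating them.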
Posted 1 week ago
4.0 - 6.0 years
5 - 11 Lacs
Thane, Navi Mumbai, Mumbai (All Areas)
Work from Office
Role & Responsibilities
Support the implementation and maintenance of data governance policies, procedures, and standards specific to the banking industry
Hands-on experience in creating and maintaining activities associated with data life-cycle management and various data governance activities
Develop, update, and maintain the data dictionary for critical banking data assets, ensuring accurate definitions, attributes, and classifications
Work with business units and IT teams to standardize terminology across systems for consistency and clarity
Document end-to-end data lineage for key banking data processes (e.g., customer data, transaction data, risk management data)
Create and maintain documentation of metadata, data dictionaries, and lineage for ongoing governance processes
Experience preparing reports and dashboards for data-quality scores and lineage status

Technical Skills
Experience in data governance activities such as preparing data dictionary and data lineage documents
Proficient in writing database queries in at least one database, such as SQL Server, Oracle, MySQL, or PostgreSQL
Experience in data life-cycle management
Understanding of data privacy and security frameworks specific to banking, such as PCI DSS and the DPDP Act

Preferred Candidate Profile
Minimum 4–8 years of experience in data governance activities
Bachelor's degree (B.Tech, BCA, B.Sc IT, etc.) in Information Systems or a relevant field
Experience in the data management life cycle and data governance activities
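The data-quality score reports mentioned above usually start from simple dimension metrics such as completeness. The sketch below illustrates one such metric in plain Python; the record layout, required-field list, and scoring rule are hypothetical, not a prescribed governance standard.

```python
def completeness_score(records, required_fields):
    """Return the percentage of required field values that are
    present (not None and not an empty string), rounded to one
    decimal place."""
    total = len(records) * len(required_fields)
    if total == 0:
        return 100.0  # vacuously complete when there is nothing to check
    filled = sum(
        1
        for rec in records
        for field in required_fields
        if rec.get(field) not in (None, "")
    )
    return round(100.0 * filled / total, 1)

# Hypothetical customer records with gaps in mandatory fields
customers = [
    {"customer_id": "C1", "pan": "ABCDE1234F", "email": ""},
    {"customer_id": "C2", "pan": None, "email": "x@bank.example"},
]
score = completeness_score(customers, ["customer_id", "pan", "email"])
```

A dashboard would typically compute such scores per table and per critical data element, then trend them over time so stewards can see whether remediation is working.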
Posted 1 week ago
The job market for ADF (Application Development Framework) professionals in India is witnessing significant growth, with numerous opportunities available for job seekers in this field. ADF is a popular framework used for building enterprise applications, and companies across various industries are actively looking for skilled professionals to join their teams.
Here are 5 major cities in India where there is a high demand for ADF professionals: - Bangalore - Hyderabad - Pune - Chennai - Mumbai
The estimated salary range for ADF professionals in India varies based on experience levels: - Entry-level: INR 4-6 lakhs per annum - Mid-level: INR 8-12 lakhs per annum - Experienced: INR 15-20 lakhs per annum
In the ADF job market in India, a typical career path may include roles such as Junior Developer, Senior Developer, Technical Lead, and Architect. As professionals gain more experience and expertise in ADF, they can progress to higher-level positions with greater responsibilities.
In addition to ADF expertise, professionals in this field are often expected to have knowledge of related technologies such as Java, Oracle Database, SQL, JavaScript, and web development frameworks like Angular or React.
Here are sample interview questions for ADF roles, categorized by difficulty level:
Basic:
What is ADF and its key features?
What is the difference between ADF Faces and ADF Task Flows?
Medium:
Explain the lifecycle of an ADF application.
How do you handle exceptions in ADF applications?
Advanced:
Discuss the advantages of using ADF Business Components.
How would you optimize performance in an ADF application?
As you explore job opportunities in the ADF market in India, make sure to enhance your skills, prepare thoroughly for interviews, and showcase your expertise confidently. With the right preparation and mindset, you can excel in your ADF career and secure rewarding opportunities in the industry. Good luck!