
1647 ADF Jobs - Page 17

Set up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the employer's job portal.

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

The role involves building and managing data pipelines, troubleshooting issues, and ensuring data accuracy across platforms such as Azure Synapse Analytics, Azure Data Lake Gen2, and SQL environments. This position requires extensive SQL experience and a strong background in PySpark development.

Responsibilities:
- Data Engineering: Work with Azure Synapse Pipelines and PySpark for data transformation and pipeline management. Perform data integration and schema updates in Delta Lake environments, ensuring smooth data flow and accurate reporting. Work with our Azure DevOps team on CI/CD processes for deployment of Infrastructure as Code (IaC) and workspace artifacts. Develop custom solutions for our customers as defined by our Data Architect, and help improve our data solution patterns over time.
- Documentation: Document ticket resolutions, testing protocols, and data validation processes. Collaborate with stakeholders to provide specifications and quotations for enhancements requested by customers.
- Ticket Management: Monitor the Jira ticket queue and respond to tickets as they are raised. Understand ticket issues, using extensive SQL, Synapse Analytics, and other tools to troubleshoot them. Communicate effectively with the customer users who raised the tickets, and collaborate with other teams (e.g., FinOps, Databricks) as needed to resolve issues.
- Troubleshooting and Support: Handle issues related to ETL pipeline failures, Delta Lake processing, or data inconsistencies in Synapse Analytics. Provide prompt resolution to data pipeline and validation issues, ensuring data integrity and performance.

Desired Skills & Requirements:
We are seeking a candidate with 5+ years of Dynamics 365 ecosystem experience and a strong PySpark development background. While various profiles may apply, we highly value a strong person-organization fit.
Our ideal candidate possesses the following attributes and qualifications:
- Extensive experience with SQL, including query writing and troubleshooting in Azure SQL, Synapse Analytics, and Delta Lake environments.
- Strong understanding of and experience in implementing and supporting ETL processes, Data Lakes, and data engineering solutions.
- Proficiency in using Azure Synapse Analytics, including workspace management, pipeline creation, and data flow management.
- Hands-on experience with PySpark for data processing and automation.
- Ability to use VPNs, MFA, RDP, jump boxes/jump hosts, etc., to operate within customers' secure environments.
- Some experience with Azure DevOps CI/CD, IaC, and release pipelines.
- Ability to communicate effectively both verbally and in writing, with strong problem-solving and analytical skills.
- Understanding of the operation and underlying data structure of D365 Finance and Operations, Business Central, and Customer Engagement.
- Experience with data engineering in Microsoft Fabric.
- Experience with Delta Lake and Azure data engineering concepts (e.g., ADLS, ADF, Synapse, AAD, Databricks).
- Certifications in Azure Data Engineering.

Why Join Us?
- Work with innovative technologies in a dynamic environment and a progressive work culture with a global perspective, where your ideas truly matter and growth opportunities are endless.
- Work with the latest Microsoft technologies alongside Dynamics professionals committed to driving customer success.
- Enjoy the flexibility to work from anywhere, with a work-life balance that suits your lifestyle.
- Competitive salary and comprehensive benefits package.
- Career growth and professional development opportunities.
- A collaborative and inclusive work culture.
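Much of the troubleshooting this role describes comes down to reconciliation queries: comparing a source system against a Synapse/Delta target to find where rows went missing. A minimal sketch, using Python's built-in sqlite3 as a stand-in for Azure SQL/Synapse (the table and column names are invented for illustration):

```python
import sqlite3

# Two toy tables stand in for a source system and a Synapse/Delta target.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE source_orders (order_id INTEGER, amount REAL);
    CREATE TABLE target_orders (order_id INTEGER, amount REAL);
    INSERT INTO source_orders VALUES (1, 10.0), (2, 20.0), (3, 30.0);
    INSERT INTO target_orders VALUES (1, 10.0), (2, 20.0);  -- row 3 never landed
""")

# Rows present in the source but absent from the target -- a typical
# first query when a pipeline ticket reports missing data.
missing = con.execute("""
    SELECT s.order_id, s.amount
    FROM source_orders s
    LEFT JOIN target_orders t ON t.order_id = s.order_id
    WHERE t.order_id IS NULL
""").fetchall()
print(missing)  # -> [(3, 30.0)]
```

The same LEFT JOIN ... IS NULL pattern carries over directly to T-SQL in Synapse when a ticket reports row counts that don't match.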

Posted 3 weeks ago

Apply

3.0 years

0 Lacs

Greater Kolkata Area

Remote

Who We Are:
Beyondsoft is a leading mid-sized business IT and consulting company that combines modern technologies and proven methodologies to tailor solutions that move your business forward. Our global head office is based in Singapore, and our team is made up of diversely talented experts who thrive on innovation and pushing the bounds of technology to solve our customers' most pressing challenges. When it comes time to deliver, we set our sights on that sweet spot where brilliance, emerging technologies, best practices, and accountability converge. We have a global presence spanning four continents (North America, South America, Europe, and Asia). Our global network of talent and customer-centric engagement model enable us to provide top-quality services on an unprecedented scale.

What We're About:
We believe that collaboration, transparency, and accountability are the values that guide our business, our delivery, and our brand. Everyone has something to bring to the table, and we believe in working together with our peers and clients to leverage the best of one another in everything we do. When we proactively collaborate, business decisions become easier, innovation is greater, and outcomes are better. Our ability to achieve our mission and live out our values depends upon a diverse, equitable, and inclusive culture, so we strive to foster a workplace where people have the respect, support, and voice they deserve, where innovative ideas flourish, and where people can unleash their brilliance. For more information regarding DEI at Beyondsoft, please go to https:

Summary:
As a Data Engineer, you will be responsible for designing, building, and optimizing scalable data pipelines and infrastructure. You'll work closely with analytics, engineering, and product teams to ensure data integrity and enable high-impact decision-making. This position requires flexibility to work in the PST timezone.
Additional Requirement for Remote Positions:
All candidates for remote positions must complete a video screen with our corporate recruiting team.

What You Will Be Doing:
- Maintain automated data onboarding and diagnostic tools for AIP partners.
- Monitor ADF pipelines and mitigate pipeline runs as needed.
- Maintain the Privacy Dashboard and Bing user interests for Bing Growth team usage.
- Participate in and resolve live-site incidents in related areas.
- Data platform development and maintenance, notebook-based processing pipelines, and MT migration.
- Manage the regular data-quality Cosmos/MT jobs.
- Online tooling and support, such as DADT tools.
- Watch for abnormal patterns, perform ad-hoc data quality analysis, and investigate daily broken user ad-click cases.
- Perform additional duties as assigned.

Minimum Qualifications:
- Bachelor's degree or higher in Computer Science or a related field.
- At least 3 years of experience in software development.
- Good-quality software development and understanding.
- Ability to communicate quickly across time zones.
- Excellent written and verbal communication skills in English.
- Self-motivated.
- Coding languages: Java, C#, Python, Scala.
- Technologies: Apache Spark, Apache Flink, Apache Kafka, Hadoop, Cosmos, SQL.
- Azure resource management: Azure Data Factory, Azure Databricks, Azure Key Vault, Managed Identity, Azure Storage, etc.
- MS Project.
- Big data experience is a plus.
- Occasional, infrequent in-person activity may be required.

What We Have to Offer:
Because we know how important our people are to the success of our clients, it's a priority to make sure we stay committed to our employees and to making Beyondsoft a great place to work. We take pride in offering competitive compensation and benefits along with a company culture that embodies continuous learning, growth, and training, with a dedicated focus on employee satisfaction and work/life balance.
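The "watch for abnormal patterns" duty above is often a simple statistical gate in practice. A hedged sketch in plain Python (the click counts and the 3-sigma threshold are invented for illustration; a production check would run over real Cosmos/MT job output):

```python
import statistics

# Hypothetical daily ad-click counts; the last day drops sharply.
daily_clicks = [1040, 990, 1010, 1025, 980, 1005, 310]

baseline = daily_clicks[:-1]
mean = statistics.mean(baseline)
stdev = statistics.stdev(baseline)

# Flag today's figure if it sits more than 3 standard deviations from
# the recent baseline -- a crude "broken pipeline" alarm that would
# trigger the ad-hoc investigation described in the listing.
z = (daily_clicks[-1] - mean) / stdev
is_broken = abs(z) > 3
print(is_broken)  # -> True
```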
Beyondsoft provides equal employment opportunities to all employees and applicants for employment, and prohibits discrimination and harassment of any type with regard to race, color, religion, age, sex, national origin, disability status, genetics, veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state, or local laws. This policy applies to all terms and conditions of employment, including recruiting, hiring, and the full employee lifecycle up through and including termination. (ref:hirist.tech)

Posted 3 weeks ago

Apply

6.0 - 9.5 years

0 Lacs

Bengaluru, Karnataka, India

On-site

At PwC, our people in software and product innovation focus on developing cutting-edge software solutions and driving product innovation to meet the evolving needs of clients. These individuals combine technical experience with creative thinking to deliver innovative software products and solutions. Those in software engineering at PwC focus on developing innovative software solutions to drive digital transformation and enhance business performance. In this field, you will use your knowledge to design, code, and test cutting-edge applications that revolutionise industries and deliver exceptional user experiences.

Position Title: Full Stack Lead Developer
Experience: 6-9.5 Years

Job Overview:
We are seeking a highly skilled and versatile polyglot Full Stack Developer with expertise in modern front-end and back-end technologies, cloud-based solutions, AI/ML, and Gen AI. The ideal candidate will have a strong foundation in full-stack development, cloud platforms (preferably Azure), and hands-on experience in Gen AI, AI, and machine learning technologies.

Key Responsibilities:
- Develop and maintain web applications using Angular/React.js, .NET, and Python.
- Design, deploy, and optimize Azure-native PaaS and SaaS services, including but not limited to Function Apps, Service Bus, Storage Accounts, SQL Databases, Key Vaults, ADF, Databricks, and REST APIs with OpenAPI specifications.
- Implement security best practices for data in transit and at rest, and authentication best practices: SSO, OAuth 2.0, and Auth0.
- Utilize Python for developing data processing and advanced AI/ML models using libraries like pandas, NumPy, and scikit-learn, plus LangChain, LlamaIndex, and the Azure OpenAI SDK.
- Leverage agentic frameworks like Crew AI, AutoGen, etc.; be well versed with RAG and agentic architecture.
- Strong in design patterns: architectural, data, and object-oriented.
- Leverage Azure serverless components to build highly scalable and efficient solutions.
- Create, integrate, and manage workflows using Power Platform, including Power Automate, Power Pages, and SharePoint.
- Apply expertise in machine learning, deep learning, and Generative AI to solve complex problems.

Primary Skills:
- Proficiency in React.js, .NET, and Python.
- Strong knowledge of Azure Cloud Services, including serverless architectures and data security.
- Experience with Python data analytics libraries: pandas, NumPy, scikit-learn, Matplotlib, Seaborn.
- Experience with Python Generative AI frameworks: LangChain, LlamaIndex, Crew AI, AutoGen.
- Familiarity with REST API design, Swagger documentation, and authentication best practices.

Secondary Skills:
- Experience with Power Platform tools such as Power Automate, Power Pages, and SharePoint integration.
- Knowledge of Power BI for data visualization (preferred).

Preferred Knowledge Areas (Nice to Have):
- In-depth understanding of machine learning, deep learning, and supervised and unsupervised algorithms.

Qualifications:
- Bachelor's or master's degree in Computer Science, Engineering, or a related field.
- 6-12 years of hands-on experience in full-stack development and cloud-based solutions.
- Strong problem-solving skills and ability to design scalable, maintainable solutions.
- Excellent communication and collaboration skills.
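Since the role expects familiarity with RAG, here is a framework-free sketch of its retrieval step: rank documents by cosine similarity to a query embedding, then pass the best hit to the model as grounding context. The vectors below are toy values; real embeddings would come from an embedding model, for example one served via the Azure OpenAI SDK:

```python
import math

# Toy "embeddings" for three documents and a query; in a real RAG
# pipeline these would come from an embedding model and a vector store.
docs = {
    "invoice policy": [0.9, 0.1, 0.0],
    "travel policy": [0.1, 0.9, 0.1],
    "security policy": [0.0, 0.2, 0.9],
}
query = [0.85, 0.15, 0.05]

def cosine(a, b):
    # Standard cosine similarity: dot product over the product of norms.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Retrieval step: the top-ranked document would be stuffed into the
# LLM prompt as context before generation.
best = max(docs, key=lambda name: cosine(docs[name], query))
print(best)  # -> invoice policy
```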

Posted 3 weeks ago

Apply

6.0 - 9.5 years

0 Lacs

Andhra Pradesh, India

On-site

At PwC, our people in software and product innovation focus on developing cutting-edge software solutions and driving product innovation to meet the evolving needs of clients. These individuals combine technical experience with creative thinking to deliver innovative software products and solutions. Those in software engineering at PwC focus on developing innovative software solutions to drive digital transformation and enhance business performance. In this field, you will use your knowledge to design, code, and test cutting-edge applications that revolutionise industries and deliver exceptional user experiences.

Position Title: Full Stack Lead Developer
Experience: 6-9.5 Years

Job Overview:
We are seeking a highly skilled and versatile polyglot Full Stack Developer with expertise in modern front-end and back-end technologies, cloud-based solutions, AI/ML, and Gen AI. The ideal candidate will have a strong foundation in full-stack development, cloud platforms (preferably Azure), and hands-on experience in Gen AI, AI, and machine learning technologies.

Key Responsibilities:
- Develop and maintain web applications using Angular/React.js, .NET, and Python.
- Design, deploy, and optimize Azure-native PaaS and SaaS services, including but not limited to Function Apps, Service Bus, Storage Accounts, SQL Databases, Key Vaults, ADF, Databricks, and REST APIs with OpenAPI specifications.
- Implement security best practices for data in transit and at rest, and authentication best practices: SSO, OAuth 2.0, and Auth0.
- Utilize Python for developing data processing and advanced AI/ML models using libraries like pandas, NumPy, and scikit-learn, plus LangChain, LlamaIndex, and the Azure OpenAI SDK.
- Leverage agentic frameworks like Crew AI, AutoGen, etc.; be well versed with RAG and agentic architecture.
- Strong in design patterns: architectural, data, and object-oriented.
- Leverage Azure serverless components to build highly scalable and efficient solutions.
- Create, integrate, and manage workflows using Power Platform, including Power Automate, Power Pages, and SharePoint.
- Apply expertise in machine learning, deep learning, and Generative AI to solve complex problems.

Primary Skills:
- Proficiency in React.js, .NET, and Python.
- Strong knowledge of Azure Cloud Services, including serverless architectures and data security.
- Experience with Python data analytics libraries: pandas, NumPy, scikit-learn, Matplotlib, Seaborn.
- Experience with Python Generative AI frameworks: LangChain, LlamaIndex, Crew AI, AutoGen.
- Familiarity with REST API design, Swagger documentation, and authentication best practices.

Secondary Skills:
- Experience with Power Platform tools such as Power Automate, Power Pages, and SharePoint integration.
- Knowledge of Power BI for data visualization (preferred).

Preferred Knowledge Areas (Nice to Have):
- In-depth understanding of machine learning, deep learning, and supervised and unsupervised algorithms.

Qualifications:
- Bachelor's or master's degree in Computer Science, Engineering, or a related field.
- 6-12 years of hands-on experience in full-stack development and cloud-based solutions.
- Strong problem-solving skills and ability to design scalable, maintainable solutions.
- Excellent communication and collaboration skills.

Posted 3 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

Delhi

On-site

Are you a skilled professional with experience in SQL, Python (Pandas & SQLAlchemy), and data engineering? We have an exciting opportunity for an ETL Developer to join our team! As an ETL Developer, you will be responsible for working with various technologies to extract, transform, and load data. You will play a key role in supporting on-call challenges and ensuring the smooth operation of data pipelines.

Required Qualifications:
- Bachelor's degree in Computer Science or a related field, or equivalent work experience
- 5+ years of experience working with MS SQL
- 3+ years of experience with Python (Pandas, SQLAlchemy)
- 3+ years of experience supporting on-call challenges

Key Skills:
- SQL Expertise: At least 5 years of experience running SQL queries on multiple disparate databases to stitch together data for insights.
- Python Experience: 3+ years of experience working with large data sets (2M+ rows) using Python, Pandas, SSIS, or similar tools.
- SQL Tuning: Proficient in MS SQL tuning, with a solid understanding of SQL queries.
- Data Debugging: Solid knowledge of Python (Pandas, SQLAlchemy, Spark) and experience in code debugging, logs, and root cause analysis (RCA).
- Agile Experience: Comfortable working in an agile environment and collaborating across teams.
- Source Control: Familiarity with GitLab, GitHub, and source code repositories.
- Database Management: Ability to create and maintain databases for development, testing, education, and production.
- Data Insights: Experience interpreting complex data and providing actionable insights for business goals.
- Cloud & Spark: Familiarity with Azure, ADF, Spark, and Scala concepts is a plus.

If you're passionate about data, have a strong problem-solving mindset, and enjoy working in a collaborative environment, we want to hear from you! Interested? Apply now by sending your resume to samdarshi.singh@mwidm.com or contact us at +91 62392 61536.
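The extract-transform-load cycle this posting centres on can be sketched with the standard library alone; sqlite3 stands in for MS SQL, and the CSV payload and column names are hypothetical (a real job would pull from the warehouse with pandas.read_sql or SQLAlchemy):

```python
import csv
import io
import sqlite3

# Extract: a hypothetical raw feed, inlined here as CSV text.
raw = io.StringIO("order_id,amount\n1, 10.50 \n2,not_a_number\n3,7.25\n")

rows = []
for rec in csv.DictReader(raw):
    try:
        # Transform: strip whitespace and coerce to numeric types; bad
        # rows are dropped (a real job would route them to a reject table).
        rows.append((int(rec["order_id"]), float(rec["amount"].strip())))
    except ValueError:
        continue

# Load into a target table (sqlite3 stands in for the warehouse).
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE orders (order_id INTEGER, amount REAL)")
con.executemany("INSERT INTO orders VALUES (?, ?)", rows)
total = con.execute("SELECT COUNT(*), SUM(amount) FROM orders").fetchone()
print(total)  # -> (2, 17.75)
```

The try/except around the type coercion is the miniature version of the data-debugging and RCA work the listing emphasizes: every rejected row is a clue about the upstream source.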

Posted 3 weeks ago

Apply

1.0 years

0 Lacs

Bengaluru, Karnataka, India

Remote

Remote Work: Hybrid

Overview:
At Zebra, we are a community of innovators who come together to create new ways of working that make everyday life better. United by curiosity and care, we develop dynamic solutions that anticipate our customers' and partners' needs and solve their challenges. Being a part of Zebra Nation means being seen, heard, valued, and respected. Drawing from our diverse perspectives, we collaborate to deliver on our purpose. Here you are part of a team pushing boundaries to redefine the work of tomorrow for organizations, their employees, and those they serve. You have opportunities to learn and lead at a forward-thinking company, defining your path to a fulfilling career while channeling your skills toward causes that you care about, locally and globally. We've only begun reimagining the future for our people, our customers, and the world. Let's create tomorrow together.

A Data Engineer designs, develops, programs, and implements machine learning solutions; implements artificial/augmented intelligence systems, agentic workflows, and data engineering workflows; and performs statistical modelling and measurement by applying data engineering, feature engineering, statistical methods, ML modelling, and AI techniques to structured, unstructured, and diverse "big data" sources of machine-acquired data, generating actionable insights and foresights for real-life business problem solutions and for product feature development and enhancement. A strong understanding of databases, SQL, cloud technologies, and modern data integration and orchestration tools like Azure Data Factory (ADF) is required to succeed in this role.
Responsibilities:
- Integrate state-of-the-art machine learning algorithms and develop new methods.
- Develop tools to support analysis and visualization of large datasets.
- Develop and code software programs; implement industry-standard AutoML models (speech, computer vision, text data, LLM), statistical models, relevant ML models (device/machine-acquired data), and AI models and algorithms.
- Identify meaningful foresights based on predictive ML models from large data and metadata sources; interpret and communicate foresights, insights, and findings from experiments to product managers, service managers, business partners, and business managers.
- Make use of rapid development tools (business intelligence tools, graphics libraries, data modelling tools) to effectively communicate research findings to relevant stakeholders using visual graphics, data models, machine learning model features, and feature engineering/transformations.
- Analyze, review, and track trends and tools in the data science, machine learning, artificial intelligence, and IoT space.
- Interact with cross-functional teams to identify questions and issues for data engineering and machine learning model feature engineering.
- Evaluate and make recommendations to evolve data collection mechanisms to improve the efficacy of machine learning model predictions.
- Meet with customers, partners, product managers, and business leaders to present findings, predictions, and foresights; gather customer-specific requirements for business problems/processes; identify data collection constraints and alternatives for implementation of models.
- Working knowledge of MLOps, LLMs, and agentic AI/workflows.
- Programming skills: proficiency in Python and experience with ML frameworks like TensorFlow and PyTorch.
- LLM expertise: hands-on experience in training, fine-tuning, and deploying LLMs.
- Foundational model knowledge: strong understanding of open-weight LLM architectures, including training methodologies, fine-tuning techniques, hyperparameter optimization, and model distillation.
- Data pipeline development: strong understanding of data engineering concepts, feature engineering, and workflow automation using Airflow or Kubeflow.
- Cloud & MLOps: experience deploying ML models in cloud environments like AWS, GCP (Google Vertex AI), or Azure using Docker and Kubernetes.
- Design and implement predictive and optimisation models incorporating diverse data types; strong in SQL and Azure Data Factory (ADF).

Qualifications:
Minimum education: Bachelor's, Master's, or Ph.D. degree in Computer Science or Engineering.

Minimum work experience:
- 1+ years of experience programming with at least one of the following languages: Python, Scala, Go.
- 1+ years of experience in SQL and data transformation.
- 1+ years of experience developing distributed systems using open-source technologies such as Spark and Dask.
- 1+ years of experience with relational or NoSQL databases running in Linux environments (MySQL, MariaDB, PostgreSQL, MongoDB, Redis).

Key Skills and Competencies:
- Experience working with AWS/Azure/GCP environments is highly desired.
- Experience with data models in the retail and consumer products industry is desired.
- Experience working on agile projects and an understanding of agile concepts is desired.
- Demonstrated ability to learn new technologies quickly and independently.
- Excellent verbal and written communication skills, especially in technical communications.
- Ability to work toward and achieve stretch goals in a very innovative and fast-paced environment.
- Ability to work collaboratively in a diverse team environment.
- Ability to telework.
- Expected travel: none.

To protect candidates from falling victim to online fraudulent activity involving fake job postings and employment offers, please be aware that our recruiters will always connect with you via @zebra.com email accounts.
Applications are accepted only through our applicant tracking system, and personal identifying information is collected only through that system. Our Talent Acquisition team will not ask you to provide personal identifying information via e-mail or outside of the system. If you are a victim of identity theft, contact your local police department.
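The statistical modelling this role describes can be illustrated at its smallest scale with an ordinary least-squares line fit in plain Python; the "sensor" numbers are invented, and real work would use scikit-learn, TensorFlow, or PyTorch on far larger feature sets:

```python
# Minimal least-squares fit in pure Python -- a stand-in for the kind of
# statistical modelling on machine-acquired data the role describes.
xs = [0, 1, 2, 3, 4]            # hypothetical device uptime (hours)
ys = [1.0, 3.1, 4.9, 7.2, 8.8]  # hypothetical sensor drift readings

n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n

# Closed-form OLS: slope = cov(x, y) / var(x), intercept from the means.
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
         / sum((x - mean_x) ** 2 for x in xs))
intercept = mean_y - slope * mean_x

print(round(slope, 2), round(intercept, 2))  # -> 1.97 1.06
```

The fitted line then yields the "foresights" the listing talks about, e.g. projecting drift at hour 10 as `slope * 10 + intercept`.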

Posted 3 weeks ago

Apply

6.0 years

0 Lacs

India

On-site

Sr. Python Data Engineer (6 to 12 Years' Experience)

Responsibilities:
- Design and develop data pipelines and ETL processes.
- Collaborate with data scientists and analysts to understand data needs.
- Maintain and optimize data warehousing solutions.
- Ensure data quality and integrity throughout the data lifecycle.
- Develop and implement data validation and cleansing routines.
- Work with large datasets from various sources.
- Automate repetitive data tasks and processes.
- Monitor data systems and troubleshoot issues as they arise.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Proven experience as a Data Engineer or in a similar role (minimum 6+ years' experience as a Data Engineer).
- Strong proficiency in Python and PySpark.
- Excellent problem-solving abilities.
- Strong communication skills to collaborate with team members and stakeholders.
- Individual contributor.

Technical Skills Required:
- Expert: Python, PySpark, SQL/Snowflake, Azure.
- Advanced: data warehousing, data pipeline design, data quality, data validation, data cleansing.
- Intermediate/Basic: Microsoft Fabric, ADF, Databricks, master data management/data governance, data mesh, data lake/lakehouse architecture.
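A validation-and-cleansing routine of the kind listed in the responsibilities might look like this in miniature; the records and rules (null rejection, key dedupe, case normalization) are illustrative only, and production code would apply the same steps with PySpark or pandas over warehouse-scale data:

```python
# Hypothetical inbound records with typical quality problems.
records = [
    {"id": 1, "email": "a@example.com"},
    {"id": 1, "email": "a@example.com"},     # duplicate key
    {"id": 2, "email": None},                # missing value
    {"id": 3, "email": "  C@Example.com "},  # needs standardizing
]

seen, clean, rejected = set(), [], []
for rec in records:
    if rec["email"] is None:
        rejected.append(rec)   # route nulls to a reject queue for review
        continue
    # Standardize: trim whitespace and lowercase the email.
    rec = {**rec, "email": rec["email"].strip().lower()}
    if rec["id"] in seen:      # drop duplicates on the business key
        continue
    seen.add(rec["id"])
    clean.append(rec)

print(len(clean), len(rejected))  # -> 2 1
```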

Posted 3 weeks ago

Apply

6.0 years

5 - 7 Lacs

Gurgaon

On-site

About Gartner IT:
Join a world-class team of skilled engineers who build creative digital solutions to support our colleagues and clients. We make a broad organizational impact by delivering cutting-edge technology solutions that power Gartner. Gartner IT values its culture of nonstop innovation, an outcome-driven approach to success, and the notion that great ideas can come from anyone on the team.

About the role:
The Lead Analytics Engineer will provide technical expertise in designing and building a modern data warehouse in the Azure cloud to meet the data needs of various business units in Gartner. You will be part of the Ingestion Team, bringing data from multiple sources into the data warehouse and collaborating with the Dashboard, Analytics & Business teams to build end-to-end scalable data pipelines.

What you will do:
- Review and analyze business requirements and design technical mapping documents.
- Build new ETL pipelines using Azure Data Factory and Synapse.
- Design, build, and automate data pipelines and applications to support data scientists and business users with their reporting and analytics needs.
- Collaborate on data warehouse architecture and technical design discussions.
- Perform and participate in code reviews, peer inspections, and technical design and specification work, as well as document and review detailed designs.
- Provide status reports to higher management.
- Help define best practices and processes.
- Maintain service levels and department goals for problem resolution.
- Design and build tabular data models in Azure Analysis Services for seamless integration with Power BI.
- Write efficient SQL queries and DAX (Data Analysis Expressions) to support robust data models, reports, and dashboards.
- Tune and optimize data models and queries for maximum performance and efficient data retrieval.
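The query-tuning duty above can be miniaturized with Python's built-in sqlite3, which exposes its execution plan much as Synapse does: the same predicate flips from a full table scan to an index seek once a suitable index exists. The fact table and index names here are made up for the sketch:

```python
import sqlite3

# A toy fact table standing in for a warehouse fact table.
con = sqlite3.connect(":memory:")
con.execute(
    "CREATE TABLE fact_sales (sale_date TEXT, region TEXT, amount REAL)")

query = "SELECT SUM(amount) FROM fact_sales WHERE region = 'EMEA'"

# Before indexing: the planner can only scan the whole table.
plan = " ".join(r[-1] for r in con.execute("EXPLAIN QUERY PLAN " + query))
print("SCAN" in plan)  # -> True

# Add an index on the filter column, then re-check the plan.
con.execute("CREATE INDEX idx_fact_sales_region ON fact_sales (region)")
plan = " ".join(r[-1] for r in con.execute("EXPLAIN QUERY PLAN " + query))
print("USING INDEX" in plan)  # -> True: the planner now seeks the index
```

Reading the plan before and after a change is the same loop an analytics engineer runs against Synapse or SQL Server execution plans, just at toy scale.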
What you will need:
- 6-8 years' experience in data warehouse design and development.
- Experience in ETL using Azure Data Factory (ADF).
- Experience writing complex T-SQL procedures in Synapse/SQL Data Warehouse.
- Experience analyzing complex code and performance-tuning pipelines.
- Good knowledge of Azure cloud technology and exposure to Azure cloud components.
- Good understanding of business processes and of analyzing underlying data.
- Understanding of dimensional and relational modeling.

Nice to have:
- Experience with version control systems (e.g., Git, Subversion).
- Power BI and AAS experience for tabular model design.
- Experience with data intelligence platforms like Databricks.

Who you are:
- Effective time management skills and ability to meet deadlines.
- Excellent communication skills interacting with technical and business audiences.
- Excellent organization, multitasking, and prioritization skills.
- Willingness and aptitude to embrace new technologies and ideas and master concepts rapidly.
- Intellectual curiosity, passion for technology, and keeping up with new trends.
- Delivering project work on time and within budget with high quality.

Don't meet every single requirement? We encourage you to apply anyway. You might just be the right candidate for this or other roles. #LI-PM3

Who are we?
At Gartner, Inc. (NYSE: IT), we guide the leaders who shape the world. Our mission relies on expert analysis and bold ideas to deliver actionable, objective insight, helping enterprise leaders and their teams succeed with their mission-critical priorities. Since our founding in 1979, we've grown to more than 21,000 associates globally who support ~14,000 client enterprises in ~90 countries and territories. We do important, interesting, and substantive work that matters. That's why we hire associates with the intellectual curiosity, energy, and drive to want to make a difference. The bar is unapologetically high. So is the impact you can have here.

What makes Gartner a great place to work?
Our sustained success creates limitless opportunities for you to grow professionally and flourish personally. We have a vast, virtually untapped market potential ahead of us, providing you with an exciting trajectory long into the future. How far you go is driven by your passion and performance. We hire remarkable people who collaborate and win as a team. Together, our singular, unifying goal is to deliver results for our clients. Our teams are inclusive and composed of individuals from different geographies, cultures, religions, ethnicities, races, genders, sexual orientations, abilities, and generations. We invest in great leaders who bring out the best in you and the company, enabling us to multiply our impact and results. This is why, year after year, we are recognized worldwide as a great place to work.

What do we offer?
Gartner offers world-class benefits, highly competitive compensation, and disproportionate rewards for top performers. In our hybrid work environment, we provide the flexibility and support for you to thrive, working virtually when it's productive to do so and getting together with colleagues in a vibrant community that is purposeful, engaging, and inspiring. Ready to grow your career with Gartner? Join us.

The policy of Gartner is to provide equal employment opportunities to all applicants and employees without regard to race, color, creed, religion, sex, sexual orientation, gender identity, marital status, citizenship status, age, national origin, ancestry, disability, veteran status, or any other legally protected status, and to seek to advance the principles of equal employment opportunity. Gartner is committed to being an Equal Opportunity Employer and offers opportunities to all job seekers, including job seekers with disabilities.
If you are a qualified individual with a disability or a disabled veteran, you may request a reasonable accommodation if you are unable or limited in your ability to use or access the Company's career webpage as a result of your disability. You may request reasonable accommodations by calling Human Resources at +1 (203) 964-0096 or by sending an email to ApplicantAccommodations@gartner.com.

Job Requisition ID: 101545

By submitting your information and application, you confirm that you have read and agree to the country or regional recruitment notice linked below applicable to your place of residence. Gartner Applicant Privacy Link: https://jobs.gartner.com/applicant-privacy-policy

For efficient navigation through the application, please only use the back button within the application, not the back arrow within your browser.

Posted 3 weeks ago

Apply

7.0 years

4 - 9 Lacs

Noida

On-site

Posted On: 11 Jul 2025
Location: Noida, UP, India
Company: Iris Software

Why Join Us?
Are you inspired to grow your career at one of India's Top 25 Best Workplaces in the IT industry? Do you want to do the best work of your life at one of the fastest-growing IT services companies? Do you aspire to thrive in an award-winning work culture that values your talent and career aspirations? It's happening right here at Iris Software.

About Iris Software:
At Iris Software, our vision is to be our clients' most trusted technology partner, and the first choice for the industry's top professionals to realize their full potential. With over 4,300 associates across India, the U.S.A., and Canada, we help our enterprise clients thrive with technology-enabled transformation across financial services, healthcare, transportation & logistics, and professional services. Our work covers complex, mission-critical applications built with the latest technologies, such as high-value complex application and product engineering, data and analytics, cloud, DevOps, data and MLOps, quality engineering, and business automation.

Working at Iris:
Be valued, be inspired, be your best. At Iris Software, we invest in and create a culture where colleagues feel valued, can explore their potential, and have opportunities to grow. Our employee value proposition (EVP) is about "Being Your Best" as a professional and person. It is about being challenged by work that inspires us, being empowered to excel and grow in your career, and being part of a culture where talent is valued. We're a place where everyone can discover and be their best version.

Job Description
Technical Expertise (Must Have):
- Experience with the Emma orchestration engine is a must.
- Proficient in Python programming, with experience in agentic platforms ranging from pro-code (e.g., Autogen, Semantic Kernel, LangGraph) to low-code (e.g., Crew.ai, EMA.ai).
- Hands-on experience with Azure OpenAI and related tools and services.
Fluent in GenAI packages like LlamaIndex and LangChain. Soft Skills: Excellent communication and collaboration skills, with the ability to work effectively with stakeholders across business and technical teams. Strong problem-solving and analytical skills. Attention to detail. Ability to work with teams in a dynamic, fast-paced environment. Experience: 7 to 10 years of experience in software development, with 3+ years in AI/ML or Generative AI projects. Demonstrated experience in deploying and managing AI applications in production environments. Key Responsibilities: Design, develop, and implement complex Generative AI solutions with high accuracy and for complex use cases. Utilize agentic platforms from pro-code (e.g., Autogen, Semantic Kernel, LangGraph) to low-code (e.g., Crew.ai, EMA.ai). Leverage Azure OpenAI ecosystems and tooling, including training models, advanced prompting, the Assistant API, and agent curation. Write efficient, clean, and maintainable Python code for AI applications. Develop and deploy RESTful APIs using frameworks like Flask or Django for model integration and consumption. Fine-tune and optimize AI models for business use cases. Mandatory Competencies Programming Language - Python - Django Programming Language - Python - Flask Data Science and Machine Learning - Data Science and Machine Learning - Gen AI Data Science and Machine Learning - Data Science and Machine Learning - AI/ML Cloud - Azure - Azure Data Factory (ADF), Azure Databricks, Azure Data Lake Storage, Event Hubs, HDInsight Beh - Communication and collaboration Perks and Benefits for Irisians At Iris Software, we offer world-class benefits designed to support the financial, health and well-being needs of our associates and to help them achieve harmony between their professional and personal growth. 
From comprehensive health insurance and competitive salaries to flexible work arrangements and ongoing learning opportunities, we're committed to providing a supportive and rewarding work environment. Join us and experience the difference of working at a company that values its employees' success and happiness.
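The agentic-platform experience this posting asks for can be illustrated with a minimal tool-calling loop. This is a pure-Python sketch with the model stubbed out; a real implementation would call Azure OpenAI or an orchestrator such as LangGraph. Every name here (`Agent`, `fake_llm`, the `TOOL:` convention) is an illustrative assumption, not part of any real API:

```python
from dataclasses import dataclass, field
from typing import Callable, Dict

@dataclass
class Agent:
    """Minimal tool-calling agent loop; the LLM call is a stand-in stub."""
    llm: Callable[[str], str]                       # stand-in for a chat-completion call
    tools: Dict[str, Callable[[str], str]] = field(default_factory=dict)

    def run(self, prompt: str) -> str:
        # Ask the "model" what to do; a real system would parse a structured
        # tool-call response from the model instead of this string convention.
        decision = self.llm(prompt)
        if decision.startswith("TOOL:"):
            name, _, arg = decision[5:].partition(" ")
            return self.tools[name](arg)
        return decision

# Stub model: routes any prompt mentioning "weather" to the weather tool.
def fake_llm(prompt: str) -> str:
    return "TOOL:weather London" if "weather" in prompt else "No tool needed."

agent = Agent(llm=fake_llm, tools={"weather": lambda city: f"Sunny in {city}"})
print(agent.run("What is the weather in London?"))  # Sunny in London
```

The same dispatch shape underlies most agentic frameworks: the model decides, the runtime executes the chosen tool, and the result is fed back or returned.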

Posted 3 weeks ago

Apply

4.0 - 6.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Summary Position Summary Senior Analyst, Data-Marts & Reporting - Reporting and Analytics – Digital Data Analytics Innovation - Deloitte Support Services India Private Limited Are you a quick learner with a willingness to work with new technologies? The Data-Marts and Reporting team offers you a unique opportunity to be an integral part of the Datamarts & Reporting – CoRe Digital | Data | Analytics | Innovation Group. The principal focus of this group is the research, development, maintenance, and documentation of customized solutions that e-enable the delivery of cutting-edge technology to the firm's business centers. Work you will do As a Senior Analyst, you will research and develop solutions built on varied technologies like Microsoft SQL Server, the MSBI Suite, MS Azure SQL, Tableau, and .NET. You will support a team which provides high-quality solutions to customers by following a streamlined system development methodology. In the process of acquainting yourself with various development tools, testing tools, methodologies, and processes, you will be aligned to the following role: Role: Datamart Solution Senior Analyst As a Datamart Solution Senior Analyst, you will be responsible for delivering technical solutions for building high-performing datamart and reporting tools using technologies like Microsoft SQL Server, the MSBI Suite, MS Azure SQL, Tableau, and .NET. Your key responsibilities include: Interact with end users to gather, document, and interpret requirements. Leverage requirements to design technical solutions. Develop SQL objects and scripts based on design. Analyze, debug, and optimize existing stored procedures and views. Leverage indexes, performance-tuning techniques, and error handling to improve the performance of SQL scripts. Create and modify SSIS packages and ADF pipelines for transferring data between various systems across cloud and on-premises environments. Should be able to work seamlessly with different Azure services. 
Improve performance and identify opportunities to improve processes and bring efficiency to SQL, SSIS, and ADF. Create, schedule, and monitor SQL jobs. Build interactive visualizations in Tableau for leadership reporting. Proactively prioritize activities, handle tasks, and deliver quality solutions on time. Communicate clearly and regularly with team leadership and project teams. Manage ongoing deliverable timelines and own relationships with end clients to understand whether deliverables continue to meet clients' needs. Work collaboratively with other team members and end clients throughout the development life cycle. Research, learn, implement, and share skills on new technologies. Understand customer requirements well and provide status updates to the project lead (US/USI) on calls and emails efficiently. Guide junior team members to get them up to speed on the domain, tools, and technologies we work on. Continuously improve skills in this space by completing certifications and recommended training. Obtain and maintain a thorough understanding of the MDM data model and Global Systems & Client attributes. Good understanding of MVC .NET and SharePoint front-end solutions; knowledge of full-stack development is good to have. The team CoRe - Digital Data Analytics Innovation (DDAI) sits inside Deloitte's global shared services organization. Qualifications and experience Required: Educational Qualification: B.E./B.Tech or M.Tech (60% or 6.5 GPA and above) Should be proficient in one or more of the following technologies: knowledge of DBMS concepts and exposure to querying any relational database, preferably MS SQL Server, MS Azure SQL, SSIS, or Tableau. Knowledge of a coding language like C#.NET or VB.NET would be an added advantage. Understands development methodology and lifecycle. 
Excellent analytical skills and communication skills (written, verbal, and presentation) Ability to chart one's own career and build networks within the organization Ability to work both independently and as part of a team with professionals at all levels Ability to prioritize tasks, work on multiple assignments, and raise concerns/questions where appropriate Seek information/ideas and establish relationships with customers to assess future opportunities Total Experience: 4-6 years of overall experience At least 3 years of experience in database development, ETL, and reporting Skill set Required: SQL Server, MS Azure SQL, Azure Data Factory, SSIS, Azure Synapse, data warehousing & BI Preferred: Tableau, .NET Good to have: MVC .NET, SharePoint front-end solutions. Location: Hyderabad Work hours: 2 p.m. – 11 p.m. How you will grow At Deloitte, we have invested a great deal to create a rich environment in which our professionals can grow. We want all our people to develop in their own way, playing to their own strengths as they hone their leadership skills. And, as a part of our efforts, we provide our professionals with a variety of learning and networking opportunities—including exposure to leaders, sponsors, coaches, and challenging assignments—to help accelerate their careers along the way. No two people learn in exactly the same way. So, we provide a range of resources, including live classrooms, team-based learning, and eLearning. Deloitte University (DU): The Leadership Center in India, our state-of-the-art, world-class learning center in the Hyderabad office, is an extension of the DU in Westlake, Texas, and represents a tangible symbol of our commitment to our people's growth and development. Explore DU: The Leadership Center in India. Benefits At Deloitte, we know that great people make a great organization. We value our people and offer employees a broad range of benefits. Learn more about what working at Deloitte can mean for you. 
Deloitte’s culture Our positive and supportive culture encourages our people to do their best work every day. We celebrate individuals by recognizing their uniqueness and offering them the flexibility to make daily choices that can help them to be healthy, centered, confident, and aware. We offer well-being programs and are continuously looking for new ways to maintain a culture that is inclusive, invites authenticity, leverages our diversity, and where our people excel and lead healthy, happy lives. Learn more about Life at Deloitte. Corporate citizenship Deloitte is led by a purpose: to make an impact that matters. This purpose defines who we are and extends to relationships with our clients, our people, and our communities. We believe that business has the power to inspire and transform. We focus on education, giving, skill-based volunteerism, and leadership to help drive positive social impact in our communities. Learn more about Deloitte’s impact on the world. Disclaimer: Please note that this description is subject to change basis business/project requirements and at the discretion of the management. #EAG-Core Our purpose Deloitte’s purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities. Our people and culture Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work. 
Professional development At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, represents a tangible symbol of our commitment to the holistic growth and development of our people. Explore DU: The Leadership Center in India. Benefits To Help You Thrive At Deloitte, we know that great people make a great organization. Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that empowers our professionals to thrive mentally, physically, and financially—and live their purpose. To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment and/or other criteria. Learn more about what working at Deloitte can mean for you. Recruiting tips From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters. Requisition code: 300780
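Several listings on this page, including the role above, revolve around incremental data loads and upserts built with SQL, SSIS, and ADF. The core pattern can be sketched in a few lines; this sketch uses SQLite's UPSERT purely as a runnable stand-in for SQL Server's MERGE, and the table and column names are made up for illustration:

```python
import sqlite3

# Illustrative incremental-load (upsert) pattern: in SQL Server this would be a
# MERGE statement inside a stored procedure invoked by an SSIS/ADF pipeline.
def upsert_rows(conn, rows):
    conn.executemany(
        """INSERT INTO dim_customer (id, name) VALUES (?, ?)
           ON CONFLICT(id) DO UPDATE SET name = excluded.name""",
        rows,
    )

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE dim_customer (id INTEGER PRIMARY KEY, name TEXT)")
upsert_rows(conn, [(1, "Acme"), (2, "Globex")])           # initial load
upsert_rows(conn, [(2, "Globex Ltd"), (3, "Initech")])    # update one, insert one
print(conn.execute("SELECT id, name FROM dim_customer ORDER BY id").fetchall())
# [(1, 'Acme'), (2, 'Globex Ltd'), (3, 'Initech')]
```

The same "insert new keys, update changed keys" shape is what an ADF copy activity with an upsert sink, or an SSIS lookup-plus-conditional-split package, implements at scale.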

Posted 3 weeks ago

Apply

7.0 - 11.0 years

0 Lacs

haryana

On-site

As a Senior Manager specializing in Data Analytics & AI, you will be a pivotal member of the EY Data, Analytics & AI Ireland team. Your role as a Databricks Platform Architect will involve enabling clients to extract significant value from their information assets through innovative data analytics solutions. You will have the opportunity to work across various industries, collaborating with diverse teams and leading the design and implementation of data architecture strategies aligned with client goals. Your key responsibilities will include leading teams with varying skill sets in utilizing different Data and Analytics technologies, adapting your leadership style to meet client needs, creating a positive learning culture, engaging with clients to understand their data requirements, and developing data artefacts based on industry best practices. Additionally, you will assess existing data architectures, develop data migration strategies, and ensure data integrity and minimal disruption during migration activities. To qualify for this role, you must possess a strong academic background in computer science or related fields, along with at least 7 years of experience as a Data Architect or similar role in a consulting environment. Hands-on experience with cloud services, data modeling techniques, data management concepts, Python, Spark, Docker, Kubernetes, and cloud security controls is essential. Ideally, you will have the ability to effectively communicate technical concepts to non-technical stakeholders, lead the design and optimization of the Databricks platform, work closely with the data engineering team, maintain a comprehensive understanding of the data pipeline, and stay updated on new and emerging technologies in the field. EY offers a competitive remuneration package, flexible working options, career development opportunities, and a comprehensive Total Rewards package. 
Additionally, you will benefit from support, coaching, opportunities for skill development, and a diverse and inclusive culture that values individual contributions. If you are passionate about leveraging data to solve complex problems, drive business outcomes, and contribute to a better working world, consider joining EY as a Databricks Platform Architect. Apply now to be part of a dynamic team dedicated to innovation and excellence.

Posted 3 weeks ago

Apply

10.0 - 14.0 years

0 Lacs

chennai, tamil nadu

On-site

As a Senior/Senior Principal Support Engineer - EBS Apps Developer within the Customer Success Services (CSS) team at Oracle, you will be joining a group of highly skilled technical experts who support over 6,000 companies globally. Your role will involve building and maintaining technical landscapes for clients through tailored support services. The ideal candidate should possess strong technical knowledge in Oracle applications, SQL, PL-SQL, and OAF, XML, Oracle Forms and Reports, among others. Additionally, experience in Oracle Fusion and Oracle On-Premise Applications, integration web services, and Oracle Cloud is required. Your responsibilities will include developing technical solutions to meet business requirements, resolving key issues related to code change requirements and bug fixes, and supporting Oracle ERP products and services. You will also design and build OTBI and BIP reports, configure Alerts using Bursting, and collaborate with functional teams to understand reporting needs. Furthermore, you will maintain and optimize Oracle Fusion SaaS solutions, provide support for existing BI reports, and ensure compliance with Oracle Fusion best practices. The ideal candidate should have a minimum of 10 years of relevant experience, excellent problem-solving and troubleshooting skills, and the ability to work effectively in a team. Strong communication and teamwork skills, self-driven and result-oriented mindset, and the ability to keep track of schedules are essential for this role. You should also be willing to work 24x7 on-call and travel onsite to customers as needed. In addition to technical capabilities, the ideal candidate should be able to work independently on CEMLI objects, investigate, analyze, design, and develop solutions for enhancements/developments related to CEMLIs, and lead the support team in Incident and Problem Management. 
Understanding customer requirements, having hands-on knowledge of Oracle EBS R12 and Fusion/SaaS modules, and good knowledge of business processes and application setups are also important qualifications for this role. If you are passionate about problem-solving, enthusiastic about learning cutting-edge technologies, and customer-centric, this role offers a professional context where you can constantly develop yourself and stay in touch with innovative technologies in on-prem and cloud environments. Join us at Oracle and be part of a global team committed to empowering innovation and inclusivity in the workplace.

Posted 3 weeks ago

Apply

3.0 years

0 Lacs

Kochi, Kerala, India

On-site

We are seeking an experienced and hands-on Senior Azure Data Engineer with Power BI expertise to take on a dual role that combines technical leadership and active development. You will lead BI and data engineering efforts for enterprise-grade analytics solutions using Power BI, Azure Data Services, and Databricks, contributing to both design and delivery. This position offers the opportunity to work in a collaborative environment and play a key role in shaping our cloud-first data architecture. Key Responsibilities Lead and participate in the design, development, and deployment of scalable BI and data engineering solutions. Build robust data pipelines and integrations using Azure Data Factory (ADF) and Databricks (Python/Scala). Develop interactive and insightful Power BI dashboards and reports, with strong expertise in DAX and Power Query. Collaborate with stakeholders to gather business requirements and translate them into technical solutions. Optimize data models and ensure best practices in data governance, security, and performance tuning. Manage and enhance data warehousing solutions, ensuring data consistency, availability, and reliability. Work in an Agile/Scrum environment with product owners, data analysts, and engineers. Mentor junior developers and ensure code quality and development standards are maintained. Technical Must-Haves Power BI : 3+ years of hands-on experience in developing complex dashboards and DAX queries. Databricks (Python/Scala) : 3 to 4+ years of experience in building scalable data engineering solutions. Azure Data Factory (ADF) : Strong experience in orchestrating and automating data workflows. SQL : Advanced skills in writing and optimizing queries for data extraction and transformation. Solid understanding of data warehousing concepts, star/snowflake schemas, and ETL/ELT practices. Nice To Have Knowledge of Azure Synapse Analytics, Azure Functions, or Logic Apps. Experience with CI/CD pipelines for data deployments. 
Familiarity with Azure DevOps, Git, or similar version control systems. Exposure to data lake architecture, Delta Lake, or medallion architecture (ref:hirist.tech)
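The bronze-to-silver cleanup work this posting describes (deduplication, type enforcement, standardisation) can be sketched in plain Python; in Databricks the same step would typically be PySpark calls such as `dropDuplicates` and `withColumn`. All field names below are hypothetical:

```python
# Pure-Python stand-in for a typical Databricks "bronze -> silver" cleanup step;
# in PySpark this maps to df.dropDuplicates(["order_id"]) plus withColumn casts.
def to_silver(bronze_rows):
    seen, silver = set(), []
    for row in bronze_rows:                  # deduplicate on the business key
        if row["order_id"] in seen:
            continue
        seen.add(row["order_id"])
        silver.append({
            "order_id": row["order_id"],
            "amount": float(row["amount"]),              # enforce a numeric type
            "country": row["country"].strip().upper(),   # standardise country codes
        })
    return silver

bronze = [
    {"order_id": 1, "amount": "10.5", "country": " in"},
    {"order_id": 1, "amount": "10.5", "country": " in"},   # duplicate event
    {"order_id": 2, "amount": "7", "country": "us "},
]
print(to_silver(bronze))
```

In a medallion architecture the raw events stay in bronze for replay, while this cleansed silver output is what downstream Power BI models and DAX measures should read from.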

Posted 3 weeks ago

Apply

8.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Job Description We are looking for a highly skilled and experienced Snowflake Architect to lead the design, development, and deployment of enterprise-grade cloud data solutions. The ideal candidate will have a strong background in data architecture, cloud data platforms, and Snowflake implementation, with hands-on experience in end-to-end data pipeline and data warehouse design. This role requires strategic thinking, technical leadership, and the ability to work collaboratively across cross-functional teams. Responsibilities: Lead the architecture, design, and implementation of scalable Snowflake-based data warehousing solutions. Define data modeling standards, best practices, and governance frameworks. Design and optimize ETL/ELT pipelines using tools like Snowpipe, Azure Data Factory, Informatica, or DBT. Collaborate with stakeholders to understand data requirements and translate them into robust architectural solutions. Implement data security, privacy, and role-based access controls within Snowflake. Guide development teams on performance tuning, query optimization, and cost management in Snowflake. Ensure high availability, fault tolerance, and compliance across data platforms. Mentor developers and junior architects on Snowflake capabilities. Skills & Experience: 8+ years of overall experience in data engineering, BI, or data architecture, with at least 3+ years of hands-on Snowflake experience. Expertise in Snowflake architecture, data sharing, virtual warehouses, clustering, and performance optimization. Strong experience with SQL, Python, and cloud data services (e.g., AWS, Azure, or GCP). Hands-on experience with ETL/ELT tools like ADF, Informatica, Talend, DBT, or Matillion. Good understanding of data lakes, data mesh, and modern data stack principles. Experience with CI/CD for data pipelines, DevOps, and data quality frameworks. 
Solid knowledge of data governance, metadata management, and cataloging. Good to Have: Snowflake certification (e.g., SnowPro Core/Advanced Architect). Familiarity with Apache Airflow, Kafka, or event-driven data ingestion. Knowledge of data visualization tools such as Power BI, Tableau, or Looker. Experience in healthcare, BFSI, or retail domain projects (ref:hirist.tech)
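The role-based access controls mentioned in the responsibilities above typically reduce to a small, repeatable set of Snowflake GRANT statements. Here is a hedged sketch that generates them; the role, database, and schema names are invented for illustration:

```python
# Generates the Snowflake GRANT statements for a simple read-only / read-write
# role pattern. Object names are hypothetical; real deployments usually add
# FUTURE grants and warehouse usage as well.
def grants_for_role(role, database, schema, read_only=True):
    stmts = [
        f"GRANT USAGE ON DATABASE {database} TO ROLE {role};",
        f"GRANT USAGE ON SCHEMA {database}.{schema} TO ROLE {role};",
        f"GRANT SELECT ON ALL TABLES IN SCHEMA {database}.{schema} TO ROLE {role};",
    ]
    if not read_only:
        stmts.append(
            f"GRANT INSERT, UPDATE, DELETE ON ALL TABLES IN SCHEMA "
            f"{database}.{schema} TO ROLE {role};"
        )
    return stmts

for stmt in grants_for_role("ANALYST_RO", "SALES_DB", "MART"):
    print(stmt)
```

Generating grants from code rather than hand-writing them keeps the access model auditable, which is the point of the governance frameworks the posting asks the architect to define.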

Posted 3 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

karnataka

On-site

You will be responsible for designing, developing, and implementing data-centric software solutions using various technologies. This includes conducting code reviews, recommending best coding practices, and providing effort estimates for the proposed solutions. Additionally, you will design audit business-centric software solutions and maintain comprehensive documentation for all proposed solutions. As a key member of the team, you will lead architecture and design efforts for product development and application development for relevant use cases. You will provide guidance and support to team members and clients, implementing best practices of data engineering and architectural solution design, development, testing, and documentation. Your role will require you to participate in team meetings, brainstorming sessions, and project planning activities. It is essential to stay up-to-date with the latest advancements in the data engineering area to drive innovation and maintain a competitive edge. You will stay hands-on with the design, development, and validation of systems and models deployed. Collaboration with audit professionals to understand business, regulatory, and risk requirements, as well as key alignment considerations for audit, is a crucial aspect of the role. Driving efforts in the data engineering and architecture practice area will be a key responsibility. In terms of mandatory technical and functional skills, you should have a deep understanding of RDBMS (MS SQL Server, Oracle, etc.), strong programming skills in T-SQL, and proven experience in ETL and reporting (MSBI stack/Cognos/Informatica, etc.). Additionally, experience with cloud-centric databases (Azure SQL/AWS RDS), ADF (Azure Data Factory), data warehousing skills using Synapse/Redshift, understanding and implementation experience of data lakes, and experience in large data processing/ingestion using Databricks APIs, Lakehouse, etc., are required. 
Knowledge of MPP databases like Snowflake or Postgres-XL is also essential. Preferred technical and functional skills include an understanding of financial accounting, experience with NoSQL using MongoDB/Cosmos DB, Python coding experience, and an aptitude for emerging data platform technologies like MS Azure Fabric. Key behavioral attributes required for this role include strong analytical, problem-solving, and critical-thinking skills, excellent collaboration skills, the ability to work effectively in a team-oriented environment, excellent written and verbal communication skills, and a willingness to learn and work on new technologies.

Posted 3 weeks ago

Apply

0.0 - 2.0 years

0 Lacs

Bengaluru, Karnataka, India

Remote

Remote Work: Hybrid Overview: At Zebra, we are a community of innovators who come together to create new ways of working to make everyday life better. United by curiosity and care, we develop dynamic solutions that anticipate our customers’ and partners’ needs and solve their challenges. Being a part of Zebra Nation means being seen, heard, valued, and respected. Drawing from our diverse perspectives, we collaborate to deliver on our purpose. Here you are a part of a team pushing boundaries to redefine the work of tomorrow for organizations, their employees, and those they serve. You have opportunities to learn and lead at a forward-thinking company, defining your path to a fulfilling career while channeling your skills toward causes that you care about – locally and globally. We’ve only begun reimagining the future – for our people, our customers, and the world. Let’s create tomorrow together. A Data Scientist will be responsible for designing, developing, programming, and implementing machine learning solutions; implementing artificial/augmented intelligence systems, agentic workflows, and data engineering workflows; and performing statistical modelling and measurement by applying data engineering, feature engineering, statistical methods, ML modelling, and AI techniques to structured, unstructured, and diverse “big data” sources of machine-acquired data to generate actionable insights and foresights for real-life business problem solutions and for product feature development and enhancement. 
Responsibilities: Integrates state-of-the-art machine learning algorithms as well as the development of new methods Develops tools to support analysis and visualization of large datasets Develops, codes software programs, implements industry standard auto ML models (Speech, Computer vision, Text Data, LLM), Statistical models, relevant ML models (devices/machine acquired data), AI models and algorithms Identifies meaningful foresights based on predictive ML models from large data and metadata sources; interprets and communicates foresights, insights and findings from experiments to product managers, service managers, business partners and business managers Makes use of Rapid Development Tools (Business Intelligence Tools, Graphics Libraries, Data modelling tools) to effectively communicate research findings using visual graphics, Data Models, machine learning model features, feature engineering / transformations to relevant stakeholders Analyze, review and track trends and tools in Data Science, Machine Learning, Artificial Intelligence and IoT space Interacts with Cross-Functional teams to identify questions and issues for data engineering, machine learning models feature engineering Evaluates and makes recommendations to evolve data collection mechanism for Data capture to improve efficacy of machine learning models prediction Meets with customers, partners, product managers and business leaders to present findings, predictions, foresights; Gather customer specific requirements of business problems/processes; Identify data collection constraints and alternatives for implementation of models Working knowledge of MLOps, LLMs and Agentic AI/Workflows Programming Skills: Proficiency in Python and experience with ML frameworks like TensorFlow, PyTorch LLM Expertise: Hands-on experience in training, fine-tuning, and deploying LLMs Foundational Model Knowledge: Strong understanding of open-weight LLM architectures, including training methodologies, fine-tuning 
techniques, hyperparameter optimization, and model distillation. Data Pipeline Development: Strong understanding of data engineering concepts, feature engineering, and workflow automation using Airflow or Kubeflow. Cloud & MLOps: Experience deploying ML models in cloud environments like AWS, GCP (Google Vertex AI), or Azure using Docker and Kubernetes. Designs and implements predictive and optimisation models incorporating diverse data types; strong SQL and Azure Data Factory (ADF) skills. Qualifications: Bachelor's degree, Master's, or PhD in statistics, mathematics, computer science, or a related discipline preferred 0-2 years of experience in statistical modeling and algorithms Machine learning experience including deep learning, neural networks, genetic algorithms, etc. Working knowledge of big data – Hadoop, Cassandra, Spark, R; hands-on experience preferred Data mining Data visualization and visualization analysis tools, including R Work/project experience in sensors, IoT, or the mobile industry highly preferred Excellent written and verbal communication Comfortable presenting to senior management and CxO-level executives Self-motivated and self-starting with a high degree of work ethic Position Specific Information Travel Requirements (as a % of time): <10% Able to telework? Yes, 70%; expected to visit a Zebra site 2-3 days a week or every other week Personal Protective Equipment (PPE) Required (safety glasses, steel-toed boots, gloves, etc.): No U.S. Only – Frequency Definitions for Physical Activities, Environmental Conditions and Physical Demands: Never – 0% Occasionally – 0-20 times per shift or up to 33% of the time Frequently – 20-100 times per shift or 33-66% of the time Constantly – Over 100 times per shift or 66-100% of the time Physical Activities (all U.S. 
only jobs), frequency codes (N) Never, (O) Occasionally, (F) Frequently, (C) Constantly: Ascending or descending ladders, stairs, scaffolding, ramps, poles and the like: N. Working from heights such as roofs, ladders, or powered lifts: N. Moving self in different positions to accomplish tasks in various environments including awkward or tight and confined spaces: N. Remaining in a stationary position, often standing or sitting for prolonged periods: N. Stooping, kneeling, bending, crouching, reaching, pushing/pulling: N. Moving about to accomplish tasks or moving from one worksite to another: N. Adjusting or moving objects up to __ pounds in all directions: N. Communicating with others to exchange information: F. Repeating motions that may include the wrists, hands and/or fingers: F (typing). Operating machinery and/or power tools: N. Operating motor vehicles, industrial vehicles, or heavy equipment: N. Assessing the accuracy, neatness and thoroughness of the work assigned: F. Environmental Conditions (U.S. only): Exposure to extreme temperatures (high or low): N. Outdoor elements such as precipitation and wind: N. Noisy environments: N. Other hazardous conditions such as vibration, uneven ground surfaces, or dust & fumes: N. Small and/or enclosed spaces: N. No adverse environmental conditions expected: N. Physical Demands (U.S. only, check only one): Sedentary work that primarily involves sitting/standing: X. Light work that includes moving objects up to 20 pounds. Medium work that includes moving objects up to 50 pounds. Heavy work that includes moving objects up to 100 pounds or more (team lift). Must be able to see color. 
To protect candidates from falling victim to online fraudulent activity involving fake job postings and employment offers, please be aware our recruiters will always connect with you via @zebra.com email accounts. Applications are only accepted through our applicant tracking system, and we only accept personal identifying information through that system. Our Talent Acquisition team will not ask you to provide personal identifying information via e-mail or outside of the system. If you are a victim of identity theft, contact your local police department.
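The statistical modelling duties in the Zebra posting above can be illustrated with the simplest possible model: an ordinary least-squares line fit, written here in pure Python for clarity (a production pipeline would use scikit-learn, TensorFlow, or PyTorch instead):

```python
# Minimal statistical-modelling sketch: ordinary least squares for y = a*x + b.
def fit_line(xs, ys):
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    a = cov / var                           # slope = cov(x, y) / var(x)
    return a, mean_y - a * mean_x           # slope, intercept

xs, ys = [0, 1, 2, 3], [1, 3, 5, 7]         # data lying exactly on y = 2x + 1
a, b = fit_line(xs, ys)
print(a, b)  # 2.0 1.0
```

Feature engineering, regularisation, and cross-validation layer on top of exactly this kind of fit; the closed-form slope/intercept here is what `sklearn.linear_model.LinearRegression` computes for one feature.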

Posted 3 weeks ago

Apply

0 years

0 Lacs

North 24 Parganas, West Bengal, India

On-site

Company Description Nevaeh Technology is an IT and ITES business consulting, technology, and services company, promoted by senior IT and management professionals. The company emphasizes core values such as teamwork, employee empowerment, and integrity without compromise. Nevaeh delivers outstanding services to corporate houses, SMEs, and the government sector, specializing in areas like IT infrastructure services, and is ISO 9001:2008 certified. Role Description This is a full-time, on-site role for an ADF Scanner Technician. The ADF Scanner Technician will be responsible for setting up and maintaining Automatic Document Feeder (ADF) scanners, troubleshooting hardware and software issues, ensuring optimal scanner performance, managing electronic document workflows, and providing technical support to users. Daily tasks include performing regular maintenance checks, calibrating scanners, and updating software as necessary. Qualifications Experience with ADF scanner operations and maintenance Technical skills in troubleshooting hardware and software issues Excellent problem-solving skills and attention to detail Strong communication skills for providing technical support Experience in IT infrastructure services is a plus Ability to work independently and as part of a team

Posted 3 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

Gurgaon, Haryana, India

On-site

Job Title: Snowflake Developer Location: Gurugram Experience: 3 to 7 years Skillset: Snowflake, Azure, ADF (Azure Data Factory) Job Type: Full-Time Overview We are looking for a talented Snowflake Developer with expertise in Snowflake, Azure, and Azure Data Factory (ADF) to join our dynamic team. In this role, you will be responsible for developing, implementing, and optimizing data pipelines and ETL processes. You will work on cloud-based data platforms to ensure the effective and seamless integration of data across systems. The ideal candidate will have a solid background in working with Snowflake and cloud data services, and be ready to travel to client locations as required. Key Responsibilities Design, develop, and implement data solutions using Snowflake and Azure technologies. Develop and manage ETL pipelines using Azure Data Factory (ADF) for seamless data movement and transformation. Collaborate with cross-functional teams to ensure that the data platform meets business needs and aligns with data architecture best practices. Monitor, optimize, and troubleshoot data pipelines and workflows in Snowflake and Azure environments. Implement data governance and security practices in line with industry standards. Perform data validation and ensure data integrity across systems and platforms. Ensure data integration and management processes are optimized for performance, scalability, and reliability. Provide technical support and guidance to junior developers and team members. Collaborate with the client to understand project requirements and ensure deliverables are met on time. Be open to traveling to client locations as needed for project delivery and stakeholder engagements. Skills and Qualifications: 3 to 7 years of hands-on experience in Snowflake development and data management. Strong working knowledge of Azure (Azure Data Services, Azure Data Lake, etc.) and Azure Data Factory (ADF). 
Expertise in designing and developing ETL pipelines and data transformation processes using Snowflake and ADF. Proficiency in SQL and data modeling, with experience working with structured and semi-structured data. Knowledge of data warehousing concepts and best practices in Snowflake. Understanding of data security, privacy, and compliance requirements in cloud environments. Experience with cloud-based data solutions and integration services. Strong problem-solving and debugging skills. Ability to work effectively with both technical and non-technical teams. Good communication skills to collaborate with clients and team members. Bachelor's degree in Computer Science, Information Technology, or a related field. Preferred Skills: Experience with other Azure services like Azure SQL Database, Azure Synapse Analytics, and Power BI. Familiarity with data governance tools and data pipeline orchestration best practices. Ability to optimize Snowflake queries and database performance. Why Join Us: Work with cutting-edge cloud technologies like Snowflake and Azure. Exposure to complex, large-scale data projects across industries. Collaborative work environment that promotes innovation and learning. Competitive salary and benefits package. Opportunities for career growth and development.
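The data-validation duty described above often reduces to reconciling record counts between the source system and Snowflake. Below is a minimal, hedged sketch in plain Python; the table names and counts are illustrative, and in practice both sides would be populated from SQL queries:

```python
# Hypothetical reconciliation step for a Snowflake/ADF pipeline.
# In a real pipeline, source_counts and target_counts would come from
# COUNT(*) queries against the source system and Snowflake respectively.

def reconcile_counts(source_counts, target_counts, tolerance=0):
    """Compare per-table row counts; return only the tables that drifted."""
    mismatches = {}
    for table, src in source_counts.items():
        tgt = target_counts.get(table, 0)
        if abs(src - tgt) > tolerance:
            mismatches[table] = {"source": src, "target": tgt}
    return mismatches
```

A validation job would run this after each load and raise an incident (or fail the pipeline run) whenever the returned dict is non-empty.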

Posted 3 weeks ago

Apply

5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job Description As a member of the Support organization, your focus is to deliver post-sales support and solutions to the Oracle customer base while serving as an advocate for customer needs. This involves resolving post-sales non-technical customer inquiries via phone and electronic means, as well as technical questions regarding the use of and troubleshooting for our Electronic Support Services. A primary point of contact for customers, you are responsible for facilitating customer relationships with Support and providing advice and assistance to internal Oracle employees on diverse customer situations and escalated issues. Career Level - IC4 Responsibilities Education & Experience: BE, BTech, MCA, CA or equivalent preferred. Other qualifications with adequate experience may be considered. 5+ years relevant working experience. Functional/Technical Knowledge & Skills: Must have a good understanding of the following Oracle Cloud Financials version 12+ capabilities: We are looking for a techno-functional person who has real-time hands-on functional/product and/or technical experience; and/or worked with L2 or L3 level support; and/or has equivalent knowledge. We expect the candidate to have: Strong business process knowledge and concepts. Implementation/Support experience in any of the areas - ERP - Cloud Financial Modules like GL, AP, AR, FA, IBY, PA, CST, ZX and PSA; or HCM - Core HR, Benefits, Absence, T&L, Payroll, Compensation, Talent Management; or SCM - Inventory, OM, Procurement. The candidate must have hands-on experience in a minimum of any 5 modules from the above pillars. Ability to relate the product functionality to business processes, and thus offer implementation advice to customers on how to meet their various business scenarios using Oracle Cloud Financials. Technically strong with expert skills in SQL, PLSQL, OTBI/BIP/FRS reports, FBDI, ADFDI, BPM workflows, ADF Faces, BI Extract for FTP, Payment Integration and Personalisation. 
Ability to relate the product functionality to business processes, and thus offer implementation advice to customers on how to meet their various business scenarios using Oracle Cloud. Strong problem-solving skills. Strong customer interaction and service orientation so you can understand customers' critical situations, provide the appropriate response, and mobilise organisational resources, while setting realistic expectations for customers. Strong operations management and innovation orientation so you can continually improve the processes, methods, tools, and utilities. Strong team player so you leverage each other's strengths. You will be engaged in collaboration with peers within/across the teams often. Strong learning orientation so you keep abreast of emerging business models/processes, applications product solutions, product features, and technology features, and use this learning to deliver value to customers on a daily basis. High flexibility so you remain agile in a fast-changing business and organisational environment. Create and maintain appropriate documentation for architecture, design, technical, implementation, support and test activities. Personal Attributes: Self-driven and result-oriented. Strong problem-solving/analytical skills. Strong customer support and relationship skills. Effective communication (verbal and written). Focus on relationships (internal and external). Strong willingness to learn new things and share them with others. Influencing/negotiating. Team player. Customer-focused. Confident and decisive. Values expertise (maintaining professional expertise in own discipline). Enthusiasm. Flexibility. Organizational skills. Values and enjoys coaching/knowledge transfer. Values and enjoys teaching technical courses. Note: Shift working is mandatory. The candidate should be open to working evening and night shifts on a rotation basis. 
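The FBDI work named in the skills above is, in practice, about packaging interface rows as CSV inside a ZIP for upload to Oracle Cloud. A hedged sketch of that packaging step follows; the file name and columns are illustrative placeholders, not a real Oracle interface layout:

```python
# Illustrative FBDI-style packaging: write interface rows as CSV and zip
# them for upload. "GlInterface.csv" and the columns are placeholders;
# real FBDI loads must follow Oracle's published template for each object.
import csv
import io
import zipfile


def build_fbdi_zip(rows, csv_name="GlInterface.csv"):
    """Return ZIP bytes containing one CSV of interface rows."""
    csv_buf = io.StringIO()
    csv.writer(csv_buf).writerows(rows)
    zip_buf = io.BytesIO()
    with zipfile.ZipFile(zip_buf, "w", zipfile.ZIP_DEFLATED) as zf:
        zf.writestr(csv_name, csv_buf.getvalue())
    return zip_buf.getvalue()
```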
Career Level - IC3/IC4/IC5 Qualifications Career Level - IC4 About Us As a world leader in cloud solutions, Oracle uses tomorrow’s technology to tackle today’s challenges. We’ve partnered with industry-leaders in almost every sector—and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That’s why we’re committed to growing an inclusive workforce that promotes opportunities for all. Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs. We’re committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States. Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans’ status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.

Posted 3 weeks ago

Apply

5.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Job Description As a member of the Support organization, your focus is to deliver post-sales support and solutions to the Oracle customer base while serving as an advocate for customer needs. This involves resolving post-sales non-technical customer inquiries via phone and electronic means, as well as technical questions regarding the use of and troubleshooting for our Electronic Support Services. A primary point of contact for customers, you are responsible for facilitating customer relationships with Support and providing advice and assistance to internal Oracle employees on diverse customer situations and escalated issues. Career Level - IC4 Responsibilities Education & Experience: BE, BTech, MCA, CA or equivalent preferred. Other qualifications with adequate experience may be considered. 5+ years relevant working experience. Functional/Technical Knowledge & Skills: Must have a good understanding of the following Oracle Cloud Financials version 12+ capabilities: We are looking for a techno-functional person who has real-time hands-on functional/product and/or technical experience; and/or worked with L2 or L3 level support; and/or has equivalent knowledge. We expect the candidate to have: Strong business process knowledge and concepts. Implementation/Support experience in any of the areas - ERP - Cloud Financial Modules like GL, AP, AR, FA, IBY, PA, CST, ZX and PSA; or HCM - Core HR, Benefits, Absence, T&L, Payroll, Compensation, Talent Management; or SCM - Inventory, OM, Procurement. The candidate must have hands-on experience in a minimum of any 5 modules from the above pillars. Ability to relate the product functionality to business processes, and thus offer implementation advice to customers on how to meet their various business scenarios using Oracle Cloud Financials. Technically strong with expert skills in SQL, PLSQL, OTBI/BIP/FRS reports, FBDI, ADFDI, BPM workflows, ADF Faces, BI Extract for FTP, Payment Integration and Personalisation. 
Ability to relate the product functionality to business processes, and thus offer implementation advice to customers on how to meet their various business scenarios using Oracle Cloud. Strong problem-solving skills. Strong customer interaction and service orientation so you can understand customers' critical situations, provide the appropriate response, and mobilise organisational resources, while setting realistic expectations for customers. Strong operations management and innovation orientation so you can continually improve the processes, methods, tools, and utilities. Strong team player so you leverage each other's strengths. You will be engaged in collaboration with peers within/across the teams often. Strong learning orientation so you keep abreast of emerging business models/processes, applications product solutions, product features, and technology features, and use this learning to deliver value to customers on a daily basis. High flexibility so you remain agile in a fast-changing business and organisational environment. Create and maintain appropriate documentation for architecture, design, technical, implementation, support and test activities. Personal Attributes: Self-driven and result-oriented. Strong problem-solving/analytical skills. Strong customer support and relationship skills. Effective communication (verbal and written). Focus on relationships (internal and external). Strong willingness to learn new things and share them with others. Influencing/negotiating. Team player. Customer-focused. Confident and decisive. Values expertise (maintaining professional expertise in own discipline). Enthusiasm. Flexibility. Organizational skills. Values and enjoys coaching/knowledge transfer. Values and enjoys teaching technical courses. Note: Shift working is mandatory. The candidate should be open to working evening and night shifts on a rotation basis. 
Career Level - IC3/IC4/IC5 Qualifications Career Level - IC4 About Us As a world leader in cloud solutions, Oracle uses tomorrow’s technology to tackle today’s challenges. We’ve partnered with industry-leaders in almost every sector—and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That’s why we’re committed to growing an inclusive workforce that promotes opportunities for all. Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs. We’re committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States. Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans’ status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.

Posted 3 weeks ago

Apply

4.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Line of Service Advisory Industry/Sector Not Applicable Specialism Data, Analytics & AI Management Level Senior Associate Job Description & Summary At PwC, our people in data and analytics focus on leveraging data to drive insights and make informed business decisions. They utilise advanced analytics techniques to help clients optimise their operations and achieve their strategic goals. In business intelligence at PwC, you will focus on leveraging data and analytics to provide strategic insights and drive informed decision-making for clients. You will develop and implement innovative solutions to optimise business performance and enhance competitive advantage. Why PWC At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us. At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations. Job Description & Summary: A career within…. 
Responsibilities: Job Accountabilities - Hands-on experience in Azure data components like ADF / Databricks / Azure SQL - Good programming logic sense in SQL - Good PySpark knowledge for Azure Databricks - Data Lake and Data Warehouse concept understanding - Unit and integration testing understanding - Good communication skills to express thoughts and interact with business users - Understanding of Data Security and Data Compliance - Agile model understanding - Project documentation understanding - Certification (Good to have) - Domain Knowledge Mandatory skill sets: Azure DE, ADB, ADF, ADL Preferred skill sets: Azure DE, ADB, ADF, ADL Years of experience required: 4 to 7 years Education qualification: Graduate Engineer or Management Graduate Education (if blank, degree and/or field of study not specified) Degrees/Field of Study required: Bachelor of Engineering, Bachelor in Business Administration Degrees/Field of Study preferred: Certifications (if blank, certifications not specified) Required Skills Microsoft Azure Optional Skills Accepting Feedback, Active Listening, Analytical Thinking, Business Case Development, Business Data Analytics, Business Intelligence and Reporting Tools (BIRT), Business Intelligence Development Studio, Communication, Competitive Advantage, Continuous Process Improvement, Creativity, Data Analysis and Interpretation, Data Architecture, Database Management System (DBMS), Data Collection, Data Pipeline, Data Quality, Data Science, Data Visualization, Embracing Change, Emotional Regulation, Empathy, Inclusion, Industry Trend Analysis {+ 16 more} Desired Languages (If blank, desired languages not specified) Travel Requirements Available for Work Visa Sponsorship? Government Clearance Required? Job Posting End Date
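The accountabilities above pair PySpark work with unit and integration testing. As a hedged, Spark-free illustration, here is the classic "keep the latest record per key" transform that would normally be a PySpark window function, written in plain Python so it can be unit-tested without a cluster; the field names are illustrative:

```python
# Toy stand-in for a PySpark dedupe-to-latest transform (in Spark this
# would be a Window partitioned by key, ordered by the version column).
# Field names "id" and "updated_at" are illustrative assumptions.

def latest_per_key(records, key="id", version="updated_at"):
    """Keep only the most recent record for each key value."""
    latest = {}
    for rec in records:
        k = rec[key]
        if k not in latest or rec[version] > latest[k][version]:
            latest[k] = rec
    return sorted(latest.values(), key=lambda r: r[key])
```

Because the logic is isolated in a pure function, a unit test only needs a handful of in-memory records, which is exactly the testing discipline the accountabilities call for.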

Posted 3 weeks ago

Apply

4.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Line of Service Advisory Industry/Sector Not Applicable Specialism Data, Analytics & AI Management Level Senior Associate Job Description & Summary At PwC, our people in data and analytics focus on leveraging data to drive insights and make informed business decisions. They utilise advanced analytics techniques to help clients optimise their operations and achieve their strategic goals. In business intelligence at PwC, you will focus on leveraging data and analytics to provide strategic insights and drive informed decision-making for clients. You will develop and implement innovative solutions to optimise business performance and enhance competitive advantage. Why PWC At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us. At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations. Job Description & Summary: A career within…. 
Responsibilities: Job Accountabilities - Hands-on experience in Azure data components like ADF / Databricks / Azure SQL - Good programming logic sense in SQL - Good PySpark knowledge for Azure Databricks - Data Lake and Data Warehouse concept understanding - Unit and integration testing understanding - Good communication skills to express thoughts and interact with business users - Understanding of Data Security and Data Compliance - Agile model understanding - Project documentation understanding - Certification (Good to have) - Domain Knowledge Mandatory skill sets: Azure DE, ADB, ADF, ADL Preferred skill sets: Azure DE, ADB, ADF, ADL Years of experience required: 4 to 7 years Education qualification: Graduate Engineer or Management Graduate Education (if blank, degree and/or field of study not specified) Degrees/Field of Study required: Bachelor of Engineering, Bachelor in Business Administration Degrees/Field of Study preferred: Certifications (if blank, certifications not specified) Required Skills Microsoft Azure Optional Skills Accepting Feedback, Active Listening, Analytical Thinking, Business Case Development, Business Data Analytics, Business Intelligence and Reporting Tools (BIRT), Business Intelligence Development Studio, Communication, Competitive Advantage, Continuous Process Improvement, Creativity, Data Analysis and Interpretation, Data Architecture, Database Management System (DBMS), Data Collection, Data Pipeline, Data Quality, Data Science, Data Visualization, Embracing Change, Emotional Regulation, Empathy, Inclusion, Industry Trend Analysis {+ 16 more} Desired Languages (If blank, desired languages not specified) Travel Requirements Available for Work Visa Sponsorship? Government Clearance Required? Job Posting End Date

Posted 3 weeks ago

Apply

7.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

WPP is the creative transformation company. We use the power of creativity to build better futures for our people, planet, clients, and communities. Working at WPP means being part of a global network of more than 100,000 talented people dedicated to doing extraordinary work for our clients. We operate in over 100 countries, with corporate headquarters in New York, London and Singapore. WPP is a world leader in marketing services, with deep AI, data and technology capabilities, global presence and unrivalled creative talent. Our clients include many of the biggest companies and advertisers in the world, including approximately 300 of the Fortune Global 500. Our people are the key to our success. We're committed to fostering a culture of creativity, belonging and continuous learning, attracting and developing the brightest talent, and providing exciting career opportunities that help our people grow. Why we're hiring: At WPP, technology is at the heart of everything we do, and it is WPP IT’s mission to enable everyone to collaborate, create and thrive. WPP IT is undergoing a significant transformation to modernise ways of working, shift to cloud and micro-service-based architectures, drive automation, digitise colleague and client experiences and deliver insight from WPP’s petabytes of data. WPP Media is the world’s leading media investment company responsible for more than $63B in annual media investment through agencies Mindshare, MediaCom, Wavemaker, Essence and m/SIX, as well as the outcomes-driven programmatic audience company, Xaxis and data and technology company Choreograph. WPP Media’s portfolio includes Data & Technology, Investment and Services, all united in a vision to shape the next era of media where advertising works better for people. 
By leveraging all the benefits of scale, the company innovates, differentiates and generates sustained value for our clients wherever they do business. The WPP Media IT team in WPP IT are the technology solutions partner for the WPP Media group of agencies and are accountable for co-ordinating and assuring end-to-end change delivery, managing the WPP Media IT technology life cycle and innovation pipeline. We are looking for a hands-on, technically strong Data Operations Lead to head our newly established Data Integration & Operations team in Chennai. This is a build-and-run role: you’ll help define how the team operates while leading day-to-day delivery. The team is part of the global Data & Measure function and is responsible for ensuring that our data products run efficiently, reliably, and consistently across platforms and markets. You will own the operational layer of our data products — including data ingestion, monitoring, deployment pipelines, automation, and support. This role requires deep technical knowledge of Azure and/or GCP, alongside the ability to lead and scale a growing team. 
What you'll be doing: Technical Ownership & Execution Lead a team responsible for data integration, ingestion, orchestration, and platform operations Build and maintain automated data pipelines using Azure Data Factory, GCP Dataflow/Composer, or equivalent tools Define and implement platform-wide monitoring, logging, and alerting Manage cloud environments, including access control, security, and deployment automation Operational Standardisation Create and roll out standard operating procedures, runbooks, onboarding guides, and automation patterns Ensure repeatable, scalable practices across all supported data products Define reusable deployment frameworks and templates for integration Platform Support & Performance Set up and manage SLAs, incident workflows, and escalation models Proactively identify and resolve operational risks in cloud-based data platforms Partner with development and product teams to ensure seamless transition from build to run Team Leadership Lead and mentor a new, growing team in Chennai Shape the team’s operating model, priorities, and capabilities Act as a subject matter expert and escalation point for technical operations What you'll need: Required Skills 7+ years in data operations, platform engineering, or data engineering Deep, hands-on experience in Azure and/or GCP environments Strong understanding of cloud-native data pipelines, architecture, and security Skilled in orchestration (e.g. ADF, Dataflow, Airflow), scripting (Python, Bash), and SQL Familiarity with DevOps practices, CI/CD, and infrastructure-as-code Proven experience managing production data platforms and support Ability to design operational frameworks from the ground up Demonstrated experience leading technical teams, including task prioritization, mentoring, and delivery oversight Preferred Skills Experience with tools like dbt, Azure Synapse, BigQuery, Databricks, etc. Exposure to BI environments (e.g. 
Power BI, Looker) Familiarity with global support models and tiered ticket handling Experience with documentation, enablement, and internal tooling Who you are: You're open: We are inclusive and collaborative; we encourage the free exchange of ideas; we respect and celebrate diverse views. We are open-minded: to new ideas, new partnerships, new ways of working. You're optimistic: We believe in the power of creativity, technology and talent to create brighter futures for our people, our clients and our communities. We approach all that we do with conviction: to try the new and to seek the unexpected. You're extraordinary: We are stronger together: through collaboration we achieve the amazing. We are creative leaders and pioneers of our industry; we provide extraordinary every day. What we'll give you: Passionate, inspired people – We aim to create a culture in which people can do extraordinary work. Scale and opportunity – We offer the opportunity to create, influence and complete projects at a scale that is unparalleled in the industry. Challenging and stimulating work – Unique work and the opportunity to join a group of creative problem solvers. Are you up for the challenge? We believe the best work happens when we're together, fostering creativity, collaboration, and connection. That's why we’ve adopted a hybrid approach, with teams in the office around four days a week. If you require accommodations or flexibility, please discuss this with the hiring team during the interview process. WPP is an equal opportunity employer and considers applicants for all positions without discrimination or regard to particular characteristics. We are committed to fostering a culture of respect in which everyone feels they belong and has the same opportunities to progress in their careers. Please read our Privacy Notice (https://www.wpp.com/en/careers/wpp-privacy-policy-for-recruitment) for more information on how we process the information you provide.
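The monitoring, alerting, and incident-workflow duties described in this role can be sketched as a small retry wrapper: run a pipeline step, back off exponentially on failure, and raise an alert once retries are exhausted. This is a minimal illustration, not WPP tooling; the alert hook is a placeholder for whatever paging or ticketing integration a team actually uses:

```python
# Minimal retry-with-backoff wrapper for a flaky pipeline step.
# `alert` is a placeholder hook (e.g. a pager/ticket integration).
import time


def run_with_retries(step, attempts=3, base_delay=1.0, alert=print):
    """Run `step`; retry with exponential backoff, alerting on final failure."""
    for attempt in range(1, attempts + 1):
        try:
            return step()
        except Exception as exc:
            if attempt == attempts:
                alert(f"step failed after {attempts} attempts: {exc}")
                raise  # hand off to the incident/escalation workflow
            time.sleep(base_delay * 2 ** (attempt - 1))
```

Keeping the retry policy in one wrapper, rather than scattered through each pipeline, is one way to get the "repeatable, scalable practices" the posting asks for.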

Posted 3 weeks ago

Apply

6.0 - 10.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

At PwC, our people in managed services focus on a variety of outsourced solutions and support clients across numerous functions. These individuals help organisations streamline their operations, reduce costs, and improve efficiency by managing key processes and functions on their behalf. They are skilled in project management, technology, and process optimization to deliver high-quality services to clients. Those in managed service management and strategy at PwC will focus on transitioning and running services, along with managing delivery teams, programmes, commercials, performance and delivery risk. Your work will involve continuous improvement and optimisation of the managed services processes, tools and services. Focused on relationships, you are building meaningful client connections, and learning how to manage and inspire others. Navigating increasingly complex situations, you are growing your personal brand, deepening technical expertise and awareness of your strengths. You are expected to anticipate the needs of your teams and clients, and to deliver quality. Embracing increased ambiguity, you are comfortable when the path forward isn’t clear, you ask questions, and you use these moments as opportunities to grow. Skills Examples of the skills, knowledge, and experiences you need to lead and deliver value at this level include but are not limited to: Respond effectively to the diverse perspectives, needs, and feelings of others. Use a broad range of tools, methodologies and techniques to generate new ideas and solve problems. Use critical thinking to break down complex concepts. Understand the broader objectives of your project or role and how your work fits into the overall strategy. Develop a deeper understanding of the business context and how it is changing. Use reflection to develop self-awareness, enhance strengths and address development areas. Interpret data to inform insights and recommendations. 
Uphold and reinforce professional and technical standards (e.g. refer to specific PwC tax and audit guidance), the Firm's code of conduct, and independence requirements. Role: Senior Associate – Data Engineer Tower: Data Analytics & Insights Managed Service Experience: 6 - 10 years Key Skills: Data Engineering Educational Qualification: Bachelor's degree in computer science/IT or relevant field Work Location: Bangalore AC Job Description As a Managed Services - Data Engineer Senior Associate, you'll work as part of a team of problem solvers, helping to solve complex business issues from strategy to execution by using Data, Analytics & Insights Skills. PwC Professional skills and responsibilities for this management level include but are not limited to: Use feedback and reflection to develop self-awareness, personal strengths, and address development areas. Proven track record as an SME in chosen domain. Mentor Junior resources within the team, conduct KSS and lessons learnt. Flexible to work in stretch opportunities/assignments. Demonstrate critical thinking and the ability to bring order to unstructured problems. Ticket Quality and deliverables review. Status Reporting for the project. Adherence to SLAs, experience in incident management, change management and problem management. Review your work and that of others for quality, accuracy, and relevance. Know how and when to use tools available for a given situation and can explain the reasons for this choice. Seek and embrace opportunities which give exposure to different situations, environments, and perspectives. Use straightforward communication, in a structured way, when influencing and connecting with others. Able to read situations and modify behavior to build quality relationships. Uphold the firm's code of ethics and business conduct. Demonstrate leadership capabilities by working with clients directly and leading the engagement. 
Work in a team environment that includes client interactions, workstream management, and cross-team collaboration. Good team player. Take up cross-competency work and contribute to CoE activities. Escalation/risk management.

Position Requirements
Required Skills:
Primary Skills: ETL/ELT, SQL, SSIS, SSMS, Informatica, Python
Secondary Skills: Azure/AWS/GCP (any one preferable), Power BI, Advanced Excel, Excel macros

Data Ingestion Senior Associate
Should have extensive experience in developing scalable, repeatable, and secure data structures and pipelines to ingest, collect, standardize, store, and integrate data for downstream consumers such as Business Intelligence systems, analytics models, and data science teams.
Designing and implementing data pipelines to extract, transform, and load (ETL) data from various sources into data storage systems, such as data warehouses or data lakes.
Should have experience in building efficient ETL/ELT processes using industry-leading tools such as Informatica, SSIS, SSMS, AWS, Azure, ADF, GCP, Snowflake, Spark, SQL, Python, etc.
Should have hands-on experience with data analytics tools such as Informatica, Hadoop, Spark, etc.
Monitoring and troubleshooting data pipelines and resolving issues related to data processing, transformation, or storage.
Implementing and maintaining data security and privacy measures, including access controls and encryption, to protect sensitive data.
Should have experience in creating visually impactful dashboards in Tableau for data reporting.
Extract, interpret and analyze data to identify key metrics and transform raw data into meaningful, actionable information.
Good understanding of formulas/DAX, measures, establishing hierarchies, data refresh, row/column/report-level security, report governance, complex visualizations, level-of-detail (LOD) expressions, etc.
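The extract-transform-load responsibilities described above can be sketched in a few lines. The following is a minimal, hypothetical example using only Python's standard library (the feed, table, and column names are invented for illustration; a real pipeline would use tools like ADF, SSIS, or Spark as listed):

```python
import csv
import io
import sqlite3

# Hypothetical raw feed; in practice this would come from a source system.
RAW_CSV = """order_id,amount,currency
1001,250.00,USD
1002,,USD
1003,99.50,EUR
"""

def extract(raw: str) -> list:
    """Extract: parse the raw CSV into records."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows: list) -> list:
    """Transform: drop rows with missing amounts and cast types."""
    return [
        (int(r["order_id"]), float(r["amount"]), r["currency"])
        for r in rows
        if r["amount"]  # skip records that fail the completeness check
    ]

def load(rows: list, conn: sqlite3.Connection) -> None:
    """Load: write the cleaned records into a warehouse-style table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders "
        "(order_id INTEGER, amount REAL, currency TEXT)"
    )
    conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", rows)

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW_CSV)), conn)
count = conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0]
print(count)  # the record with the missing amount is filtered out
```

The same extract/transform/load split scales up directly: in a Spark or ADF pipeline each stage becomes a dataset transformation or activity rather than a function.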
Ability to create and replicate functionalities such as parameters (for top fields, sheet switching), interactive buttons to switch between dashboards, burger menus, etc.
Participate in requirement gathering with the business and evaluate the data against the requirements.
Coordinate and manage data analytics and reporting activities with stakeholders.
Expertise in writing and analyzing complex SQL queries.
Excellent problem-solving, design, debugging, and testing skills; competency in Excel (macros, pivot tables, etc.).
Good to have a minimum of 5 years' hands-on experience delivering managed data and analytics programs (managed services and managed assets).
Should have strong communication, problem-solving, quantitative and analytical abilities.
Effectively communicate with project team members and sponsors throughout the project lifecycle (status updates, gaps/risks, roadblocks, testing outcomes).

Nice to Have
Certification in any cloud platform.
Experience in data ingestion using any of the industry tools, such as Informatica, Talend, DataStage, etc.

Managed Services - Data, Analytics & Insights
At PwC we relentlessly focus on working with our clients to bring the power of technology and humans together and create simple yet powerful solutions. We imagine a day when our clients can simply focus on their business, knowing that they have a trusted partner for their IT needs. Every day we are motivated and passionate about making our clients' businesses better. Within our Managed Services platform, PwC delivers integrated services and solutions that are grounded in deep industry experience and powered by the talent that you would expect from the PwC brand. The PwC Managed Services platform delivers scalable solutions that add greater value to our clients' enterprises through technology and human-enabled experiences.
Our team of highly skilled and trained global professionals, combined with the use of the latest advancements in technology and process, allows us to provide effective and efficient outcomes. With PwC's Managed Services our clients are able to focus on accelerating their priorities, including optimizing operations and accelerating outcomes. PwC brings a consultative-first approach to operations, leveraging our deep industry insights combined with world-class talent and assets to enable transformational journeys that drive sustained client outcomes. Our clients need flexible access to world-class business and technology capabilities that keep pace with today's dynamic business environment.

Within our global Managed Services platform, we provide a Data, Analytics & Insights Managed Service focused on the evolution of our clients' data, analytics, insights and cloud portfolios. Our focus is to empower our clients to navigate and capture the value of their application portfolio while cost-effectively operating and protecting their solutions. We do this so that our clients can focus on what matters most to their business: accelerating growth that is dynamic, efficient and cost-effective.

As a member of our Data, Analytics & Insights Managed Service team, we are looking for candidates who thrive in a high-paced work environment and are capable of working on a mix of critical Application Evolution Service offerings and engagements, including help desk support, enhancement and optimization work, as well as strategic roadmap and advisory-level work. It will also be key to lend experience and effort in helping win and support customer engagements, not only from a technical perspective but also from a relationship perspective.

Posted 3 weeks ago

Apply

3.0 - 9.0 years

0 Lacs

Andhra Pradesh, India

On-site

At PwC, our people in business application consulting specialise in consulting services for a variety of business applications, helping clients optimise operational efficiency. These individuals analyse client needs, implement software solutions, and provide training and support for seamless integration and utilisation of business applications, enabling clients to achieve their strategic objectives.

As a Guidewire developer at PwC, you will specialise in developing and customising applications using the Guidewire platform. Guidewire is a software suite that provides insurance companies with tools for policy administration, claims management, and billing. You will be responsible for designing, coding, and testing software solutions that meet the specific needs of insurance organisations.

Driven by curiosity, you are a reliable, contributing member of a team. In our fast-paced environment, you are expected to adapt to working with a variety of clients and team members, each presenting varying challenges and scope. Every experience is an opportunity to learn and grow. You are expected to take ownership and consistently deliver quality work that drives value for our clients and success as a team. As you navigate through the Firm, you build a brand for yourself, opening doors to more opportunities.

Skills
Examples of the skills, knowledge, and experiences you need to lead and deliver value at this level include but are not limited to:
Apply a learning mindset and take ownership for your own development.
Appreciate diverse perspectives, needs, and feelings of others.
Adopt habits to sustain high performance and develop your potential.
Actively listen, ask questions to check understanding, and clearly express ideas.
Seek, reflect, act on, and give feedback.
Gather information from a range of sources to analyse facts and discern patterns.
Commit to understanding how the business works and building commercial awareness.
Learn and apply professional and technical standards (e.g. refer to specific PwC tax and audit guidance), uphold the Firm's code of conduct and independence requirements.

Total Experience: 3 to 9 years
Education Qualification: BTech/BE/MTech/MS/MCA

Preferred Skill Set / Roles and Responsibilities
Hands-on experience in Azure Databricks and ADF; Guidewire.
Works with the business to identify detailed analytical and operational reporting/extract requirements.
Experience in Python is a must-have.
Able to create complex Microsoft SQL / ETL / SSIS queries.
Participates in sprint development, test, and integration activities.
Creates detailed source-to-target mappings.
Creates and validates data dictionaries.
Writes and validates data translation and migration scripts.
Communicates with the business to gather business requirements.
Performs gap analysis between existing (legacy) and new (GW) data-related solutions.
Works with Informatica ETL developers.
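The source-to-target mapping and data translation work mentioned above might look like the following minimal sketch. The field names and translation rules here are hypothetical and are not taken from any actual Guidewire schema:

```python
# A source-to-target mapping: each target field names its legacy source
# column and an optional translation applied during migration.
MAPPING = {
    # target_field: (source_field, translate_fn or None)
    "policy_number": ("POL_NO", None),
    "status": ("POL_STATUS",
               lambda v: {"A": "Active", "C": "Cancelled"}.get(v, "Unknown")),
    "premium": ("PREM_AMT", lambda v: round(float(v), 2)),
}

def migrate_row(legacy_row: dict) -> dict:
    """Translate one legacy record into the target schema."""
    target = {}
    for tgt_field, (src_field, translate) in MAPPING.items():
        value = legacy_row[src_field]
        target[tgt_field] = translate(value) if translate else value
    return target

# A single legacy record, also invented for the example.
legacy = {"POL_NO": "P-1001", "POL_STATUS": "A", "PREM_AMT": "1250.00"}
print(migrate_row(legacy))
# {'policy_number': 'P-1001', 'status': 'Active', 'premium': 1250.0}
```

Keeping the mapping as data rather than code makes it easy to generate the mapping document, the migration script, and the validation checks from a single source of truth.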
