
24278 ETL Jobs - Page 23

Set up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

2.0 years

0 Lacs

Gandhinagar, Gujarat, India

On-site

Job Title: ETL Testing
Location: Chennai & Pune
Candidate Specification: Must be open to rotational shifts (including night shifts)
Job Description: Must have a minimum of 2+ years of relevant experience in ETL Testing. Should have good exposure to Python programming, ETL/DB testing, etc. Good to have: automation testing with Java, Selenium, BDD, Cucumber.
Role: ETL Testing | Industry Type: IT Services & Consulting | Functional Area: IT-Software | Required Education: Bachelor's Degree | Employment Type: Full Time, Permanent
Key Skills: ETL, SQL, ETL Testing, ETL Testing Automation, AWS, Big Data, Spark, Python
Other Information: Job Code: GO/JC/410/2025 | Recruiter Name: Deepikad
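For illustration only, beyond the posting itself: a minimal sketch of the kind of ETL/DB check this role describes, using hypothetical staging and warehouse tables in a local SQLite database.

```python
# Minimal ETL-testing sketch (illustrative; table and column names are hypothetical).
# Verifies that a load preserved row counts and a column aggregate.
import sqlite3

def reconcile(conn, source_table: str, target_table: str, amount_col: str) -> None:
    cur = conn.cursor()
    src_count = cur.execute(f"SELECT COUNT(*) FROM {source_table}").fetchone()[0]
    tgt_count = cur.execute(f"SELECT COUNT(*) FROM {target_table}").fetchone()[0]
    assert src_count == tgt_count, f"Row-count mismatch: {src_count} vs {tgt_count}"

    src_sum = cur.execute(f"SELECT SUM({amount_col}) FROM {source_table}").fetchone()[0]
    tgt_sum = cur.execute(f"SELECT SUM({amount_col}) FROM {target_table}").fetchone()[0]
    assert src_sum == tgt_sum, f"Aggregate mismatch: {src_sum} vs {tgt_sum}"

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE staging_orders (id INTEGER, amount REAL);
        CREATE TABLE dw_orders      (id INTEGER, amount REAL);
        INSERT INTO staging_orders VALUES (1, 10.0), (2, 25.5);
        INSERT INTO dw_orders      VALUES (1, 10.0), (2, 25.5);
    """)
    reconcile(conn, "staging_orders", "dw_orders", "amount")
    print("Reconciliation checks passed.")
```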

Posted 4 days ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.
Fueled by strategic investment in technology and innovation, Client Technology seeks to drive growth opportunities and solve complex business problems for our clients by building a robust business platform and a powerful product engine that are integral to innovation at scale. You will work with technologists and business specialists, blending EY’s deep industry knowledge and innovative ideas with our platforms, capabilities, and technical expertise. As a champion for change and growth, you will be at the forefront of integrating emerging technologies, from AI to Data Analytics, into every corner of what we do at EY. That means more growth for you, exciting learning opportunities, career choices, and the chance to make a real impact.
The opportunity: We are looking for a highly experienced Power BI developer to join the Data Engineering team within EY Client Technology’s Advanced Analytics Team. The candidate will be responsible for designing, developing and maintaining Power BI data models, reports and dashboards. If you are passionate about business intelligence and analytics and have a knack for turning complex data into actionable insights, we want to hear from you.
To qualify for the role, you must have: Strong proficiency in Power BI, including DAX and the Power Query formula language (M-language). Advanced understanding of data modeling, data warehousing and ETL techniques. Designed, developed and maintained Power BI reports (including paginated reports) and dashboards to support business decision-making processes. Designed, developed and implemented Power BI data models for complex and large-scale enterprise environments. Proven experience with deploying and optimizing large datasets. Proficiency in SQL and other data querying languages. Strong collaboration, analytical, interpersonal and communication abilities.
Ideally, you’ll also have: Bachelor's degree in Computer Science, Information Technology, Data Science, or a related field. Microsoft Power BI certification. Experience with other BI tools. Worked within large teams to successfully implement Power BI solutions. Sound knowledge of the software development lifecycle and experience with Git. Ability to propose solutions by recalling best practices learned from Microsoft documentation, whitepapers and community publications.
What we look for: We want people who are self-starters, who can take initiative and get things done. If you can think critically and creatively to solve problems, you will excel. You should be comfortable working with culturally diverse outsourced on/offshore team members, which means you may need to work outside of the normal working hours in your time zone to partner with other Client Technology staff globally. Some travel may also be required, both domestic and international.
What we offer: As part of this role, you'll work in a highly integrated, global team with the opportunity and tools to grow, develop and drive your career forward. Here, you can combine global opportunity with flexible working. The EY benefits package goes above and beyond too, focusing on your physical, emotional, financial and social well-being. Your recruiter can talk to you about the benefits available in your country. Here’s a snapshot of what we offer: Continuous learning: You’ll develop the mindset and skills to navigate whatever comes next. Success as defined by you: We’ll provide the tools and flexibility, so you can make a meaningful impact, your way. Transformative leadership: We’ll give you the insights, coaching and confidence to be the leader the world needs. Diverse and inclusive culture: You’ll be embraced for who you are and empowered to use your voice to help others find theirs. EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 4 days ago

Apply

8.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Dear Aspirants, Greetings from ValueLabs! We are hiring for a Power BI Lead. Role: Power BI Lead. Skill Set: Power BI, SQL, ADF. Experience: 8+ years. Notice Period: Immediate. Location: Hyderabad.
Experience Required: Minimum 8 years of relevant experience in data engineering, specifically with Power BI and Azure Data Engineering.
Key Responsibilities:
1. Data Engineering: a. Develop and maintain ETL processes using Azure Data Factory. b. Design and implement data warehouses using Azure Synapse Analytics. c. Optimize data storage and retrieval strategies to ensure efficient use of resources and fast query performance. d. Implement data quality checks and validation processes to ensure accuracy and reliability of data.
2. Power BI Reporting: a. Create compelling and interactive reports and dashboards using Power BI Desktop. b. Design and implement Power BI data models that efficiently integrate with various data sources. c. Automate report delivery and scheduling using Power Automate or similar tools. d. Collaborate with business stakeholders to understand reporting needs and translate those into actionable insights.
3. Technical Leadership: a. Act as a technical authority within the team, providing guidance on data engineering principles, the Azure platform, and Power BI tools. b. Design, architect, and implement scalable data pipelines using Azure Data Factory, Azure Synapse Analytics, and other relevant technologies. c. Ensure adherence to data governance standards and regulations, such as GDPR, HIPAA, etc. d. Implement robust monitoring and alerting mechanisms to detect and resolve issues proactively.
4. Team Management: a. Oversee and manage a team of data engineers, ensuring they meet project deadlines and deliver high-quality work. b. Develop and implement team guidelines, policies, and procedures to enhance productivity and performance. c. Mentor and coach team members to improve their skills and career development. d. Conduct regular one-on-one meetings to discuss progress, address concerns, and set goals.
Preferred Skills: Experience with DevOps practices, including CI/CD pipelines and automation tools like Azure DevOps.
Qualifications: Bachelor’s degree in Computer Science, Information Technology, or a related field. Proficiency in Azure Data Engineering services, including Azure Data Factory, Azure Synapse Analytics, and Azure Databricks. Expertise in designing and implementing Power BI reports and dashboards. Strong understanding of data architecture, data modelling, and data governance principles. Experience working with large datasets and complex data transformation processes. Excellent communication and interpersonal skills, with the ability to collaborate effectively across departments. Ability to manage multiple priorities and work under tight deadlines. Professional certification as an Azure Data Engineer or equivalent, preferred but not required.

Posted 4 days ago

Apply

0 years

0 Lacs

Gurugram, Haryana, India

On-site

Performance Marketing & Innovation Lead – CRM, Platform & Analytics Location: Gurgaon, India Leads: Salesforce CRM Ops, Platform Innovation, and Performance Analytics
Role Summary This role is ideal for someone who has risen from innovation & tech, CRM, or performance marketing into strategic leadership—perhaps a Managing Director of Martech/Performance, or even a Tech Delivery GM—now ready to integrate capabilities and lead a multidisciplinary team. You’ll be responsible for unifying Salesforce Marketing Cloud execution, site innovation, and data performance reporting under a single playbook. You won’t write the code—but you’ll know how to manage those who do, and how to translate between business goals, tech constraints, and customer outcomes.
Core Responsibilities
CRM & Salesforce Marketing Cloud Operations Oversee Salesforce Marketing Cloud (SFMC) implementation and journey management across 8+ markets. Translate campaign briefs into multi-market CRM journeys across email, WhatsApp, SMS, and push. Guide CRM delivery pods: campaign managers, builders, QA, and reporting analysts. Ensure CRM journeys meet business objectives (e.g., retention uplift, test drive bookings). Troubleshoot delivery issues, QA errors, and stakeholder escalations with speed and accuracy.
Platform Innovation & Digital Experience Co-lead platform and CMS innovation in partnership with the Digital Experimentation Lead. Manage a backlog of site experiments (e.g., UX tweaks, funnel enhancements, A/B tests). Translate innovation lab concepts into scalable platform initiatives. Champion cross-functional input from media, analytics, product, and creative teams.
Performance Analytics & Integration Lead performance integration across CRM and paid media. Collaborate with the Marketing Science Unit (MSU) to define unified KPIs (e.g., site-to-lead, lead-to-booking, CAC). Ensure reporting pipelines across Salesforce, Adobe Analytics, and Power BI are delivering actionable insights. Build feedback loops that turn performance dashboards into optimization actions.
Team Leadership & Governance Manage a 10–15 person cross-functional team across CRM ops, platform specialists, and analysts. Serve as the senior delivery voice in regional performance reviews and executive check-ins. Establish clear delivery operating rhythms (standups, retrospectives, quarterly planning). Partner with WPP HR to shape talent ladders, retention strategies, and team engagement.
What You’ll Get Enterprise-wide ownership of performance delivery across Salesforce, platform, and analytics. Executive exposure with global and regional stakeholders. A central role in one of WPP’s most visible digital transformation engagements. The chance to shape performance operations from the ground up, with global scaling potential. An ecosystem of collaboration across WPP agencies (VML, WPP Media, Hogarth, etc.).
Who Would Thrive in This Role A seasoned CRM or digital performance lead who has grown into strategic ownership. Someone who has led Salesforce or martech project delivery in a regional or global setting. A team builder who thrives on scaling talent and motivating high-performance squads. A translator of complexity—able to bridge product, marketing, analytics, and tech. Someone data-fluent but not data-obsessed—who uses insights to move business outcomes. A structured thinker who’s comfortable with agile, experimentation, and change management.
Tools & Systems You'll Interface With Salesforce Marketing Cloud (SFMC) & Salesforce CRM Core Adobe Analytics / Adobe Target / Customer Journey Analytics Content Management Systems (CMS): Adobe Experience Manager Power BI / Tableau / Google Data Studio Snowflake / ETL Pipelines A/B Testing / Experimentation Toolkits JIRA / Confluence / Slack / MS Teams Career Progression Regional VP – Performance & Martech Integration Head of Innovation & Marketing Systems Transformation Who You Might Be Today A CRM Director or Digital Performance Lead at a multinational brand or digital agency. A Managing Director or GM of a martech/CRM services unit. A Client Partner with cross-stream delivery experience and platform oversight. A Head of Martech/Analytics now looking to lead an integrated performance team. Or a former CRM Manager turned leader who thrives in structured delivery and cross-discipline integration.

Posted 4 days ago

Apply

0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Skills: Database programming (SQL/PL-SQL/T-SQL); ETL (data pipelines, data preparation); Analytics (BI tools). Roles & Responsibilities: • Implement some of the world's largest big data analytics projects using the Kyvos platform • Prepare data for BI modeling using Spark, Hive, SQL and other ETL/ELT and OLAP data modelling tools • Tune models for the fastest, sub-second query performance from business intelligence tools • Communicate with customer stakeholders for busin…

Posted 4 days ago

Apply

3.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Project Role : Application Developer Project Role Description : Design, build and configure applications to meet business process and application requirements. Must have skills : Data Warehouse ETL Testing Good to have skills : NA Minimum 3 Year(s) Of Experience Is Required Educational Qualification : 15 years full time education Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. Your typical day will involve collaborating with team members to develop innovative solutions and ensure seamless application functionality. Roles & Responsibilities: - Expected to perform independently and become an SME. - Required active participation/contribution in team discussions. - Contribute to providing solutions to work-related problems. - Develop and implement ETL test cases to ensure data accuracy. - Conduct data validation and reconciliation processes. - Collaborate with cross-functional teams to troubleshoot and resolve data issues. - Create and maintain test documentation for ETL processes. - Identify opportunities for process improvement and optimization. Professional & Technical Skills: - Must-Have Skills: Proficiency in Data Warehouse ETL Testing. - Strong understanding of SQL and database concepts. - Experience with ETL tools such as Informatica or Talend. - Knowledge of data warehousing concepts and methodologies. - Hands-on experience in data quality assurance and testing. Additional Information: - The candidate should have a minimum of 3 years of experience in Data Warehouse ETL Testing. - This position is based at our Gurugram office. - 15 years of full-time education is required.
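As an illustration of the data validation and reconciliation work listed above (not part of the posting): a minimal pandas sketch comparing a source extract with the warehouse load on a key; the file-free frames and column names are hypothetical.

```python
# Illustrative row-level reconciliation sketch (hypothetical data and columns).
import pandas as pd

source = pd.DataFrame({"customer_id": [1, 2, 3], "balance": [100.0, 250.0, 80.0]})
target = pd.DataFrame({"customer_id": [1, 2, 3], "balance": [100.0, 255.0, 80.0]})

# Join source and target on the business key, then flag value mismatches.
merged = source.merge(target, on="customer_id", suffixes=("_src", "_tgt"))
mismatches = merged[merged["balance_src"] != merged["balance_tgt"]]

if mismatches.empty:
    print("All records reconciled.")
else:
    print("Mismatched records:")
    print(mismatches)
```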

Posted 4 days ago

Apply

5.0 years

0 Lacs

Kolkata, West Bengal, India

On-site

Project Role : Application Developer Project Role Description : Design, build and configure applications to meet business process and application requirements. Must have skills : Snowflake Data Warehouse Good to have skills : NA Minimum 5 Year(s) Of Experience Is Required Educational Qualification : 15 years full time education Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing solutions, and ensuring that applications are aligned with business objectives. You will engage in problem-solving activities, manage project timelines, and contribute to the overall success of application development initiatives. Roles & Responsibilities: - Expected to be an SME. - Collaborate with and manage the team to perform. - Responsible for team decisions. - Engage with multiple teams and contribute to key decisions. - Provide solutions to problems for their immediate team and across multiple teams. - Facilitate knowledge-sharing sessions to enhance team capabilities. - Monitor project progress and ensure alignment with business goals. Professional & Technical Skills: - Must-Have Skills: Proficiency in Snowflake Data Warehouse. - Strong understanding of data modeling and ETL processes. - Experience with SQL and database management. - Familiarity with cloud computing concepts and services. - Ability to troubleshoot and optimize application performance. Additional Information: - The candidate should have a minimum of 5 years of experience in Snowflake Data Warehouse. - This position is based in Hyderabad. - 15 years of full-time education is required.

Posted 4 days ago

Apply

5.0 years

0 Lacs

Bhubaneswar, Odisha, India

On-site

Project Role : Application Lead Project Role Description : Lead the effort to design, build and configure applications, acting as the primary point of contact. Must have skills : Databricks Unified Data Analytics Platform Good to have skills : NA Minimum 5 Year(s) Of Experience Is Required Educational Qualification : 15 years full time education Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure that application development aligns with business objectives, overseeing project timelines, and facilitating communication among stakeholders. You will also engage in problem-solving activities, providing guidance and support to your team while ensuring that best practices are followed throughout the development process. Your role will be pivotal in driving the success of application initiatives and fostering a collaborative environment. Roles & Responsibilities: - Expected to be an SME. - Collaborate with and manage the team to perform. - Responsible for team decisions. - Engage with multiple teams and contribute to key decisions. - Provide solutions to problems for their immediate team and across multiple teams. - Facilitate training and development opportunities for team members to enhance their skills. - Monitor project progress and implement necessary adjustments to meet deadlines. Professional & Technical Skills: - Must-Have Skills: Proficiency in Databricks Unified Data Analytics Platform. - Strong understanding of data integration and ETL processes. - Experience with cloud computing platforms and services. - Familiarity with data governance and compliance standards. - Ability to analyze and interpret complex data sets. Additional Information: - The candidate should have a minimum of 5 years of experience in Databricks Unified Data Analytics Platform. - This position is based at our Bhubaneswar office. - 15 years of full-time education is required.

Posted 4 days ago

Apply

4.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Overview We are seeking a Platform Architect with expertise in Informatica PowerCenter and Informatica Intelligent Cloud Services (IICS) to design, implement, and optimize enterprise-level data integration platforms. The ideal candidate will have a strong background in ETL/ELT architecture, cloud data integration, and platform modernization, ensuring scalability, security, and performance across on-prem and cloud environments.
Responsibilities Platform Engineering & Administration Oversee installation, configuration, and optimization of PowerCenter and IICS environments. Manage platform scalability, performance tuning, and troubleshooting. Implement data governance, security, and compliance (e.g., role-based access, encryption, data lineage tracking). Optimize connectivity and integrations with various sources (databases, APIs, cloud storage, SaaS apps). Cloud & Modernization Initiatives Architect and implement IICS-based data pipelines for real-time and batch processing. Migrate existing PowerCenter workflows to IICS, leveraging serverless and cloud-native features. Ensure seamless integration with cloud platforms (AWS, Azure, GCP) and modern data lakes/warehouses (Snowflake, Redshift, BigQuery).
Qualifications 4 years of experience in data integration and ETL/ELT architecture. Expert-level knowledge of Informatica PowerCenter and IICS (Cloud Data Integration, API & Application Integration, Data Quality). Hands-on experience with cloud platforms (AWS, Azure, GCP) and modern data platforms (Snowflake, Databricks, Redshift, BigQuery). Strong SQL, database tuning, and performance optimization skills. Deep understanding of data governance, security, and compliance best practices. Experience in automation, DevOps (CI/CD), and Infrastructure-as-Code (IaC) tools for data platforms. Excellent communication, leadership, and stakeholder management skills.
Preferred Qualifications Informatica certifications (IICS, PowerCenter, Data Governance). Proficiency with PowerCenter to IDMC conversions. Understanding of real-time streaming (Kafka, Spark Streaming). Knowledge of API-based integration and event-driven architectures. Familiarity with Machine Learning and AI-driven data processing.

Posted 4 days ago

Apply

5.0 years

0 Lacs

Thiruvananthapuram Taluk, India

Remote

🚨 Now Hiring: Testing Specialist – SAP HCM Payroll (RISE/S4HANA) 📍 Location: Trivandrum (Hybrid / Remote flexibility available) 🏢 Company: Claidroid Technologies 🧭 Experience: 5+ Years in SAP HCM Payroll Testing
🔍 About the Role: Join us at Claidroid Technologies as we embark on a large-scale SAP HCM Payroll migration to S/4HANA on RISE with SAP. We’re seeking an experienced SAP Testing Specialist with deep functional knowledge and proven expertise in testing, UAT coordination, and post-go-live assurance. We’re a rapidly growing technology company with a global footprint spanning India, Finland, and the US. At Claidroid, we specialize in delivering real-world solutions at the cutting edge of: Artificial Intelligence, Generative AI & AIOps; Cloud & Edge Computing; ServiceNow, DevOps, and ETL Modernization; Identity & Access Management, Cybersecurity; Geospatial Intelligence & Digital Twins. We empower enterprises across industries — from smart cities to finance to healthcare — to unlock data-driven transformation and measurable impact.
🔑 What You'll Do: ✔ Lead test planning and execution for SAP HCM modules (PA, PY, PT, OM) ✔ Perform interface testing (IDOCs, RFCs, Web Services) ✔ Execute functional, regression & UAT testing
🧰 What You’ll Bring: ✅ 5+ years of experience in SAP HCM Payroll testing ✅ Exposure to S/4HANA or RISE with SAP environments ✅ Proficiency with SAP Solution Manager, HP ALM, JIRA, Zephyr ✅ Strong manual testing skills (automation using Tricentis is a bonus) ✅ Solid understanding of interface testing and HANA-optimized systems ✅ Strong coordination skills & stakeholder communication ✅ Bonus if you know Tricentis, SuccessFactors, or basic ABAP debugging
🎯 Why Join Claidroid? This role offers the chance to play a pivotal role in a mission-critical SAP migration project, work in a collaborative global team, and apply modern SAP testing methodologies to ensure a high-performance, compliant, and stable payroll system. We’re not looking for someone to fill a seat. We’re looking for someone to leave their fingerprints on our processes — and make them better for everyone. This is your chance to show off your craft in a team that values thoughtful, effective solutions over corporate buzzwords.
📨 Interested? Apply immediately by emailing our HR SPOC, Pratik - pratik.chavan@claidroid.com. Be part of Claidroid’s journey to engineer the intelligent enterprise.

Posted 4 days ago

Apply

5.0 years

0 Lacs

Indore, Madhya Pradesh, India

On-site

Project Role : Data Engineer Project Role Description : Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems. Must have skills : Apache Spark Good to have skills : NA Minimum 5 Year(s) Of Experience Is Required Educational Qualification : 15 years full time education Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand their data needs and provide effective solutions, ensuring that the data infrastructure is robust and scalable to meet the demands of the organization. Roles & Responsibilities: - Expected to be an SME. - Collaborate with and manage the team to perform. - Responsible for team decisions. - Engage with multiple teams and contribute to key decisions. - Provide solutions to problems for their immediate team and across multiple teams. - Mentor junior team members to enhance their skills and knowledge in data engineering. Professional & Technical Skills: - Must-Have Skills: Proficiency in Apache Spark. - Strong understanding of data pipeline architecture and design. - Experience with ETL processes and data integration techniques. - Familiarity with data warehousing concepts and technologies. - Knowledge of data quality frameworks and best practices. Additional Information: - The candidate should have a minimum of 7.5 years of experience in Apache Spark. - This position is based in Chennai. - 15 years of full-time education is required.
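Purely as an illustration (not from the posting): a minimal PySpark sketch of the extract-transform-load pattern this role describes; the file paths and column names are hypothetical.

```python
# Minimal PySpark ETL sketch (paths and columns are hypothetical).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("etl-sketch").getOrCreate()

# Extract: read raw CSV data.
orders = spark.read.csv("/data/raw/orders.csv", header=True, inferSchema=True)

# Transform: drop rows with missing amounts and derive a partition column.
cleaned = (
    orders
    .where(F.col("amount").isNotNull())
    .withColumn("order_date", F.to_date("order_ts"))
)

# Load: write partitioned Parquet for downstream consumers.
cleaned.write.mode("overwrite").partitionBy("order_date").parquet("/data/curated/orders")

spark.stop()
```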

Posted 4 days ago

Apply

8.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

About General Mills We make food the world loves: 100 brands. In 100 countries. Across six continents. With iconic brands like Cheerios, Pillsbury, Betty Crocker, Nature Valley, and Häagen-Dazs, we’ve been serving up food the world loves for 155 years (and counting). Each of our brands has a unique story to tell. How we make our food is as important as the food we make. Our values are baked into our legacy and continue to accelerate us into the future as an innovative force for good. General Mills was founded in 1866 when Cadwallader Washburn boldly bought the largest flour mill west of the Mississippi. That pioneering spirit lives on today through our leadership team, who upholds a vision of relentless innovation while being a force for good. For more details, check out http://www.generalmills.com General Mills India Center (GIC) is our global capability center in Mumbai that works as an extension of our global organization, delivering business value, service excellence, and growth, while standing for good for our planet and people. With our team of 1800+ professionals, we deliver superior value across the areas of Supply chain (SC), Digital & Technology (D&T) Innovation, Technology & Quality (ITQ), Consumer and Market Intelligence (CMI), Sales Strategy & Intelligence (SSI), Global Shared Services (GSS), Finance Shared Services (FSS) and Human Resources Shared Services (HRSS). For more details, check out https://www.generalmills.co.in. We advocate for advancing equity and inclusion to create more equitable workplaces and a better tomorrow. Job Overview Function Overview The Digital and Technology team at General Mills stands as the largest and foremost unit, dedicated to exploring the latest trends and innovations in technology while leading the adoption of cutting-edge technologies across the organization. Collaborating closely with global business teams, the focus is on understanding business models and identifying opportunities to leverage technology for increased efficiency and disruption. The team places a strong emphasis on service partnerships and employee engagement with a commitment to advancing equity and supporting communities. In fostering an inclusive culture, the team values individuals passionate about learning and growing with technology, exemplified by the "Work with Heart" philosophy, emphasizing results over facetime. Those intrigued by the prospect of contributing to the digital transformation journey of a Fortune 500 company are encouraged to explore more details about the function through the following Link Purpose of the role The Digital and Technology team of General Mills India Centre is looking for a passionate and enthusiastic individual to work in General Mills’ D&T organization! We are accelerating the digital transformation of our HR organization to provide a competitive advantage to our business. To this end, we are looking for a D&T Analyst with a passion for data and analytics to join our HR Data Foundations team. This role is expected to demonstrate high proficiency in system knowledge/configuration, problem solving, process/data analysis, and communication skills. This role will work collaboratively across teams to provide recommendations on aligning technical solutions to business opportunities. 
Key Accountabilities Partner with business SMEs and D&T peers to learn the HR data needed to drive Data & Analytics for HR Proven ability to lead data requirements and support internal business clients consuming HR data Appropriate enforcement and governance of the HR security model and classifications Translate requirements into technical documents and specifications Maintain and enhance our HR GCP project Create and edit custom API reports in Workday to support HR data sourcing to GCP Partner with data engineers, analytic engineers, and architects to sustain and build new data pipelines Create and maintain Workday visual content using Discovery Boards and Custom Reports with Custom Dashboards Understand, document, and communicate timelines and priorities to business partners Ensure our code follows the latest coding practices and industry standards Understand and follow Agile methodologies Understand the end-to-end HR business processes, data, and analytics technology Effective verbal and written communication and influencing skills Proactive learning mindset with a passion to increase your skills in analytics capabilities Complete significant data analysis, manipulation, and validation as we create/migrate data sources Responsible for quality assurance, creation of test scripts, and testing execution for new capabilities and use cases Develop documentation and training to support system or process changes Minimum Qualifications 8+ years of overall experience with 6+ years of relevant experience in a data or business analyst position Comfort working from 1:00 pm to 10:00 pm Bachelor’s/Master’s degree in HR or equivalent relevant discipline preferred Experience creating calculated fields. Experience with Workday Reporting, Report Writer, Dashboards, and Discovery Boards Effective verbal and written communication and influencing skills at the tactical level Strong problem-solving abilities and attention to detail Can-do, positive attitude and commitment to a team delivery approach Strong relationship management skills Excellent stakeholder management skills Preferred Qualifications Workday Prism Analytics expertise Experience writing SQL Broad FMCG Business and Technology expertise Broad understanding of Enterprise Data Warehousing & Analytics Good knowledge of SAP R/3 or SAP S/4 HANA, SAP BW, SAP ETL / foundational data model / reporting experience Agile / SCRUM delivery experience Excellent academics Results-oriented, high energy, self-motivated High-level understanding of GCP Cloud architecture Expert level of experience with Calculated Fields, Workday Advanced Reports, Discovery Boards, Dashboards Intermediate level of experience with HR Analytics, Workday Reporting/PRISM, Data Architecture, Data Governance, Tableau, Power BI, Looker Tool Experience Basic level of experience with ETL tools - Talend/SAP Data Services, SSIS, SQL, GCP, BigQuery, FMCG Domain, SAP R/3 or SAP S4, Agile, Scrum, Data Warehousing, AI/ML Company Overview We exist to make food the world loves. But we do more than that. Our company is a place that prioritizes being a force for good, a place to expand learning, explore new perspectives and reimagine new possibilities, every day. We look for people who want to bring their best — bold thinkers with big hearts who challenge one another and grow together. Because becoming the undisputed leader in food means surrounding ourselves with people who are hungry for what’s next.

Posted 4 days ago

Apply

3.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Project Role : Application Developer Project Role Description : Design, build and configure applications to meet business process and application requirements. Must have skills : Snowflake Data Warehouse Good to have skills : NA Minimum 3 Year(s) Of Experience Is Required Educational Qualification : 15 years full time education Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. You will play a crucial role in developing innovative solutions to enhance business operations and user experience. Roles & Responsibilities: - Expected to perform independently and become an SME. - Required active participation/contribution in team discussions. - Contribute to providing solutions to work-related problems. - Develop and implement software solutions to meet business requirements. - Collaborate with cross-functional teams to design and deliver high-quality applications. - Troubleshoot and debug applications to ensure optimal performance. - Stay updated with industry trends and technologies to enhance application development processes. - Provide technical guidance and support to junior team members. Professional & Technical Skills: - Must-Have Skills: Proficiency in Snowflake Data Warehouse. - Strong understanding of database concepts and SQL. - Experience in ETL processes and data modeling. - Knowledge of cloud platforms like AWS or Azure. - Hands-on experience in developing scalable and efficient applications. Additional Information: - The candidate should have a minimum of 3 years of experience in Snowflake Data Warehouse. - This position is based at our Gurugram office. - 15 years of full-time education is required.

Posted 4 days ago

Apply

7.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job Role: Data Engineer Experience: 7+ Years Mode: Hybrid
Key Responsibilities:
• Design and implement enterprise-grade Data Lake solutions using AWS (e.g., S3, Glue, Lake Formation).
• Define data architecture patterns, best practices, and frameworks for handling large-scale data ingestion, storage, computing and processing.
• Optimize cloud infrastructure for performance, scalability, and cost-effectiveness.
• Develop and maintain ETL pipelines using tools such as AWS Glue or similar platforms; manage CI/CD pipelines in DevOps.
• Create and manage robust Data Warehousing solutions using technologies such as Redshift.
• Ensure high data quality and integrity across all pipelines.
• Design and deploy dashboards and visualizations using tools like Tableau, Power BI, or Qlik.
• Collaborate with business stakeholders to define key metrics and deliver actionable insights.
• Implement best practices for data encryption, secure data transfer, and role-based access control.
• Lead audits and compliance certifications to maintain organizational standards.
• Work closely with cross-functional teams, including Data Scientists, Analysts, and DevOps engineers.
• Mentor junior team members and provide technical guidance for complex projects.
• Partner with stakeholders to define and align data strategies that meet business objectives.
Qualifications & Skills:
• Strong experience in building Data Lakes using the AWS cloud platform tech stack.
• Proficiency with AWS technologies such as S3, EC2, Glue/Lake Formation (or EMR), QuickSight, Redshift, Athena, Airflow (or Lambda + Step Functions + EventBridge), Data and IAM.
• Expertise in AWS tools for Data Lake storage, compute, security and data governance.
• Advanced skills in ETL processes, SQL (e.g., Cloud SQL, Aurora, Postgres), NoSQL databases (e.g., DynamoDB, MongoDB, Cassandra) and programming languages (e.g., Python, Spark, or Scala). Real-time streaming applications, preferably in Spark, Kafka, or other streaming platforms.
• AWS data security: good understanding of security concepts such as Lake Formation, IAM, service roles, encryption, KMS, and Secrets Manager.
• Hands-on experience with Data Warehousing solutions and modern architectures like Lakehouses or Delta Lake. Proficiency in visualization tools such as Tableau, Power BI, or Qlik.
• Strong problem-solving skills and the ability to debug and optimize applications for performance.
• Strong understanding of databases/SQL for database operations and data management.
• Familiarity with CI/CD pipelines and version control systems like Git.
• Strong understanding of Agile methodologies and working within scrum teams.
Preferred Qualifications:
• Bachelor of Engineering degree in Computer Science, Information Technology, or a related field.
• AWS Certified Solutions Architect – Associate (required).
• Experience with Agile/Scrum methodologies and design sprints.
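For illustration (not from the posting): a minimal boto3 sketch of one pipeline step this role describes, landing a file in S3 and starting a Glue job; the bucket name, job name, and argument are hypothetical.

```python
# Minimal AWS pipeline-step sketch with boto3 (bucket/job names are hypothetical).
import boto3

s3 = boto3.client("s3")
glue = boto3.client("glue")

# Land a raw extract in the data lake.
s3.upload_file("orders.csv", "example-data-lake-raw", "ingest/orders/orders.csv")

# Kick off a pre-existing Glue ETL job to curate the data.
run = glue.start_job_run(
    JobName="curate-orders",  # hypothetical job already defined in Glue
    Arguments={"--source_prefix": "ingest/orders/"},
)
print("Started Glue job run:", run["JobRunId"])
```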

Posted 4 days ago

Apply

8.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Required Skillset: Minimum 8+ years of hands-on experience in designing, developing, and delivering advanced analytics solutions using Power BI. This role requires deep technical expertise in data modeling, ETL architecture, and enterprise-grade reporting frameworks.
Data Warehousing: Proven expertise in data warehousing concepts and analytics architecture (Oracle ADW/Snowflake/Microsoft Azure).
Data Modeling Skills: Strong proficiency in Power BI semantic data modeling, DAX, and visual storytelling. Experience working with large-scale datasets, cloud-based multi-source environments, and hybrid architectures (Power BI).
Programming Languages: Advanced knowledge of Python for data manipulation and workflow automation. Proficiency in PL/SQL is preferred.
ETL Processes: Solid understanding of ETL processes, with the ability to produce high-quality specification documents (Oracle ODI/Informatica (IICS)/SSIS).
BI Tools: Experience in Power BI report/dashboard building. Ability to orchestrate and mentor Power BI developers effectively (Power BI).
Responsibilities: - Translate complex business requirements into scalable technical solutions using Power BI and related technologies - Architect and implement semantic data models using star, snowflake, and composite designs - Lead the development of interactive dashboards and executive-level visualizations - Design and document ETL specifications; collaborate with ETL developers to ensure alignment with reporting needs - Oversee Power BI development teams and provide guidance on best practices - Integrate and model data from multiple API sources, leveraging Python for transformation and automation - Optimize performance across Power BI reports and dataflows using advanced DAX expressions - Ensure alignment between analytics architecture and enterprise data governance standards

Posted 4 days ago

Apply

5.0 - 7.0 years

0 Lacs

Pune, Maharashtra, India

On-site

What You’ll Do Work in Data and Analytics to independently design projects to meet business needs. Leverage advanced techniques to create code that is optimized, efficient, repeatable and scalable. Use analytical insights to develop solutions and improved processes, and work with counterparts in the business to implement proposed solutions. Understand ingestion prioritizations; analyze, understand and prepare complex and new data sources and incorporate them into project design. Act as a domain and/or technical lead to more junior members. Package, summarize, visualize and perform storytelling on analytical findings and results for management and business users. Solve complex internal or customer analytical problems across various business domains and areas of expertise. Ability to speak to the impact and strategic importance of findings on the business (either Equifax or an external customer) and recommend an appropriate course of action; ability to advocate for implementation of solutions and improved processes, and to share the qualitative and quantitative elements that drive value and the 'so what' of the analysis being performed. Ensure proper use of Equifax data assets by working closely with data governance and compliance professionals.
What Experience You Need BS degree in a STEM major or equivalent discipline. 5-7 years of experience in a related analyst role. Cloud certification strongly preferred. Technical capabilities including SQL, BigQuery, R, Python, MS Excel / Google Sheets, Tableau, Looker. Experience working as a team and collaborating with others on producing descriptive and diagnostic analysis. Experience working with non-technical partners to explore new capabilities and data monetisation opportunities.
What Could Set You Apart Demonstrates technical competence and business acumen with advanced data analysis techniques, including statistical analysis, data mining, data visualization, BI tools, and time series analysis. Proficiency in data management and big data tools: ETL for data integration and cleansing; experience with cloud databases and storage; familiarity with big data frameworks and experience handling large-scale datasets and big data analysis. Knowledge and in-depth understanding of relational database management systems. Experience in data visualization: designing clear, effective charts and dashboards, and data storytelling that helps stakeholders understand complex data. Experience in data governance and quality management. Great stakeholder communication and excellent interpersonal skills, with an ability to communicate complex analytic findings to a non-technical audience. Strong numeracy skills. High level of attention to detail. Ability to work independently on complex and demanding projects. A willingness to learn new applications and systems. Project management skills, with the ability to organise tasks and manage time to consistently meet deadlines. Big-picture thinking, dynamic problem-solving, forecasting and predictive skills. Experience with Agile methodology and request/issue queue management. Experience in machine learning or predictive analytics to model future outcomes or automate processes. Experience with Comprehensive Credit Reporting. Understanding of Equifax or similar products.
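As a hedged illustration of the SQL/BigQuery capability named above (not part of the posting): a minimal sketch using the google-cloud-bigquery client; the project, dataset, and table names are hypothetical.

```python
# Minimal BigQuery analysis sketch (project/dataset/table names are hypothetical).
from google.cloud import bigquery

client = bigquery.Client()  # uses application-default credentials

query = """
    SELECT region, COUNT(*) AS accounts, AVG(credit_limit) AS avg_limit
    FROM `example-project.analytics.accounts`
    GROUP BY region
    ORDER BY accounts DESC
"""

# Run the query and print a simple descriptive summary per region.
for row in client.query(query).result():
    print(row["region"], row["accounts"], round(row["avg_limit"], 2))
```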

Posted 4 days ago

Apply

10.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job Summary Experience: Minimum 10 years in SAP BODS and Data Migration, with implementation project experience mandatory. Location: Bangalore. Notice Period: Immediate or short-notice joiners preferred. Work Model: Hybrid; must operate from the office 3 days a week (or 12 days a month). Shifts: UK/India-based shift timings.
• BODS with 10 years of experience; knowledge of BW is a must; good knowledge of calculation views
• Experience in BODS design and architecture
• Should have worked on preparation of Technical Specification documents
• Strong hands-on experience in Business Objects Data Services (BODS) as a technical developer
• Thorough knowledge of developing real-time interfaces like IDOC and BAPI between SAP applications and BODS
• Sound knowledge of SQL
• Experience in Data Migration projects
• Good analytical skills to analyze ETL issues and fix them independently
• Must have experience installing and configuring BODS
• Experience with the various transformations of BODS
• Should have experience in a Data Migration project with an end-to-end implementation
• Should have a good understanding of the BODS landscape and architecture
• Should be able to provide estimates of BODS work effort
• Should have knowledge of working with SAP systems as source and target
• Should be able to connect with customers, gather requirements, and work independently on those requirements
• Should have basic knowledge of Data Warehousing concepts and schema formation
• Should have sound knowledge of BODS scheduling and the Management Console

Posted 4 days ago

Apply

6.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

#BIDeveloper Experience - 6+ Years Skills - Tableau, MicroStrategy, Dashboards, Prep, KPI reporting and metrics, Filters, SQL Location - Hyderabad Requirements: Experience: 4+ years of hands-on experience in building and maintaining dashboards/reports using MicroStrategy and Tableau. SQL: Strong proficiency in SQL for data querying, optimization, and database management. Data Visualization: Strong experience in designing visually impactful, user-friendly reports and dashboards. Data Warehousing: Understanding of data warehousing concepts, ETL processes, and data modeling. Problem-Solving: Strong analytical skills with the ability to troubleshoot and resolve technical issues. Communication: Excellent verbal and written communication skills, with the ability to explain technical concepts to non-technical stakeholders.

Posted 4 days ago

Apply

0 years

0 Lacs

Gurgaon, Haryana, India

Remote

A Snapshot of Your Day As a Backend Developer / Data Engineer (f/m/d), you will build and guide a team of Data Engineers responsible for translating business requirements into analytical products, helping to bring new insights alive. As a leader, you will be at the forefront of our transformation journey to become a data-driven company. Passionate about the environment and climate change? Ready to be part of the future of the energy transition? The Siemens Energy Data Analytics & AI team plays a meaningful role in driving the energy transformation. Our team is looking for innovative, hardworking, and versatile data, digital, and AI professionals who will drive us forward on this exciting venture.
How You’ll Make An Impact Drive technical implementation and accountability for strategy and execution. Determine the scope of the Minimum Viable Product (MVP) for upcoming initiatives and outline the future roadmap for enhancements and potential improvements. Collaborate with data owners on planning and execution of key initiatives. Oversee the performance and quality of integration pipelines created in close collaboration with the data integration team. Drive architecture design and discussions with guidance from the teams. Promote data literacy and visualization through self-service capabilities for end users.
What You Bring Extensive experience leading data architectures and data-related initiatives, as well as leading a data operations team. In-depth understanding and knowledge of data acquisition, data modeling and analytics. Extensive experience in data & analytics and data architecture, and proven ability as a team leader. Extensive and in-depth knowledge of database and data warehouse modeling. Experience developing streaming and batch data pipelines for cloud and hybrid infrastructures. Experience with streaming frameworks (e.g., the AWS IoT stack) and hands-on experience with modern software development tools. Extensive experience in data analytics and the ability to slice and dice data as needed. Experience with cloud providers in data architecture (e.g., AWS). Knowledge of data warehouses that support analytical workloads. Extensive and in-depth experience working with ETL.
About The Team Within the enterprise team for Advanced Analytics & AI, we develop new methodologies and tools to enable data-driven decision-making along the value chain, improving the sustainability, reliability, and affordability of our energy solutions across the globe. We provide guidance and direction to the business and drive the strategy for the development of AI technologies within a partner ecosystem across industry and academia. In our Business Functions we enable our organization to reach their targets by providing best-in-class services and solutions in the areas of IT, HR, Finance, Real Estate, Strategy & Technology and more.
Who is Siemens Energy? At Siemens Energy, we are more than just an energy technology company. We meet the growing energy demand across 90+ countries while ensuring our climate is protected. With ~100,000 dedicated employees, we not only generate electricity for over 16% of the global community, but we’re also using our technology to help protect people and the environment. Our global team is committed to making sustainable, reliable, and affordable energy a reality by pushing the boundaries of what is possible.
We uphold a 150-year legacy of innovation that encourages our search for people who will support our focus on decarbonization, new technologies, and energy transformation. Find out how you can make a difference at Siemens Energy: https://www.siemens-energy.com/employeevideo
Our Commitment to Diversity Lucky for us, we are not all the same. Through diversity, we generate power. We run on inclusion, and our combined creative energy is fueled by over 130 nationalities. Siemens Energy celebrates character – no matter what ethnic background, gender, age, religion, identity, or disability. We energize society, all of society, and we do not discriminate based on our differences.
Rewards/Benefits Employees are eligible for remote working arrangements up to 2 days per week. All employees are automatically covered under the Medical Insurance, including a company-paid family floater cover for the employee, spouse and 2 dependent children up to 25 years of age. Siemens Energy gives all employees the option of a Meal Card, per the terms and conditions prescribed in the company policy, as a tax-saving measure that forms part of CTC. Flexi Pay empowers employees with the choice to customize the amount in some of the salary components within a defined range, thereby optimizing tax benefits; each employee can decide on the best possible net income from the same fixed individual base pay on a monthly basis. Reference: https://jobs.siemens-energy.com/jobs

Posted 4 days ago

Apply

0 years

0 Lacs

Gurugram, Haryana, India

On-site

📊 We’re Hiring | Sr. Power BI Developer – WFO | Gurugram 📍 Location: Gurugram (Work from Office) 💰 CTC: Up to ₹18 LPA 💼 Experience: Relevant expertise in BI Reporting, DAX, SQL, ETL 📞 Contact: Nitin Chopra | 📧 nitin@skyleaf.global | 📱 8743841946 🏢 Client: Confidential (Global Tech + Analytics Firm) About the Role: We’re looking for a Senior Power BI Developer to transform complex data into meaningful insights through high-impact dashboards, data models, and BI reports. If you’re passionate about data storytelling and decision support—this role is for you. Key Responsibilities: 🔹 Translate business requirements into powerful Power BI dashboards 🔹 Design & develop data models and interactive visualizations 🔹 Define KPIs and monitor them through insightful reporting 🔹 Execute advanced DAX queries and functions for metrics modeling 🔹 Optimize SQL queries and ETL processes for performance 🔹 Build and document data warehouse logic, relationships & parameters 🔹 Improve and maintain existing BI systems with enhancements What You Bring: ✅ Strong hands-on experience with Power BI, DAX, SQL, and Data Warehousing ✅ Ability to translate raw data into strategic visual insights ✅ Proficiency in writing optimized scripts for performance reporting ✅ Understanding of current ETL systems and ability to redesign them ✅ Strong analytical mindset with excellent business acumen 📩 Interested in building data-driven impact? Let’s connect! Nitin Chopra 📧 nitin@skyleaf.global | 📱 8743841946
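For illustration only, beyond the posting: one way to run a DAX query programmatically is the Power BI REST "Execute Queries" endpoint. In this sketch the dataset ID, access token, and the table/column names inside the DAX are all hypothetical placeholders.

```python
# Illustrative sketch: executing a DAX query against a Power BI dataset via REST.
# Dataset ID, bearer token, and model names below are placeholders.
import requests

DATASET_ID = "00000000-0000-0000-0000-000000000000"   # placeholder
TOKEN = "<azure-ad-access-token>"                     # placeholder

dax = """
EVALUATE
SUMMARIZECOLUMNS('Date'[Year], "Total Sales", SUM(Sales[Amount]))
"""

resp = requests.post(
    f"https://api.powerbi.com/v1.0/myorg/datasets/{DATASET_ID}/executeQueries",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={"queries": [{"query": dax}]},
    timeout=30,
)
resp.raise_for_status()

# The documented response shape nests rows under results -> tables.
print(resp.json()["results"][0]["tables"][0]["rows"])
```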

Posted 4 days ago

Apply

2.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

KEY ACCOUNTABILITIES: Knowledge of Microsoft Dynamics Navision ERP processes and skills for process customization. Using Power BI, create dashboards and interactive visual reports. Define key performance indicators (KPIs) with specific objectives and track them regularly. Analyse data and display it in reports to aid decision-making. Create, test, and deploy Power BI scripts, as well as execute efficient deep analysis. Use Power BI to run DAX queries and functions. Create charts and data documentation with explanations of algorithms, parameters, models, and relationships. Construct a data warehouse. Use SQL queries to get the best results. Make technological adjustments to current BI systems to improve their performance. For a better understanding of the data, use filters and visualizations.
Minimum Qualifications: Bachelor's degree in Computer Science, Information Technology, or a related field. 2+ years of experience in Microsoft Dynamics Navision ERP & BI development and data analysis. Strong proficiency in Power BI, including creating dashboards, reports, and data models. Experience with SQL Server, the MS SQL Server BI stack, and Power BI. Proven ability to analyze data and extract meaningful insights. Excellent communication and interpersonal skills.
Preferred Qualifications: Experience with other ERP and business process tools and Power BI tools. Experience with data warehousing and ETL (Extract, Transform, Load) processes. Experience with cloud-based BI solutions, such as Azure Power BI or AWS QuickSight. Certification in Microsoft Dynamics Navision ERP. In addition to the technical qualifications listed above, the ideal candidate will also have a strong understanding of business processes and a passion for data-driven decision-making. They will be able to work independently and as part of a team, and they will be able to communicate complex technical concepts to both technical and non-technical audiences.
We are the ASSA ABLOY Group Our people have made us the global leader in access solutions. In return, we open doors for them wherever they go. With nearly 63,000 colleagues in more than 70 different countries, we help billions of people experience a more open world. Our innovations make all sorts of spaces – physical and virtual – safer, more secure, and easier to access. As an employer, we value results – not titles, or backgrounds. We empower our people to build their career around their aspirations and our ambitions – supporting them with regular feedback, training, and development opportunities. Our colleagues think broadly about where they can make the most impact, and we encourage them to grow their role locally, regionally, or even internationally. As we welcome new people on board, it’s important to us to have diverse, inclusive teams, and we value different perspectives and experiences.

Posted 4 days ago

Apply

5.0 - 15.0 years

0 Lacs

India

On-site

**********************************4 months contract opportunity**********************************

Job Summary: We are seeking a skilled and detail-oriented Snowflake Developer to design, develop, and maintain scalable data solutions using the Snowflake platform. The ideal candidate will have experience in data warehousing, ETL/ELT processes, and cloud-based data architecture.

Key Responsibilities:
• Design and implement data pipelines using Snowflake, SQL, and ETL tools.
• Develop and optimize complex SQL queries for data extraction and transformation.
• Create and manage Snowflake objects such as databases, schemas, tables, views, and stored procedures (see the connector sketch after this posting).
• Integrate Snowflake with various data sources and third-party tools.
• Monitor and troubleshoot performance issues in Snowflake environments.
• Collaborate with data engineers, analysts, and business stakeholders to understand data requirements.
• Ensure data quality, security, and governance standards are met.
• Automate data workflows and implement best practices for data management.

Required Skills and Qualifications:
• Proficiency in Snowflake SQL and Snowflake architecture.
• Experience with ETL/ELT tools (e.g., Informatica, Talend, dbt, Matillion).
• Strong knowledge of cloud platforms (AWS, Azure, or GCP).
• Familiarity with data modeling and data warehousing concepts.
• Experience with Python, Java, or Shell scripting is a plus.
• Understanding of data security, role-based access control, and data sharing in Snowflake.
• Excellent problem-solving and communication skills.

Preferred Qualifications:
• Snowflake certification (e.g., SnowPro Core).
• Experience with CI/CD pipelines and DevOps practices.
• Knowledge of BI tools like Tableau, Power BI, or Looker.
• 5-15 years of experience preferred.
• Experience with Agile-based development.
• Problem-solving skills: proficiency in writing performant SQL queries/scripts to generate business insights and drive better organizational decision-making.
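For context on the Snowflake object work described above, here is a minimal sketch using the official snowflake-connector-python package. The account identifier, credentials, and table/view names are placeholders.

```python
# Illustrative only: creating a Snowflake table and populating it with a
# transformation query via the official Python connector.
import snowflake.connector

conn = snowflake.connector.connect(
    account="myorg-myaccount",   # hypothetical account identifier
    user="etl_user",
    password="***",
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="STAGING",
)
cur = conn.cursor()
try:
    # Create a target table, then load it from a (hypothetical) raw table.
    cur.execute("""
        CREATE TABLE IF NOT EXISTS daily_orders (
            order_date DATE,
            order_count NUMBER
        )
    """)
    cur.execute("""
        INSERT INTO daily_orders
        SELECT order_ts::DATE, COUNT(*)
        FROM raw_orders
        GROUP BY 1
    """)
finally:
    cur.close()
    conn.close()
```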

Posted 4 days ago

Apply

5.0 years

0 Lacs

India

Remote

Siebel Data Migration
Location: Remote
Notice Period: 30 days
Experience: 5+ yrs
Budget: Max 28 LPA
Payroll: STL - Sterlite Technologies Limited

• 5-14 years of experience; Telecom domain experience is mandatory.
• Should have work experience in Siebel Data Migration (EIM), including the Import, Delete, Export, and Merge processes.
• Experience in handling Siebel data migration projects.
• Good experience in requirement gathering, design, and development of Siebel data migration projects.
• Should be able to understand the legacy system and the concept of data mapping.
• Able to identify, communicate, and resolve/mitigate risks, issues, and assumptions.
• Need to work with key business stakeholders to ensure business requirements are met.
• Should work with customer-facing teams, technical architects, and application designers to define the data migration requirements and structure for the application.
• Understand data migration deliverables throughout development to ensure quality, traceability to requirements, and adherence to all quality management plans and standards.
• Responsible for reviewing and providing inputs and support during development of the data migration plan.
• Should have good work experience and knowledge of SQL, PL/SQL, and shell scripting, along with Siebel EIM, and an understanding of the Siebel data flow and data model (an EIM staging sketch follows this posting).
• Should have good knowledge of ETL tools and DataStage.
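As a rough sketch of the EIM import preparation this role involves, the snippet below stages legacy rows into a Siebel EIM interface table ahead of an EIM batch run. It uses the python-oracledb driver; the DSN and the EIM_CONTACT column subset shown are assumptions for illustration, not details from the posting.

```python
# Illustrative only: staging legacy contacts into an EIM interface table.
import oracledb

conn = oracledb.connect(user="siebel", password="***", dsn="db.example.local/SIEBELDB")
cur = conn.cursor()

legacy_contacts = [
    ("1-BATCH1-001", 100, "FOR_IMPORT", "Asha", "Rao"),
    ("1-BATCH1-002", 100, "FOR_IMPORT", "Vikram", "Shah"),
]

# IF_ROW_BATCH_NUM groups rows into an EIM batch; IF_ROW_STAT is updated by
# EIM after the run and is checked when reconciling imported vs. failed rows.
cur.executemany(
    """INSERT INTO EIM_CONTACT
       (ROW_ID, IF_ROW_BATCH_NUM, IF_ROW_STAT, FST_NAME, LAST_NAME)
       VALUES (:1, :2, :3, :4, :5)""",
    legacy_contacts,
)
conn.commit()
cur.close()
conn.close()
```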

Posted 4 days ago

Apply

4.0 years

0 Lacs

India

On-site

Position Overview: We are seeking a proactive and technically skilled Junior Solution Architect to join our dynamic team. The ideal candidate will design, demo, and implement tailored AI-driven and conversational workflows that align with customer support objectives. This role requires close collaboration with client IT teams, business stakeholders, and internal teams to integrate CRMs and APIs, manage project risks, guide user testing, and ensure successful go-live execution, all while delivering seamless, automated customer support experiences.

Key Responsibilities:
• Solution Design & Customization: Design, demo, and build tailored AI-driven and conversational workflows that align with customer support objectives. Leverage decision trees and natural language interfaces to streamline complex support interactions.
• Technical Configuration & Integration: Collaborate with client IT teams to configure and integrate with CRMs, APIs, and third-party platforms, ensuring reliable data exchange and conversational AI compatibility.
• Client Onboarding & Planning: Lead onboarding sessions to gather business requirements, understand customer support processes, and define project goals, with a focus on identifying opportunities for conversational automation.
• Stakeholder Collaboration: Act as the primary technical point of contact for clients throughout the project lifecycle, providing regular updates and managing expectations. Operate as the technical liaison, translating requirements between client business and IT resources as well as internally with team members.
• Risk & Issue Management: Identify and mitigate project risks; resolve technical and process-related issues to prevent project delays.
• Testing & Validation: Guide UAT (User Acceptance Testing) with clients, ensuring conversational workflows and integrations meet functional and experience goals.
• Training & Enablement: Participate in platform and user training sessions to ensure customers are equipped to build and maintain AI-enabled conversational flows, workflows, and reporting.
• Go-Live & Post-Implementation Support: Ensure successful go-live execution and provide a seamless transition to Customer Success and Support for ongoing enhancements and support.
• Project Management: Coordinate and collaborate on multiple SaaS implementation projects concurrently, balancing technical execution, AI workflow development, and client alignment within defined timelines and scopes.

Qualifications & Skills:
• Work Experience: 4+ years in solutions architect, customer engineering, or technical project implementation roles, with at least 2+ years in enterprise B2B SaaS, preferably in customer support or contact center solutions, and with start-up experience.
• Conversational AI & Technical Expertise: Proven hands-on experience with automation technologies and integration architectures, including APIs, authentication protocols, middleware, web services, and messaging patterns. Proficiency in technologies enabling conversational AI, such as large language models (LLMs), prompt engineering, AI-led agentic workflows, and real-time decisioning. Strong front-end skills (JavaScript, CSS, JSON) and familiarity with ETL/data transformation. Experience integrating with CRMs (especially Salesforce and Zendesk) is a significant plus; a sketch of this kind of integration follows this posting.
• Business Acumen: Specializes in digesting complex business requirements and designing comprehensive, integrated workflow solutions that are flexible and adaptable to the client's needs.
• Analytical Mindset: Proficient in troubleshooting technical challenges and collaborating with cross-functional teams to find solutions. Able to quickly understand operational processes and identify areas where the technology can condense and/or optimize them.
• Client-Facing Expertise: Exceptional communication and interpersonal skills; experience interfacing with customers to manage expectations, resolve issues, and ensure project success. Able to clearly articulate technical topics to a non-technical audience, including experience working with executives.

Preferred Qualifications:
• Experience with A/B testing and SEO experimentation.
• Understanding of technical SEO, including site speed and mobile optimization.
• Familiarity with social media’s role in SEO and content amplification.
• Knowledge of CRO (Conversion Rate Optimization) principles.
• Strong project management skills and ability to handle multiple SEO initiatives simultaneously.
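To illustrate the CRM/API integration work this role describes, here is a minimal sketch against Zendesk's public Tickets API, the kind of call a conversational workflow might make to escalate a chat into a ticket. The subdomain, credentials, and payload fields are placeholders.

```python
# Illustrative only: creating a Zendesk ticket from an automated workflow.
import requests

subdomain = "example"  # hypothetical Zendesk subdomain
url = f"https://{subdomain}.zendesk.com/api/v2/tickets.json"

payload = {
    "ticket": {
        "subject": "Order status inquiry escalated by chatbot",
        "comment": {"body": "Conversation transcript attached by the AI workflow."},
        "priority": "normal",
    }
}

# Zendesk API tokens authenticate as "email/token" with the token as the password.
resp = requests.post(url, json=payload, auth=("agent@example.com/token", "API_TOKEN"))
resp.raise_for_status()
print(resp.json()["ticket"]["id"])
```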

Posted 4 days ago

Apply

6.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Job Title: Senior Google Cloud Platform (GCP) Data Engineer
Location: Hybrid (Bengaluru, India)
Job Type: Full-Time
Experience Required: Minimum 6 Years
Joining: Immediate or within 1 week

About the Company: Tech T7 Innovations is a global IT solutions provider known for delivering cutting-edge technology services to enterprises across various domains. With a team of seasoned professionals, we specialize in software development, cloud computing, data engineering, machine learning, and cybersecurity. Our focus is on leveraging the latest technologies and best practices to create scalable, reliable, and secure solutions for our clients.

Job Summary: We are seeking a highly skilled Senior GCP Data Engineer with over 6 years of experience in data engineering and extensive hands-on expertise in Google Cloud Platform (GCP). The ideal candidate must have a strong foundation in GCS, BigQuery, Apache Airflow/Composer, and Python, with a demonstrated ability to design and implement robust, scalable data pipelines in a cloud environment.

Roles and Responsibilities:
• Design, develop, and deploy scalable and secure data pipelines using Google Cloud Platform components including GCS, BigQuery, and Airflow.
• Develop and manage robust ETL/ELT workflows using Python and integrate with orchestration tools such as Apache Airflow or Cloud Composer (a minimal DAG sketch follows this posting).
• Collaborate with data scientists, analysts, and business stakeholders to gather requirements and deliver reliable and efficient data solutions.
• Optimize BigQuery performance using best practices such as partitioning, clustering, schema design, and query tuning.
• Manage, monitor, and maintain data lake and data warehouse environments with high availability and integrity.
• Automate pipeline monitoring, error handling, and alerting mechanisms to ensure seamless and reliable data delivery.
• Contribute to architecture decisions involving data modeling, data flow, and integration strategies in a cloud-native environment.
• Ensure compliance with data governance, privacy, and security policies as per enterprise and regulatory standards.
• Mentor junior engineers and drive best practices in cloud engineering and data operations.

Mandatory Skills:
• Google Cloud Platform (GCP): In-depth hands-on experience with GCS, BigQuery, IAM, and Cloud Functions.
• BigQuery (BQ): Expertise in large-scale analytics, schema optimization, and data modeling.
• Google Cloud Storage (GCS): Strong understanding of data lifecycle management, access controls, and best practices.
• Apache Airflow / Cloud Composer: Proficiency in writing and managing complex DAGs for data orchestration.
• Python Programming: Advanced skills in automation, API integration, and data processing using libraries like Pandas, PySpark, etc.

Preferred Qualifications:
• Experience with CI/CD pipelines for data infrastructure and workflows.
• Exposure to other GCP services like Dataflow, Pub/Sub, and Cloud Functions.
• Familiarity with Infrastructure as Code (IaC) tools such as Terraform.
• Strong communication and analytical skills for problem-solving and stakeholder engagement.
• GCP Certifications (e.g., Professional Data Engineer) will be a significant advantage.
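Here is a minimal Airflow DAG of the shape this role manages: a daily load from GCS into a partitioned BigQuery table using the Google provider package. The bucket, dataset, table, and partition field names are placeholders, and the `schedule` argument assumes Airflow 2.4+ (older versions use `schedule_interval`).

```python
# Illustrative only: a daily GCS-to-BigQuery load orchestrated by Airflow.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
    GCSToBigQueryOperator,
)

with DAG(
    dag_id="daily_orders_load",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    load_orders = GCSToBigQueryOperator(
        task_id="gcs_to_bq",
        bucket="example-raw-bucket",
        source_objects=["orders/{{ ds }}/*.json"],
        source_format="NEWLINE_DELIMITED_JSON",
        destination_project_dataset_table="analytics.staging.orders",
        write_disposition="WRITE_APPEND",
        # Day partitioning on the event timestamp keeps BigQuery scans cheap.
        time_partitioning={"type": "DAY", "field": "order_ts"},
    )
```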

Posted 4 days ago

Apply