7 - 12 years
8 - 14 Lacs
Delhi NCR, Mumbai, Bengaluru
Work from Office
We are seeking an experienced Data Migration Consultant to lead and execute data migration projects. The ideal candidate will have a deep understanding of data migration strategies, tools, and best practices, along with the ability to work across various data environments. This role involves ensuring the smooth transition of data from legacy systems to new environments, while maintaining data integrity and minimizing disruptions to business operations.
Responsibilities:
- Lead the planning, design, and execution of data migration projects, ensuring timely and accurate data transfer from legacy systems to new platforms.
- Collaborate with stakeholders to understand data requirements, map source to target data, and define data migration strategies.
- Develop data extraction, transformation, and loading (ETL) processes using appropriate tools and technologies.
- Perform data quality assessments, cleanse data, and resolve data inconsistencies prior to migration.
- Manage and execute data migration testing, including data validation and reconciliation, to ensure the accuracy and completeness of migrated data.
- Troubleshoot and resolve issues related to data migration, minimizing disruptions to business operations.
- Create and maintain detailed documentation of data migration processes, including data mapping, transformation rules, and validation reports.
- Provide post-migration support, addressing any data-related issues that arise after go-live.
- Collaborate with cross-functional teams, including IT, business analysts, and project managers, to ensure successful data migration.
- Stay updated with the latest data migration tools, techniques, and best practices, applying them to enhance project outcomes.
Must Have Skills:
- Extensive experience in data migration projects, with a focus on ETL processes and tools.
- Strong understanding of data migration strategies, including data extraction, transformation, and loading (ETL).
- Proven experience in working with various data environments, including relational databases, cloud platforms, and legacy systems.
- Expertise in data mapping, data cleansing, and data quality assurance.
- Ability to develop and execute data migration testing plans, including data validation and reconciliation.
- Strong problem-solving skills and the ability to troubleshoot data migration issues.
- Excellent communication skills, capable of working with both technical and non-technical stakeholders.
- Experience with data migration tools such as SAP Data Services, Informatica, Talend, or equivalent.
- Ability to work independently and collaboratively within a team.
- Strong attention to detail and commitment to data accuracy.
Preferred Skills:
- Experience with data migration in SAP environments, including SAP S/4HANA and SAP ECC.
- Knowledge of data governance and data management best practices.
- Familiarity with cloud-based data migration strategies and tools.
- SAP certification in data migration or related areas.
- Experience with Agile project management methodologies.
Location: Delhi NCR, Bangalore, Chennai, Pune, Kolkata, Ahmedabad, Mumbai, Hyderabad
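For context on the validation and reconciliation work this listing describes, a minimal sketch of a row-count/checksum reconciliation check is shown below. It is illustrative only: the table, key column, and connections are placeholder assumptions rather than details from the posting, and sqlite3 merely stands in for the real source and target systems.

```python
# Minimal migration reconciliation sketch: compare row counts and a keyed
# checksum between a legacy (source) table and its migrated (target) copy.
# Connections, table, and column names are illustrative placeholders.
import sqlite3

def table_profile(conn, table, key_column):
    """Return (row_count, sum_of_keys) as a cheap reconciliation fingerprint."""
    cur = conn.execute(f"SELECT COUNT(*), COALESCE(SUM({key_column}), 0) FROM {table}")
    return cur.fetchone()

def reconcile(source_conn, target_conn, table, key_column):
    src = table_profile(source_conn, table, key_column)
    tgt = table_profile(target_conn, table, key_column)
    if src != tgt:
        raise ValueError(f"Reconciliation failed for {table}: source={src}, target={tgt}")
    return src

if __name__ == "__main__":
    # Stand-in databases; in a real migration these would be the legacy and new systems.
    legacy = sqlite3.connect(":memory:")
    new = sqlite3.connect(":memory:")
    for conn in (legacy, new):
        conn.execute("CREATE TABLE customers (id INTEGER, name TEXT)")
        conn.executemany("INSERT INTO customers VALUES (?, ?)", [(1, "A"), (2, "B")])
    print("customers reconciled:", reconcile(legacy, new, "customers", "id"))
```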
Posted 3 months ago
2 - 4 years
10 - 20 Lacs
Mumbai
Hybrid
As a Deputy Manager in the Data Engineering department, you will be responsible for managing and organizing data structures, designing and implementing scalable ETL processes, and ensuring high data quality and integrity. You will work closely with various teams to understand and fulfill their data requirements, and contribute to the overall data strategy of the organization.
Posted 3 months ago
5 - 8 years
8 - 12 Lacs
Chennai, Bengaluru
Work from Office
Ab Initio Reporting Resource
Role Description: The Ab Initio Developer should have at least 7+ years of experience.
Primary Skills: Experience with the ETL tool Ab Initio; data warehouse project experience; database design/modeling experience.
Secondary Skills: Reporting; graph development; SSIS/Talend/Ab Initio; strong MS SQL database skills.
Responsibilities:
- Data warehousing experience is preferred.
- Design and create graphs; EME basics, SDLC, and data analysis.
- Design, build, and test applications for data management and business intelligence in the Ab Initio ETL environment.
- Experience with Ab Initio components such as Reformat, Join, Sort, and Rollup, Ab Initio parallelism, and products such as Metadata Hub and Control Center.
- Excellent technical knowledge in the design, development, and validation of complex ETL features using Ab Initio.
- Plan, coordinate, develop, and support ETL processes, including architecting table structures, building the ETL process, documentation, and long-term preparedness.
- Develop cross-validation rules to ensure mapping accuracy.
- Design database tables and create views, functions, and stored procedures.
- Write optimized SQL queries for integration with other applications.
- Create database triggers for use in automation.
- Maintain data quality and oversee database security.
- Communicate issues, risks, and concerns proactively to management.
- Self-sufficient, with good communication skills.
Posted 3 months ago
2 - 6 years
5 - 9 Lacs
Bengaluru
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: IBM InfoSphere DataStage
Good to have skills: NA
Minimum 2 year(s) of experience is required.
Educational Qualification: 15 years full time education with Engineering or equivalent
Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. You will work closely with the team to ensure the successful delivery of high-quality software solutions.
Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation/contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Collaborate with cross-functional teams to gather and analyze requirements.
- Design, develop, and test applications using IBM InfoSphere DataStage.
- Troubleshoot and debug issues in existing applications.
- Ensure the scalability and performance of applications.
- Document technical specifications and user guides for reference.
Professional & Technical Skills:
- Must have: Proficiency in IBM InfoSphere DataStage.
- Strong understanding of ETL concepts and data integration techniques.
- Experience in designing and implementing data integration solutions.
- Knowledge of SQL and database concepts.
- Familiarity with data warehousing and data modeling.
- Good to have: Experience with IBM InfoSphere Information Server.
- Experience with other ETL tools such as Informatica or Talend.
Additional Information:
- The candidate should have a minimum of 2 years of experience in IBM InfoSphere DataStage.
- This position is based at our Bengaluru office.
- A 15 years full time education with Engineering or equivalent is required.
Posted 3 months ago
0 - 2 years
4 - 8 Lacs
Ahmedabad
Work from Office
Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must have skills: Data Engineering
Good to have skills: NA
Minimum 0-2 year(s) of experience is required.
Educational Qualification: 15 years full time education
Summary: As a Data Engineer, you will design, develop, and maintain data solutions for data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across systems. You will play a crucial role in managing and optimizing data infrastructure to support the organization's data needs.
Roles & Responsibilities:
- Expected to build knowledge and support the team.
- Participate in problem-solving discussions.
- Design and develop data pipelines to extract, transform, and load data from various sources.
- Ensure data quality and integrity by implementing data validation and cleansing processes.
- Collaborate with cross-functional teams to understand data requirements and design efficient data solutions.
- Optimize and tune data pipelines for performance and scalability.
- Troubleshoot and resolve data-related issues and incidents.
- Stay updated with the latest trends and technologies in data engineering and recommend improvements to existing processes.
- Additional responsibility: Mentor and guide junior professionals in data engineering best practices.
Professional & Technical Skills:
- Must have: Proficiency in data engineering.
- Strong understanding of data modeling and database design principles.
- Experience with ETL tools such as Apache NiFi or Talend.
- Familiarity with cloud platforms such as AWS or Azure.
- Good to have: Experience with big data technologies such as Hadoop or Spark.
- Knowledge of data warehousing concepts and techniques.
- Experience with SQL and NoSQL databases.
- Solid understanding of data governance and security principles.
Additional Information:
- The candidate should have a minimum of 0-2 years of experience in Data Engineering.
- This position is based at our Ahmedabad office.
- A 15 years full-time education is required.
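The validation and cleansing responsibilities above are often scripted along the following lines; a small illustrative sketch with pandas follows, where the columns and rules are assumptions rather than anything specified in the listing.

```python
# A minimal data-cleansing and validation step of the kind described above,
# sketched with pandas. Column names and rules are illustrative assumptions.
import pandas as pd

REQUIRED_COLUMNS = ["customer_id", "email", "created_at"]

def cleanse(df: pd.DataFrame) -> pd.DataFrame:
    """Drop duplicates, trim whitespace, and normalize obvious null markers."""
    df = df.drop_duplicates(subset=["customer_id"]).copy()
    df["email"] = df["email"].astype(str).str.strip().str.lower()
    df["email"] = df["email"].replace({"": None, "nan": None, "n/a": None})
    return df

def validate(df: pd.DataFrame) -> list:
    """Return a list of data-quality issues found before loading."""
    issues = []
    missing = [c for c in REQUIRED_COLUMNS if c not in df.columns]
    if missing:
        issues.append(f"missing columns: {missing}")
    if df["customer_id"].isna().any():
        issues.append("null customer_id values found")
    if df["email"].isna().any():
        issues.append("records without an email address")
    return issues

if __name__ == "__main__":
    raw = pd.DataFrame({
        "customer_id": [1, 1, 2],
        "email": [" A@X.COM ", " A@X.COM ", "n/a"],
        "created_at": ["2024-01-01", "2024-01-01", "2024-01-02"],
    })
    clean = cleanse(raw)
    print(clean)
    print("issues:", validate(clean))
```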
Posted 3 months ago
5 - 10 years
10 - 14 Lacs
Bengaluru
Work from Office
Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must have skills: Informatica PowerCenter
Good to have skills: NA
Minimum 5 year(s) of experience is required.
Educational Qualification: Regular 15 years of education
Summary: As an Application Lead for Custom Software Engineering, you will be responsible for leading the effort to design, build, and configure applications using Informatica PowerCenter. Your typical day will involve collaborating with cross-functional teams, ensuring timely delivery of projects, and acting as the primary point of contact for the project.
Roles & Responsibilities:
- Lead the design, development, and implementation of Informatica PowerCenter-based ETL solutions.
- Collaborate with cross-functional teams to ensure timely delivery of projects.
- Act as the primary point of contact for the project, providing guidance and support to team members.
- Ensure adherence to best practices and standards for software development, testing, and deployment.
- Provide technical leadership and mentorship to team members, ensuring their professional growth and development.
Professional & Technical Skills:
- Must have: Strong experience in Informatica PowerCenter.
- Good to have: Experience in other ETL tools like DataStage, Talend, or SSIS.
- Experience in designing and implementing ETL solutions for complex data integration scenarios.
- Strong understanding of data warehousing concepts and best practices.
- Experience in SQL and database technologies like Oracle, SQL Server, or Teradata.
- Experience in Unix/Linux environments and shell scripting.
Additional Information:
- The candidate should have a minimum of 5 years of experience in Informatica PowerCenter.
- The ideal candidate will possess a strong educational background in computer science or a related field, along with a proven track record of delivering high-quality ETL solutions.
- This position is based at our Bengaluru office.
Posted 3 months ago
5 - 9 years
10 - 15 Lacs
Bengaluru
Work from Office
Practice Overview:
Skill/Operating Group: Technology Consulting
Level: Consultant
Location: Gurgaon/Mumbai/Bangalore/Pune/Hyderabad/Chennai/Kolkata
Travel Percentage: Expected travel could be anywhere between 0-100%
Why Technology Consulting:
The Technology Consulting business within Capability Network invents the future for clients by providing them with the right guidance, design thinking, and innovative solutions for technological transformation. As technology rapidly evolves, it's more important than ever to have an innovation advisor who can create a new vision or put one into place to solve the client's toughest business problems. Specialize in management or technology consulting to transform the world's leading organizations by building innovative business solutions as their trusted advisor by:
- Helping Clients: Rethinking IT and digital ecosystems and innovating to help clients keep pace with fast-changing, customer-driven environments.
- Enhancing your Skillset: Building expertise and innovating with leading-edge technologies such as Blockchain, Artificial Intelligence, and Cloud.
- Transforming Businesses: Developing customized, next-generation products and services that help clients shift to new business models designed for today's connected landscape of disruptive technologies.
Principal Duties and Responsibilities:
Working closely with our clients, Consulting professionals design, build, and implement strategies that can help enhance business performance. They develop specialized expertise - strategic, industry, functional, technical - in a diverse project environment that offers multiple opportunities for career growth. The opportunities to make a difference within exciting client initiatives are limitless in this ever-changing business landscape. Here are just a few of your day-to-day responsibilities:
- Identifying, assessing, and solving complex business problems for your area of responsibility, where analysis of situations or data requires an in-depth evaluation of variable factors.
- Overseeing the production and implementation of solutions covering multiple cloud technologies, associated infrastructure/application architecture, development, and operating models.
- Applying a solid understanding of data, data on cloud, and disruptive technologies.
- Driving enterprise business, application, and integration architecture.
- Helping solve key business problems and challenges by enabling an architecture transformation, painting a picture of, and charting a journey from, the current state to a to-be enterprise environment.
- Leading business development activities and assisting the team to achieve sales targets.
- Participating in client presentations and orals for proposal defense.
- Implementing programs/interventions that prepare the organization for implementation of new business processes.
- Assisting our clients to build the required capabilities for growth and innovation to sustain high performance.
- Managing multi-disciplinary teams to shape, sell, communicate, and implement programs.
- Effectively communicating the target state, architecture, and topology on cloud to clients.
- Demonstrating a deep understanding of data and analytics platforms, data integration with cloud, and industry best practices in data governance and management.
- Providing thought leadership to downstream teams for developing offerings and assets.
- Mentoring and developing our people.
Qualifications:
- Bachelor's degree; MBA from a Tier-1 college (preferable).
- Minimum 5-9 years of large-scale consulting experience and/or experience working with high-tech companies in data architecture, data governance, information security, and information management.
- Certified in DAMA (Data Management), Azure Data Architecture, Google Cloud Data Analytics, or AWS Data Analytics.
Experience:
We are looking for experienced professionals with information strategy, data architecture, data modernization, data governance, data management, data operating model, and data security experience across all stages of the innovation spectrum, with a remit to build the future in real time. The candidate should have practical industry expertise in one of these areas: Financial Services, Retail, Consumer Goods, Telecommunications, Life Sciences, Transportation, Hospitality, Automotive/Industrial, or Mining and Resources.
Key Competencies and Skills:
The right candidate should have competency and skills aligned to one or more of these archetypes:
- Data SME - Experience in deal shaping and strong presentation skills, proposal leadership, customer orals; technical understanding of data platforms, data on cloud strategy, data strategy, data operating model, change management of data transformation programs, and data modeling skills.
- Data on Cloud Architect - Technical understanding of data platform strategy for data-on-cloud migrations and big data technologies; experience in architecting large-scale data lake and DW on cloud solutions. Experience in one or more technologies in this space: AWS, Azure, GCP, AliCloud, Snowflake, Hadoop, Cloudera.
- Data Strategy - Data capability maturity assessment, data and analytics/AI strategy, data operating model and governance, data hub enablement, data on cloud strategy, data architecture strategy.
- Data Transformation Lead - Understanding of the data supply chain and data platforms on cloud, experience in conducting alignment workshops, building value-realization frameworks for data transformations, and program management experience.
- MDM/DQ/DG Architect - Data governance and management SME for areas including data quality, MDM, metadata, data lineage, and data catalog. Experience in one or more technologies in this space: Collibra, Talend, Informatica, SAP MDG, Stibo, Alteryx, Alation, etc.
- Exceptional interpersonal and presentation skills, with the ability to convey technology and business value propositions to senior stakeholders.
- Capacity to develop high-impact thought leadership that articulates a forward-thinking view of the market.
Other desired skills:
- Strong desire to work in technology-driven business transformation.
- Strong knowledge of technology trends across IT and digital and how they can be applied to companies to address real-world problems and opportunities.
- Comfort conveying both high-level and detailed information, adjusting the way ideas are presented to better address varying social styles and audiences.
- Leading proof-of-concept and/or pilot implementations and defining the plan to scale implementations across multiple technology domains.
- Flexibility to accommodate client travel requirements.
- Published thought leadership (whitepapers, POVs).
Additional Information: Accenture is an equal opportunities employer and welcomes applications from all sections of society and does not discriminate on grounds of race, religion or belief, ethnic or national origin, disability, age, citizenship, marital, domestic, or civil partnership status, sexual orientation, gender identity, or any other basis as protected by applicable law.
Posted 3 months ago
3 - 8 years
5 - 10 Lacs
Pune
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Microsoft Azure Data Services
Good to have skills: NA
Minimum 3 year(s) of experience is required.
Educational Qualification: NA
Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements using Microsoft Azure Data Services. Your typical day will involve working with Azure Data Factory, Azure Databricks, and other Azure services to develop and deploy data pipelines and data processing solutions.
Roles & Responsibilities:
- Design, develop, and deploy data pipelines and data processing solutions using Microsoft Azure Data Services, including Azure Data Factory, Azure Databricks, and other Azure services.
- Collaborate with cross-functional teams to understand business requirements and design solutions that meet those requirements.
- Develop and maintain technical documentation, including design documents, data flow diagrams, and data models.
- Troubleshoot and resolve issues related to data pipelines and data processing solutions, working closely with other members of the team and with external vendors as needed.
Professional & Technical Skills:
- Must have: Experience with Microsoft Azure Data Services, including Azure Data Factory and Azure Databricks.
- Good to have: Experience with other Azure services, such as Azure Synapse Analytics and Azure Stream Analytics.
- Strong understanding of data modeling and data warehousing concepts.
- Experience with SQL and NoSQL databases, such as Azure SQL Database and Cosmos DB.
- Experience with data integration and ETL tools, such as Informatica and Talend.
Additional Information:
- The candidate should have a minimum of 3 years of experience in Microsoft Azure Data Services.
- The ideal candidate will possess a strong educational background in computer science, information technology, or a related field, along with a proven track record of delivering data processing solutions.
- This position is based at our Pune office.
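As a rough illustration of the Azure Data Factory / Databricks pipeline work this role describes, the sketch below shows a PySpark-style curation step that a Databricks job might run when triggered from a Data Factory pipeline. Storage paths, container names, and columns are placeholder assumptions, not details from the listing.

```python
# Sketch of a Databricks-style transformation step that an Azure Data Factory
# pipeline might invoke: read raw files, apply light transforms, write curated
# output. Storage paths and column names are illustrative placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("curate-orders").getOrCreate()

raw = (
    spark.read.option("header", "true")
    .csv("abfss://raw@examplestorage.dfs.core.windows.net/orders/")  # placeholder path
)

curated = (
    raw.withColumn("order_ts", F.to_timestamp("order_ts"))
    .withColumn("order_date", F.to_date("order_ts"))
    .withColumn("amount", F.col("amount").cast("double"))
    .filter(F.col("amount").isNotNull())
    .dropDuplicates(["order_id"])
)

(
    curated.write.mode("overwrite")
    .partitionBy("order_date")
    .parquet("abfss://curated@examplestorage.dfs.core.windows.net/orders/")  # placeholder path
)
```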
Posted 3 months ago
3 - 8 years
5 - 10 Lacs
Coimbatore
Work from Office
Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must have skills: SAP SuccessFactors Onboarding
Good to have skills: SAP SuccessFactors Employee Central, SAP SuccessFactors Recruiting
Minimum 3 year(s) of experience is required.
Educational Qualification: 15 years full time education
Summary: As a Data Engineer, you will design, develop, and maintain data solutions for data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across systems. You will play a crucial role in managing and optimizing data infrastructure to support business needs and drive data-driven decision-making.
Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation/contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Design and develop data solutions for data generation, collection, and processing.
- Create and maintain data pipelines to ensure efficient data flow.
- Implement ETL (extract, transform, and load) processes to migrate and deploy data across systems.
- Ensure data quality and integrity by performing data validation and cleansing.
- Collaborate with cross-functional teams to understand data requirements and provide technical expertise.
- Optimize data infrastructure and performance to support business needs.
- Troubleshoot and resolve data-related issues in a timely manner.
Professional & Technical Skills:
- Must have: Proficiency in SAP SuccessFactors Onboarding.
- Good to have: Experience with SAP SuccessFactors Employee Central and SAP SuccessFactors Recruiting.
- Strong understanding of data engineering principles and best practices.
- Experience with data modeling and database design.
- Proficient in SQL and scripting languages like Python or R.
- Familiarity with cloud platforms and technologies such as AWS or Azure.
- Knowledge of data integration and ETL tools like Informatica or Talend.
- Experience with data visualization tools such as Tableau or Power BI.
Additional Information:
- The candidate should have a minimum of 3 years of experience in SAP SuccessFactors Onboarding.
- This position is based in Coimbatore.
- A 15 years full-time education is required.
Posted 3 months ago
5 - 10 years
10 - 12 Lacs
Pune
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Databricks Unified Data Analytics Platform
Good to have skills: PySpark, Apache Spark, Talend ETL
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years full time education
Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. You will collaborate with teams, make team decisions, and provide solutions to problems. Engaging with multiple teams, you will contribute to key decisions and provide solutions for your immediate team and across multiple teams. In this role, you will have the opportunity to showcase your creativity and technical expertise in designing and building applications.
Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Design, build, and configure applications to meet business process and application requirements.
- Collaborate with cross-functional teams to gather and define application requirements.
- Develop and implement software solutions using the Databricks Unified Data Analytics Platform.
- Perform code reviews and ensure adherence to coding standards.
- Troubleshoot and debug applications to identify and resolve issues.
- Optimize application performance and ensure scalability.
- Document technical specifications and user manuals for applications.
- Stay updated with emerging technologies and industry trends.
- Train and mentor junior developers to enhance their technical skills.
Professional & Technical Skills:
- Must have: Proficiency in the Databricks Unified Data Analytics Platform.
- Good to have: Experience with PySpark, Apache Spark, and Talend ETL.
- Strong understanding of statistical analysis and machine learning algorithms.
- Experience with data visualization tools such as Tableau or Power BI.
- Hands-on experience implementing machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering.
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization, to ensure data quality and integrity.
Additional Information:
- The candidate should have a minimum of 5 years of experience in the Databricks Unified Data Analytics Platform.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.
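The data munging skills mentioned above (cleaning, transformation, normalization) might look roughly like the PySpark sketch below; the column names, deduplication key, imputation, and normalization choices are illustrative assumptions only.

```python
# A small data-munging sketch of the kind this role describes: cleaning,
# transforming, and normalizing a raw DataFrame with PySpark on Databricks.
# Column names and rules are assumptions for illustration only.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("munging-demo").getOrCreate()

raw = spark.createDataFrame(
    [("  Alice ", "72", None), ("Bob", None, "IN"), ("Bob", "91", "in")],
    ["name", "score", "country"],
)

clean = (
    raw.withColumn("name", F.trim("name"))
    .withColumn("score", F.col("score").cast("double"))
    .withColumn("country", F.upper(F.coalesce("country", F.lit("UNKNOWN"))))
    .dropDuplicates(["name"])          # naive dedup on the assumed business key
    .fillna({"score": 0.0})            # simple imputation for missing scores
)

# Min-max normalization of the score column, guarding against a zero range.
stats = clean.agg(F.min("score").alias("lo"), F.max("score").alias("hi")).first()
rng = (stats["hi"] - stats["lo"]) or 1.0
normalized = clean.withColumn("score_norm", (F.col("score") - F.lit(stats["lo"])) / F.lit(rng))
normalized.show()
```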
Posted 3 months ago
5 - 10 years
7 - 12 Lacs
Pune
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Data Warehouse ETL Testing
Good to have skills: NA
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years full time education
Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. Your day will involve creating innovative solutions to address business needs and ensuring seamless application functionality.
Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Lead and mentor junior professionals.
- Implement best practices for application development.
- Conduct code reviews and ensure code quality.
Professional & Technical Skills:
- Must have: Proficiency in Data Warehouse ETL Testing.
- Strong understanding of ETL processes.
- Experience with data modeling and database design.
- Knowledge of SQL and scripting languages.
- Hands-on experience with ETL tools like Informatica or Talend.
Additional Information:
- The candidate should have a minimum of 5 years of experience in Data Warehouse ETL Testing.
- This position is based at our Pune office.
- A 15 years full-time education is required.
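Data warehouse ETL testing of the kind this role covers is commonly automated as reconciliation checks between source and target. A minimal pytest-style sketch follows, with sqlite3 standing in for the real databases and the table and columns chosen purely for illustration.

```python
# A minimal ETL reconciliation test of the kind this role automates, written
# as a pytest check against two tables. sqlite3 stands in for the real source
# and target warehouses; table and column names are placeholders.
import sqlite3
import pytest

@pytest.fixture
def connections():
    src, tgt = sqlite3.connect(":memory:"), sqlite3.connect(":memory:")
    for conn in (src, tgt):
        conn.execute("CREATE TABLE sales (id INTEGER, amount REAL)")
        conn.executemany("INSERT INTO sales VALUES (?, ?)", [(1, 10.0), (2, 25.5)])
    return src, tgt

def profile(conn):
    # Row count plus a summed measure is a cheap completeness/accuracy check.
    return conn.execute("SELECT COUNT(*), ROUND(SUM(amount), 2) FROM sales").fetchone()

def test_sales_reconciles(connections):
    src, tgt = connections
    assert profile(src) == profile(tgt), "source and target sales do not reconcile"
```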
Posted 3 months ago
10 - 14 years
35 - 40 Lacs
Chennai, Pune, Kolkata
Work from Office
Azure Expertise: The architect should have experience architecting large-scale analytics solutions using native services such as Azure Synapse, Data Lake, Data Factory, HDInsight, Databricks, Azure Cognitive Services, Azure ML, and Azure Event Hub.
- Work with customers, end users, technical architects, and application designers to define the data requirements and data structure for BI/analytics solutions.
- Design conceptual and logical models for the data lake, data warehouse, data mart, and semantic layer (data structure, storage, and integration).
- Lead the database analysis, design, and build effort.
- Expert in big data technologies on Azure/GCP.
- Experience with ETL platforms like ADF, Glue, Ab Initio, Informatica, Talend, or Airflow.
- Experience in data visualization tools like Tableau, Power BI, etc.
- 3+ years of experience in a data engineering, metadata management, database modelling and development role.
- Strong experience in handling streaming data with Kafka.
- Understanding of data APIs and web services.
- Experience in data security, data archiving/backup, and encryption, and in defining the standard processes for the same.
- Experience in setting up DataOps and MLOps.
- Communicate physical database designs to the lead data architect / database administrator.
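For the streaming-data requirement above, ingestion is often written along the following lines. The sketch below uses the third-party kafka-python package; the topic, brokers, consumer group, and event schema are assumptions, not details from the listing.

```python
# Sketch of the kind of streaming ingestion the listing mentions: consuming
# events from Kafka and routing validated records onward. Uses the third-party
# kafka-python package; topic, brokers, and schema are placeholder assumptions.
import json
from kafka import KafkaConsumer  # pip install kafka-python

consumer = KafkaConsumer(
    "orders",                                  # placeholder topic name
    bootstrap_servers=["localhost:9092"],      # placeholder broker
    group_id="analytics-ingest",
    auto_offset_reset="earliest",
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
)

for message in consumer:
    event = message.value
    # Minimal validation before the event would be written to the lake/warehouse.
    if "order_id" not in event or event.get("amount") is None:
        continue  # in practice, route to a dead-letter topic instead of dropping
    print(f"partition={message.partition} offset={message.offset} order={event['order_id']}")
```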
Posted 3 months ago
6 - 10 years
13 - 17 Lacs
Pune
Work from Office
About Persistent
We are a trusted Digital Engineering and Enterprise Modernization partner, combining deep technical expertise and industry experience to help our clients anticipate what's next. Our offerings and proven solutions create a unique competitive advantage for our clients by giving them the power to see beyond and rise above. We work with many industry-leading organizations across the world, including 12 of the 30 most innovative US companies, 80% of the largest banks in the US and India, and numerous innovators across the healthcare ecosystem. Our growth trajectory continues, as we reported $1,231M annual revenue (16% Y-o-Y). Along with our growth, we've onboarded over 4,900 new employees in the past year, bringing our total employee count to over 23,500 people located in 19 countries across the globe. Persistent Ltd. is dedicated to fostering diversity and inclusion in the workplace. We invite applications from all qualified individuals, including those with disabilities, and regardless of gender or gender preference. We welcome diverse candidates from all backgrounds. For more details, please visit www.persistent.com.
About the Position
We are looking for a Big Data Lead who will be responsible for the management of data sets that are too big for traditional database systems to handle. You will create, design, and implement data processing jobs in order to transform the data into a more usable format. You will also ensure that the data is secure and complies with industry standards to protect the company's information.
What You'll Do
- Manage the customer's priorities for projects and requests.
- Assess customer needs using a structured requirements process (gathering, analyzing, documenting, and managing changes) to prioritize immediate business needs and advise on options, risks, and cost.
- Design and implement software products (Big Data related), including data models and visualizations.
- Demonstrate participation with the teams you work in.
- Deliver good solutions against tight timescales.
- Be proactive, suggest new approaches, and develop your capabilities.
- Share what you are good at while learning from others to improve the team overall.
- Show that you have a good level of understanding across a number of technical skills, attitudes, and behaviors.
- Deliver great solutions.
- Stay focused on driving value back into the business.
Expertise You'll Bring
- 6 years' experience in designing and developing enterprise application solutions for distributed systems.
- Understanding of Big Data Hadoop ecosystem components (Sqoop, Hive, Pig, Flume).
- Additional experience working with Hadoop, HDFS, cluster management, Hive, Pig and MapReduce, and Hadoop ecosystem frameworks: HBase, Talend, NoSQL databases.
- Apache Spark or other streaming Big Data processing preferred; Java or Big Data technologies will be a plus.
Benefits
- Competitive salary and benefits package.
- Culture focused on talent development with quarterly promotion cycles and company-sponsored higher education and certifications.
- Opportunity to work with cutting-edge technologies.
- Employee engagement initiatives such as project parties, flexible work hours, and Long Service awards.
- Annual health check-ups.
- Insurance coverage: group term life, personal accident, and Mediclaim hospitalization for self, spouse, two children, and parents.
Inclusive Environment
- We offer hybrid work options and flexible working hours to accommodate various needs and preferences.
- Our office is equipped with accessible facilities, including adjustable workstations, ergonomic chairs, and assistive technologies to support employees with physical disabilities.
Let's unleash your full potential. See Beyond, Rise Above.
Posted 3 months ago
6 - 10 years
13 - 17 Lacs
Bengaluru
Work from Office
About Persistent
We are a trusted Digital Engineering and Enterprise Modernization partner, combining deep technical expertise and industry experience to help our clients anticipate what's next. Our offerings and proven solutions create a unique competitive advantage for our clients by giving them the power to see beyond and rise above. We work with many industry-leading organizations across the world, including 12 of the 30 most innovative US companies, 80% of the largest banks in the US and India, and numerous innovators across the healthcare ecosystem. Our growth trajectory continues, as we reported $1,231M annual revenue (16% Y-o-Y). Along with our growth, we've onboarded over 4,900 new employees in the past year, bringing our total employee count to over 23,500 people located in 19 countries across the globe. Persistent Ltd. is dedicated to fostering diversity and inclusion in the workplace. We invite applications from all qualified individuals, including those with disabilities, and regardless of gender or gender preference. We welcome diverse candidates from all backgrounds. For more details, please visit www.persistent.com.
About the Position
We are looking for a Big Data Lead who will be responsible for the management of data sets that are too big for traditional database systems to handle. You will create, design, and implement data processing jobs in order to transform the data into a more usable format. You will also ensure that the data is secure and complies with industry standards to protect the company's information.
What You'll Do
- Manage the customer's priorities for projects and requests.
- Assess customer needs using a structured requirements process (gathering, analyzing, documenting, and managing changes) to prioritize immediate business needs and advise on options, risks, and cost.
- Design and implement software products (Big Data related), including data models and visualizations.
- Demonstrate participation with the teams you work in.
- Deliver good solutions against tight timescales.
- Be proactive, suggest new approaches, and develop your capabilities.
- Share what you are good at while learning from others to improve the team overall.
- Show that you have a good level of understanding across a number of technical skills, attitudes, and behaviors.
- Deliver great solutions.
- Stay focused on driving value back into the business.
Expertise You'll Bring
- 6 years' experience in designing and developing enterprise application solutions for distributed systems.
- Understanding of Big Data Hadoop ecosystem components (Sqoop, Hive, Pig, Flume).
- Additional experience working with Hadoop, HDFS, cluster management, Hive, Pig and MapReduce, and Hadoop ecosystem frameworks: HBase, Talend, NoSQL databases.
- Apache Spark or other streaming Big Data processing preferred; Java or Big Data technologies will be a plus.
Benefits
- Competitive salary and benefits package.
- Culture focused on talent development with quarterly promotion cycles and company-sponsored higher education and certifications.
- Opportunity to work with cutting-edge technologies.
- Employee engagement initiatives such as project parties, flexible work hours, and Long Service awards.
- Annual health check-ups.
- Insurance coverage: group term life, personal accident, and Mediclaim hospitalization for self, spouse, two children, and parents.
Inclusive Environment
- We offer hybrid work options and flexible working hours to accommodate various needs and preferences.
- Our office is equipped with accessible facilities, including adjustable workstations, ergonomic chairs, and assistive technologies to support employees with physical disabilities.
Let's unleash your full potential. See Beyond, Rise Above.
Posted 3 months ago
6 - 10 years
13 - 17 Lacs
Hyderabad
Work from Office
About Persistent
We are a trusted Digital Engineering and Enterprise Modernization partner, combining deep technical expertise and industry experience to help our clients anticipate what's next. Our offerings and proven solutions create a unique competitive advantage for our clients by giving them the power to see beyond and rise above. We work with many industry-leading organizations across the world, including 12 of the 30 most innovative US companies, 80% of the largest banks in the US and India, and numerous innovators across the healthcare ecosystem. Our growth trajectory continues, as we reported $1,231M annual revenue (16% Y-o-Y). Along with our growth, we've onboarded over 4,900 new employees in the past year, bringing our total employee count to over 23,500 people located in 19 countries across the globe. Persistent Ltd. is dedicated to fostering diversity and inclusion in the workplace. We invite applications from all qualified individuals, including those with disabilities, and regardless of gender or gender preference. We welcome diverse candidates from all backgrounds. For more details, please visit www.persistent.com.
About the Position
We are looking for a Big Data Lead who will be responsible for the management of data sets that are too big for traditional database systems to handle. You will create, design, and implement data processing jobs in order to transform the data into a more usable format. You will also ensure that the data is secure and complies with industry standards to protect the company's information.
What You'll Do
- Manage the customer's priorities for projects and requests.
- Assess customer needs using a structured requirements process (gathering, analyzing, documenting, and managing changes) to prioritize immediate business needs and advise on options, risks, and cost.
- Design and implement software products (Big Data related), including data models and visualizations.
- Demonstrate participation with the teams you work in.
- Deliver good solutions against tight timescales.
- Be proactive, suggest new approaches, and develop your capabilities.
- Share what you are good at while learning from others to improve the team overall.
- Show that you have a good level of understanding across a number of technical skills, attitudes, and behaviors.
- Deliver great solutions.
- Stay focused on driving value back into the business.
Expertise You'll Bring
- 6 years' experience in designing and developing enterprise application solutions for distributed systems.
- Understanding of Big Data Hadoop ecosystem components (Sqoop, Hive, Pig, Flume).
- Additional experience working with Hadoop, HDFS, cluster management, Hive, Pig and MapReduce, and Hadoop ecosystem frameworks: HBase, Talend, NoSQL databases.
- Apache Spark or other streaming Big Data processing preferred; Java or Big Data technologies will be a plus.
Benefits
- Competitive salary and benefits package.
- Culture focused on talent development with quarterly promotion cycles and company-sponsored higher education and certifications.
- Opportunity to work with cutting-edge technologies.
- Employee engagement initiatives such as project parties, flexible work hours, and Long Service awards.
- Annual health check-ups.
- Insurance coverage: group term life, personal accident, and Mediclaim hospitalization for self, spouse, two children, and parents.
Inclusive Environment
- We offer hybrid work options and flexible working hours to accommodate various needs and preferences.
- Our office is equipped with accessible facilities, including adjustable workstations, ergonomic chairs, and assistive technologies to support employees with physical disabilities.
Let's unleash your full potential. See Beyond, Rise Above.
Posted 3 months ago
10.0 years
0 Lacs
Bengaluru, Karnataka
On-site
Location: Bangalore - Karnataka, India - EOIZ Industrial Area
Worker Type Reference: Regular - Permanent
Pay Rate Type: Salary
Career Level: T4(A)
Job ID: R-43320-2025
Description & Requirements
Grade 12 - 3 Nos. - Azure Data Architect
Key responsibilities of the role include:
Data Engineer
- Develop and implement data engineering projects, including data lakehouse or big data platforms.
- Knowledge of Azure Purview is a must.
- Knowledge of Azure Data Fabric.
- Ability to define a reference data architecture.
- Cloud-native data platform experience in the Microsoft data stack, including Azure Data Factory and Databricks on Azure.
- Knowledge of the latest data trends, including data fabric and data mesh.
- Robust knowledge of ETL, data transformation, and data standardization approaches.
- Key contributor to the growth of the COE, influencing client revenues through data and analytics solutions.
- Lead the selection, deployment, and management of data tools, platforms, and infrastructure.
- Ability to technically guide a team of data engineers.
- Oversee the design, development, and deployment of data solutions.
- Define, differentiate, and strategize new data services/offerings and create reference architecture assets.
- Drive partnerships with vendors on collaboration, capability building, go-to-market strategies, etc.
- Guide and inspire the organization about the business potential and opportunities around data.
- Network with domain experts.
- Collaborate with client teams to understand their business challenges and needs.
- Develop and propose data solutions tailored to client-specific requirements.
- Influence client revenues through innovative solutions and thought leadership.
- Lead client engagements from project initiation to deployment.
- Build and maintain strong relationships with key clients and stakeholders.
- Build reusable methodologies, pipelines, and models.
- Create data pipelines for more efficient and repeatable data science projects.
- Design and implement data architecture solutions that support business requirements and meet organizational needs.
- Collaborate with stakeholders to identify data requirements and develop data models and data flow diagrams.
- Work with cross-functional teams to ensure that data is integrated, transformed, and loaded effectively across different platforms and systems.
- Develop and implement data governance policies and procedures to ensure that data is managed securely and efficiently.
- Develop and maintain a deep understanding of data platforms, technologies, and tools, and evaluate new technologies and solutions to improve data management processes.
- Ensure compliance with regulatory and industry standards for data management and security.
- Develop and maintain data models, data warehouses, data lakes, and data marts to support data analysis and reporting.
- Ensure data quality, accuracy, and consistency across all data sources.
- Knowledge of ETL and data integration tools such as Informatica, Qlik Talend, and Apache NiFi.
- Experience with data modeling and design tools such as ERwin, PowerDesigner, or ER/Studio.
- Knowledge of data governance, data quality, and data security best practices.
- Experience with cloud computing platforms such as AWS, Azure, or Google Cloud Platform.
- Familiarity with programming languages such as Python, Java, or Scala.
- Experience with data visualization tools such as Tableau, Power BI, or QlikView.
- Understanding of analytics and machine learning concepts and tools.
- Knowledge of project management methodologies and tools to manage and deliver complex data projects.
- Skilled in using relational database technologies such as MySQL, PostgreSQL, and Oracle, as well as NoSQL databases such as MongoDB and Cassandra.
- Strong expertise in cloud-based data services such as Azure Data Lake, Synapse, Azure Data Factory, AWS Glue, AWS Redshift, and Azure SQL.
- Knowledge of big data technologies such as Hadoop, Spark, Snowflake, Databricks, and Kafka to process and analyze large volumes of data.
- Proficient in data integration techniques to combine data from various sources into a centralized location.
- Strong data modeling, data warehousing, and data integration skills.
People & Interpersonal Skills
- Build and manage a high-performing team of data engineers and other specialists.
- Foster a culture of innovation and collaboration within the data team and across the organization.
- Demonstrate the ability to work in diverse, cross-functional teams in a dynamic business environment.
- Candidates should be confident, energetic self-starters with strong communication skills.
- Candidates should exhibit superior presentation skills and the ability to present compelling solutions that guide and inspire.
- Provide technical guidance and mentorship to the data team.
- Collaborate with other stakeholders across the company to align the vision and goals.
- Communicate and present data capabilities and achievements to clients and partners.
- Stay updated on the latest trends and developments in the data domain.
What is required for the role?
- 10+ years of experience in the information technology industry with a strong focus on data engineering and architecture, preferably as an Azure data engineering lead.
- 8+ years of data engineering or data architecture experience in successfully launching, planning, and executing advanced data projects.
- Data governance experience is mandatory.
- MS Fabric certified.
- Experience working on RFPs/proposals, presales activities, business development, and overseeing delivery of data projects is highly desired.
- A master's or bachelor's degree in computer science, data science, information systems, operations research, statistics, applied mathematics, economics, engineering, or physics.
- The candidate should have demonstrated the ability to manage data projects and diverse teams, and should have experience in creating data and analytics solutions.
- Experience in building data solutions in one or more domains: Industrial, Healthcare, Retail, Communication.
- Problem-solving, communication, and collaboration skills.
- Good knowledge of data visualization and reporting tools.
- Ability to normalize and standardize data as per key KPIs and metrics.
Important Notice: Recruitment Scams
Please be aware that HARMAN recruiters will always communicate with you from an '@harman.com' email address. We will never ask for payments, banking, credit card, personal financial information or access to your LinkedIn/email account during the screening, interview, or recruitment process. If you are asked for such information or receive communication from an email address not ending in '@harman.com' about a job with HARMAN, please cease communication immediately and report the incident to us through: harmancareers@harman.com. HARMAN is proud to be an Equal Opportunity / Affirmative Action employer.
All qualified applicants will receive consideration for employment without regard to race, religion, color, national origin, gender (including pregnancy, childbirth, or related medical conditions), sexual orientation, gender identity, gender expression, age, status as a protected veteran, status as an individual with a disability, or other applicable legally protected characteristics.
Posted 3 months ago
2 - 7 years
19 - 25 Lacs
Hyderabad
Hybrid
Role & responsibilities
The current opening is for the Hyderabad location only. The final round is in person. If interested, share your profile with your current CTC, expected CTC, and notice period.
- Talend Experience: 3-5 years of experience as a Talend developer.
- Snowflake Experience: 1-3 years of experience working with the Snowflake data platform.
- ETL Knowledge: Proficient in ETL processes and data integration techniques.
- SQL Skills: Strong SQL skills for database querying and management.
- Cloud Data Solutions: Experience in cloud data solutions and architecture.
Preferred candidate profile
Perks and benefits
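A rough sketch of the Talend-to-Snowflake pattern this opening implies is shown below: staged data merged into a target table through the snowflake-connector-python package. The account, credentials, and table names are placeholders, and in a real Talend job the load would typically run through Talend's Snowflake components rather than hand-written Python.

```python
# Hedged sketch: merge a Snowflake staging table into its target. Account,
# credentials, database, and table names are placeholder assumptions.
import snowflake.connector  # pip install snowflake-connector-python

MERGE_SQL = """
MERGE INTO analytics.public.customers AS tgt
USING analytics.staging.customers AS src
    ON tgt.customer_id = src.customer_id
WHEN MATCHED THEN UPDATE SET tgt.email = src.email, tgt.updated_at = src.updated_at
WHEN NOT MATCHED THEN INSERT (customer_id, email, updated_at)
    VALUES (src.customer_id, src.email, src.updated_at)
"""

def upsert_customers():
    conn = snowflake.connector.connect(
        account="xy12345.ap-south-1",   # placeholder account identifier
        user="ETL_USER",
        password="***",
        warehouse="LOAD_WH",
        database="ANALYTICS",
        schema="PUBLIC",
    )
    try:
        cur = conn.cursor()
        cur.execute(MERGE_SQL)
        print("rows affected:", cur.rowcount)
        cur.close()
    finally:
        conn.close()
```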
Posted 3 months ago
3 - 7 years
14 - 18 Lacs
Chennai
Work from Office
- Develop and maintain system interfaces and APIs.
- Ensure seamless integration between software applications.
- Work with RESTful and SOAP APIs for data exchange.
- Optimize data transmission processes for performance and reliability.
- Troubleshoot and resolve interface issues.
Required Skills:
- Proficiency in API development (REST, SOAP, GraphQL).
- Strong experience in Java, Python, .NET, or Node.js.
- Hands-on experience with integration platforms (MuleSoft, Boomi, Talend, etc.).
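As a small illustration of the REST-style interface work described above, a minimal Flask sketch follows; the endpoints and payload shape are assumptions chosen for demonstration, not part of the listing.

```python
# A minimal REST interface sketch for system-to-system data exchange using
# Flask. Endpoint paths and the payload shape are illustrative assumptions.
from flask import Flask, jsonify, request

app = Flask(__name__)
ORDERS = {}  # stand-in for the downstream system of record

@app.post("/api/orders")
def create_order():
    payload = request.get_json(silent=True) or {}
    if "order_id" not in payload:
        return jsonify(error="order_id is required"), 400
    ORDERS[payload["order_id"]] = payload
    return jsonify(status="accepted", order_id=payload["order_id"]), 201

@app.get("/api/orders/<order_id>")
def get_order(order_id):
    order = ORDERS.get(order_id)
    if order is None:
        return jsonify(error="not found"), 404
    return jsonify(order)

if __name__ == "__main__":
    app.run(port=8080)
```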
Posted 3 months ago
5 - 10 years
7 - 12 Lacs
Chennai
Work from Office
Project Role: Analytics and Modeling Lead
Project Role Description: Lead the effort to gather, analyze and model client data (customers, financials, operational, organizational, access channel), key performance indicators, and/or market data (competitors, products, suppliers), using a broad set of analytical tools and techniques to develop quantitative and qualitative business insights and improve decision-making.
Must have skills: Talend ETL
Good to have skills: NA
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years of full-time education or above
Summary: As an Analytics and Modeling Lead, you will lead the effort to gather, analyze, and model client data, key performance indicators, and market data using analytical tools to develop business insights and improve decision-making.
Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Lead data architecture design and implementation.
- Develop ETL processes using the Talend ETL tool.
- Ensure data quality and integrity across systems.
Professional & Technical Skills:
- Must have: Proficiency in Talend ETL.
- Strong understanding of data architecture principles.
- Experience in data modeling and database design.
- Knowledge of SQL and database management systems.
- Hands-on experience with data integration and transformation.
- Familiarity with data warehousing concepts.
Additional Information:
- The candidate should have a minimum of 5 years of experience in Talend ETL.
- This position is based at our Chennai office.
- 15 years of full-time education or above is required.
Posted 3 months ago
4 - 8 years
7 - 17 Lacs
Hyderabad
Work from Office
In this role, you will:
- Lead moderately complex initiatives and deliverables within technical domain environments.
- Contribute to large-scale planning of strategies.
- Design, code, test, debug, and document projects and programs associated with the technology domain, including upgrades and deployments.
- Review moderately complex technical challenges that require an in-depth evaluation of technologies and procedures.
- Resolve moderately complex issues and lead a team to meet existing client needs or potential new clients' needs while leveraging a solid understanding of the function, policies, procedures, or compliance requirements.
- Collaborate and consult with peers, colleagues, and mid-level managers to resolve technical challenges and achieve goals.
- Lead projects and act as an escalation point; provide guidance and direction to less experienced staff.
Required Qualifications:
- 4+ years of Software Engineering experience, or equivalent demonstrated through one or a combination of the following: work experience, training, military experience, education.
Desired Qualifications (Knowledge/Skills/Ability):
- Knowledge of large-scale development projects, including data migrations and third-party vendor integrations.
- Knowledge and understanding of data technology.
- 4+ years of ETL (Extract, Transform, Load) programming experience with Ab Initio and Informatica.
- 2+ years of experience in Gen AI/ML and relevant technologies.
- 4+ years of ETL, data warehouse, and data analytics delivery experience on internal or external cloud platforms.
- 4+ years of experience with databases such as Oracle, DB2, SQL Server, or Teradata.
- 4+ years of Informatica experience.
- 4+ years of Oracle experience.
- 4+ years of Autosys experience.
- 4+ years of UNIX experience.
- 4+ years of hands-on experience with any ETL tool (Informatica, Ab Initio, Talend, etc.).
- Good experience in creating performant designs using Mapplets, Mappings, Workflows, and Worklets for Data Quality (cleansing) ETL jobs.
- Knowledge and understanding of Big Data and the Hadoop framework.
Other Desired Skills:
- The applicant must have strong communication, customer service, troubleshooting, and organizational skills, and the ability to mentor juniors.
- Ability to identify and apply best practices, standards, and SDLC concepts to all functional areas.
- Good understanding of Agile methodology.
Job Expectations:
As a Developer, you will be responsible for hands-on coding to meet the objectives of your project assignments. You will ensure delivery of the project roadmap while producing sustainable, reusable, well-tested code that meets our internal Agile software development standards. You must effectively manage the engineering function for your assigned project to ensure it is successfully delivered to production. You must foster strong working relationships with team members and internal business partners to effectively manage priorities and ensure the success of your projects. To be successful in this position, you must be a self-starter with the ability to take ownership and handle multiple tasks simultaneously. You have good interpersonal communication skills, are team-oriented, and thrive in a fast-paced environment.
An ideal candidate will:
- Lead an agile engineering team supporting technology solutions development across multiple product or capability domains.
- Partner with business product managers, lead systems architects, and senior engineers to develop product functional and technical strategy for the domain(s), including development of actionable short- and long-term product roadmaps and shaping prioritized features.
- Oversee engineering teams to deliver commitments aligned to strategic product priorities.
- Mentor and guide the professional and technical development of senior and lower-level engineers, and assist in hiring top engineering talent.
- Collaborate within and across agile teams to design, test, implement, and support technical solutions using full-stack development tools and methodologies.
- Ensure the craftsmanship, security, availability, resilience, and scalability of solutions developed by the teams or third-party providers.
- Support implementation of features spanning multiple teams for multiple product or capability domains.
- Partner with product managers to drive business outcomes.
- Ensure compliance and risk management requirements for the supported area are met.
- Ensure that key areas of technology risk, including security, stability, and scalability, are addressed in products and capabilities within the domain(s).
- Interface with third-party vendors and technology service providers.
- May speak at conferences as a subject matter expert.
- Lead a team of individual contributor engineers and/or lower-level engineers.
Posted 3 months ago
4 - 7 years
6 - 9 Lacs
Bengaluru
Work from Office
Job Responsibilities
- Assist in the design and implementation of a Snowflake-based analytics solution (data lake and data warehouse) on AWS.
- Contribute to requirements definition, source data analysis and profiling, the logical and physical design of the data lake and data warehouse, and the design of data integration and publication pipelines.
- Develop Snowflake deployment and usage best practices.
- Help educate the rest of the team on the capabilities and limitations of Snowflake.
- Build and maintain data pipelines adhering to suggested enterprise architecture principles and guidelines.
- Design, build, test, and maintain data management systems.
- Work in sync with internal and external team members such as data architects, data scientists, and data analysts to handle technical issues.
- Act as a technical leader within the team.
- Work in an Agile/Lean model.
- Deliver quality deliverables on time.
- Translate complex functional requirements into technical solutions.
EXPERTISE AND QUALIFICATIONS
Essential Skills, Education and Experience
- A B.E., B.Tech., MCA, or equivalent degree along with 4-7 years of experience in data engineering.
- Strong experience in DBT concepts such as model building and configuration, incremental load strategies, macros, and DBT tests.
- Strong experience in SQL.
- Strong experience in AWS.
- Creation and maintenance of an optimal data pipeline architecture for ingestion and processing of data.
- Creation of the necessary infrastructure for ETL jobs from a wide range of data sources using Talend, DBT, S3, and Snowflake.
- Experience in data storage technologies such as Amazon S3, SQL, and NoSQL.
- Data modeling technical awareness.
- Experience working with stakeholders in different time zones.
Good to have
- AWS data services development experience.
- Working knowledge of big data technologies.
- Experience collaborating with data quality and data governance teams.
- Exposure to reporting tools like Tableau.
- Apache Airflow, Apache Kafka (nice to have).
- Payments domain knowledge: in-depth understanding of CRM, accounting, etc.
- Regulatory reporting exposure.
Other skills
- Good communication skills.
- Team player.
- Problem solver.
- Willing to learn new technologies, share your ideas, and assist other team members as needed.
- Strong analytical and problem-solving skills; ability to define problems, collect data, establish facts, and draw conclusions.
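Since the listing mentions Airflow alongside S3, DBT, and Snowflake, a hedged sketch of how such a pipeline is often orchestrated as an Airflow 2.x DAG follows; the task bodies are stubs and every identifier is a placeholder, not a reference to the employer's actual jobs.

```python
# Hedged orchestration sketch: an Airflow 2.x DAG chaining an S3 landing step,
# a Snowflake load, and a DBT transformation run. All names are placeholders
# and the task bodies are stubs for illustration only.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract_to_s3(**_):
    print("pull source data and land it in s3://example-bucket/raw/ (placeholder)")

def load_into_snowflake(**_):
    print("COPY INTO staging tables from the S3 landing area (placeholder)")

def run_dbt_models(**_):
    print("trigger `dbt build` for the transformation layer (placeholder)")

with DAG(
    dag_id="daily_payments_ingest",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_to_s3", python_callable=extract_to_s3)
    load = PythonOperator(task_id="load_into_snowflake", python_callable=load_into_snowflake)
    transform = PythonOperator(task_id="run_dbt_models", python_callable=run_dbt_models)
    extract >> load >> transform
```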
Posted 3 months ago
7 - 12 years
9 - 14 Lacs
Gurgaon
Work from Office
Project Role : Application Developer
Project Role Description : Design, build, and configure applications to meet business process and application requirements.
Must have skills : Talend Big Data
Good to have skills : NA
Minimum 7.5 year(s) of experience is required
Educational Qualification : Any Qualification
Summary : As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements using Talend Big Data. Your typical day will involve working with the development team, analyzing business requirements, and developing solutions to meet those requirements.
Roles & Responsibilities :
- Design, develop, and maintain Talend Big Data applications to meet business requirements.
- Collaborate with cross-functional teams to analyze business requirements and develop solutions to meet them.
- Develop and maintain technical documentation for Talend Big Data applications.
- Provide technical guidance and support to the development team.
- Stay updated with the latest advancements in Talend Big Data and related technologies.
Professional & Technical Skills :
- Must have: strong experience in Talend Big Data.
- Good to have: experience with Hadoop, Hive, and Spark.
- Strong understanding of data integration and ETL processes.
- Experience with SQL and NoSQL databases.
- Experience with version control systems such as Git.
- Solid grasp of software development best practices and methodologies.
Additional Information :
- The candidate should have a minimum of 7.5 years of experience in Talend Big Data.
- The ideal candidate will possess a strong educational background in computer science or a related field, along with a proven track record of delivering impactful data-driven solutions.
- This position is based at our Gurugram office.
Qualifications : Any Qualification
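Talend Big Data jobs are assembled in a graphical designer rather than hand-coded, but the extract-transform-load pattern at the heart of this role can be illustrated with a small, self-contained Python sketch using only the standard library. The file name, table, and cleansing rule are hypothetical.

```python
# Illustrative extract-transform-load pattern (stdlib only); Talend jobs implement the
# same pattern through graphical components rather than hand-written code.
import csv
import sqlite3

def extract(path: str):
    """Read raw rows from a CSV source file."""
    with open(path, newline="") as fh:
        yield from csv.DictReader(fh)

def transform(rows):
    """Cleanse and standardise rows before loading."""
    for row in rows:
        if not row.get("customer_id"):          # drop rows failing a basic quality rule
            continue
        email = row.get("email", "").strip().lower()
        yield (row["customer_id"], email)

def load(rows, db_path: str = "warehouse.db") -> None:
    """Load cleansed rows into a target table."""
    conn = sqlite3.connect(db_path)
    with conn:
        conn.execute(
            "CREATE TABLE IF NOT EXISTS customers (customer_id TEXT PRIMARY KEY, email TEXT)"
        )
        conn.executemany("INSERT OR REPLACE INTO customers VALUES (?, ?)", rows)
    conn.close()

if __name__ == "__main__":
    load(transform(extract("customers.csv")))   # hypothetical source file
```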
Posted 3 months ago
3 - 8 years
5 - 10 Lacs
Pune
Work from Office
Project Role : Quality Engineer (Tester)
Project Role Description : Enables full-stack solutions through multi-disciplinary team planning and ecosystem integration to accelerate delivery and drive quality across the application lifecycle. Performs continuous testing for security, API, and regression suites. Creates the automation strategy and automated scripts, and supports data and environment configuration. Participates in code reviews, monitors, and reports defects to support continuous improvement activities for the end-to-end testing process.
Must have skills : Data Warehouse ETL Testing
Good to have skills : Talend ETL
Minimum 3 year(s) of experience is required
Educational Qualification : Minimum 15 years of full-time education
Summary : As a Quality Engineer (Tester), you will enable full-stack solutions through multi-disciplinary team planning and ecosystem integration to accelerate delivery and drive quality across the application lifecycle. Your typical day will involve performing continuous testing for security, API, and regression suites, creating the automation strategy and automated scripts, supporting data and environment configuration, and participating in code reviews, monitoring, and reporting defects to support continuous improvement of the end-to-end testing process.
Roles & Responsibilities :
- Perform continuous testing for security, API, and regression suites.
- Create the automation strategy and automated scripts, and support data and environment configuration.
- Participate in code reviews, monitor, and report defects to support continuous improvement activities for the end-to-end testing process.
- Enable full-stack solutions through multi-disciplinary team planning and ecosystem integration to accelerate delivery and drive quality across the application lifecycle.
Professional & Technical Skills :
- Must have: experience in Data Warehouse ETL Testing.
- Good to have: experience in Talend ETL.
- Strong understanding of software testing methodologies and processes.
- Experience in test automation using tools such as Selenium, JMeter, or similar.
- Experience in SQL and database testing.
- Experience with Agile methodology and working in Agile teams.
Additional Information :
- The candidate should have a minimum of 3 years of experience in Data Warehouse ETL Testing.
- The ideal candidate will possess a strong educational background in computer science or a related field, along with a proven track record of delivering high-quality testing solutions.
- This position is based at our Pune office.
Qualifications : Minimum 15 years of full-time education
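A typical data warehouse ETL test in this role reconciles a target table against its source. The sketch below illustrates one simple form of that check, comparing row counts and a basic key checksum; sqlite3 is used only to keep the example self-contained, and the table and column names are hypothetical rather than taken from this posting.

```python
# Minimal sketch of a row-count and checksum reconciliation between a source and
# target table; sqlite3 stands in for the real source/target connections.
import sqlite3

def reconcile(conn, source_table: str, target_table: str, key_column: str) -> bool:
    """Return True when counts and a simple key checksum match between the two tables."""
    def counts_and_checksum(table: str):
        count = conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
        checksum = conn.execute(
            f"SELECT COALESCE(SUM(LENGTH({key_column})), 0) FROM {table}"
        ).fetchone()[0]
        return count, checksum

    src = counts_and_checksum(source_table)
    tgt = counts_and_checksum(target_table)
    if src != tgt:
        print(f"MISMATCH: source={src} target={tgt}")
        return False
    print(f"OK: {src[0]} rows reconciled")
    return True

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE src_orders (order_id TEXT);
        CREATE TABLE tgt_orders (order_id TEXT);
        INSERT INTO src_orders VALUES ('A1'), ('A2');
        INSERT INTO tgt_orders VALUES ('A1'), ('A2');
    """)
    reconcile(conn, "src_orders", "tgt_orders", "order_id")
```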
Posted 3 months ago
7 - 12 years
9 - 14 Lacs
Mumbai
Work from Office
Project Role : Software Development Lead
Project Role Description : Develop and configure software systems either end-to-end or for a specific stage of the product lifecycle. Apply knowledge of technologies, applications, methodologies, processes, and tools to support a client, project, or entity.
Must have skills : IBM InfoSphere DataStage
Good to have skills : NA
Minimum 7.5 year(s) of experience is required
Educational Qualification : 15 years of full-time education is required
Summary : As a Software Development Lead, you will be responsible for developing and configuring software systems using IBM InfoSphere DataStage. Your typical day will involve applying your knowledge of technologies, applications, methodologies, processes, and tools to support a client, project, or entity.
Roles & Responsibilities :
- Lead the development and configuration of software systems using IBM InfoSphere DataStage.
- Apply knowledge of technologies, applications, methodologies, processes, and tools to support a client, project, or entity.
- Collaborate with cross-functional teams to ensure the successful delivery of software systems.
- Provide technical guidance and mentorship to junior team members.
- Stay updated with the latest advancements in software development and apply innovative approaches for sustained competitive advantage.
Professional & Technical Skills :
- Must have: expertise in IBM InfoSphere DataStage.
- Good to have: experience with other ETL tools such as Talend, Informatica, or SSIS.
- Strong understanding of software development methodologies and processes.
- Experience with database technologies such as Oracle, SQL Server, or MySQL.
- Experience with scripting languages such as Python or Perl.
- Solid grasp of software testing and debugging techniques.
Additional Information :
- The candidate should have a minimum of 7.5 years of experience in IBM InfoSphere DataStage.
- The ideal candidate will possess a strong educational background in computer science or a related field, along with a proven track record of delivering impactful software solutions.
- This position is based at our Mumbai office.
Qualifications : 15 years of full-time education is required
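DataStage jobs themselves are developed in the InfoSphere designer, but leads often wrap job runs in small scripts for scheduling and auditing, which is where the Python or Perl scripting mentioned above tends to come in. The sketch below shows one possible Python wrapper around the dsjob command-line client; the project and job names are placeholders, and the exact dsjob invocation and flags should be verified against the installed DataStage client documentation before use.

```python
# Hedged sketch: invoking a DataStage job from Python via the dsjob CLI for
# scheduling/audit purposes. Names and the assumed invocation form are placeholders.
import subprocess
import sys

def run_datastage_job(project: str, job: str) -> int:
    """Invoke a DataStage job and return the process exit code."""
    cmd = ["dsjob", "-run", "-jobstatus", project, job]   # assumed invocation form
    result = subprocess.run(cmd, capture_output=True, text=True)
    print(result.stdout)
    if result.returncode != 0:
        print(result.stderr, file=sys.stderr)
    return result.returncode

if __name__ == "__main__":
    sys.exit(run_datastage_job("DW_PROJECT", "LoadCustomerDim"))   # placeholder names
```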
Posted 3 months ago
5 - 10 years
7 - 12 Lacs
Bengaluru
Work from Office
Project Role : Application Lead
Project Role Description : Lead the effort to design, build, and configure applications, acting as the primary point of contact.
Must have skills : Salesforce Marketing Cloud Customer Data Platform
Good to have skills : Salesforce Marketing Cloud
Minimum 5 year(s) of experience is required
Educational Qualification : 15 years of full-time education
Summary : As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. You will be responsible for managing the team and collaborating with multiple teams to make key decisions. Your typical day will involve guiding the application development process, implementing solutions to problems, and ensuring the successful delivery of projects.
Roles & Responsibilities :
- Understand the ins and outs of the Salesforce CDP platform.
- Execute technical feasibility assessments and project estimates for moving databases and data processing to Salesforce CDP.
- Plan, design, and build CIM schemas in Salesforce CDP to host imported data.
- Focus on data collection strategy, customer profile unification through the CDP, segmentation, personalization, and activation.
- Design and advocate solutions using modern cloud technologies, design principles, integration points, and automation methods.
- Share knowledge with customers and provide reviews, discussions, and prototypes.
- Participate in the overall engagement from strategy and assessment through migration and implementation.
- Work with customers to deploy, manage, and audit best practices for cloud products.
- Create quality software and data structures that meet the functional and non-functional requirements of marketing projects, producing application code on time, on budget, and in compliance with company implementation standards and practices as well as general industry and platform-specific best practices.
- Lead, design, develop, and deliver large-scale data systems, data processing, and data transformation projects.
Professional & Technical Skills :
- 3+ years of hands-on experience with marketing data.
- Minimum 1 year of implementation experience with one or more Customer Data Platforms.
- Minimum 3 years of hands-on SQL experience.
- Bachelor's degree, or equivalent work experience (minimum 12 years).
Additional Information :
- Recent, hands-on experience with Salesforce CDP (formerly Customer 360 Audiences).
- Experience with one or more Customer Data Platforms and MDM tools such as Adobe Experience Platform, Salesforce Interaction Studio, Microsoft D365 Customer Insights, Oracle CX Unity, Reltio, Redpoint Global, Lytics, Segment, Amperity, or ActionIQ.
- Understanding of technologies and processes for marketing, personalization, and data orchestration, such as Adobe Marketing Cloud, Oracle Marketing Cloud, and Salesforce Marketing Cloud.
- Experience with integration solutions such as MuleSoft, or ETL tools such as Informatica or Talend, is preferred.
- Experience with marketing campaign design and implementation.
- Full-lifecycle implementation experience using various SDLC methodologies.
- Experience with marketing customer data models and marketing analytics.
- Experience in a consulting environment and/or digital agency, with a demonstrated track record of continuing responsibilities, creativity, and innovation.
Qualifications : 15 years of full-time education
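Customer profile unification in Salesforce CDP is configured declaratively through identity resolution rulesets rather than written by hand; the short Python sketch below only illustrates the underlying idea of collapsing source records onto a normalised match key. The record fields and values are hypothetical.

```python
# Conceptual sketch of profile unification: grouping source records into a single
# customer profile by a normalised email match key. In Salesforce CDP this is
# configured via identity resolution rulesets; this code only illustrates the idea.
from collections import defaultdict

def unify_profiles(records):
    """Group raw records by a normalised email and merge their non-empty attributes."""
    profiles = defaultdict(dict)
    for record in records:
        key = record.get("email", "").strip().lower()
        if not key:
            continue                      # records without a match key stay unresolved
        profiles[key].update({k: v for k, v in record.items() if v})
    return dict(profiles)

if __name__ == "__main__":
    crm = {"email": "Jane@Example.com", "first_name": "Jane", "loyalty_tier": ""}
    web = {"email": "jane@example.com", "last_seen": "2024-05-01"}
    print(unify_profiles([crm, web]))
```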
Posted 3 months ago