3.0 - 8.0 years
9 - 14 Lacs
Chennai
Remote
Healthcare experience is mandatory.

Position Overview:
We are seeking an experienced Data Modeler/Lead with deep expertise in health plan data models and enterprise data warehousing to drive our healthcare analytics and reporting initiatives. The candidate should have hands-on experience with modern data platforms and a strong understanding of healthcare industry data standards.

Key Responsibilities:

Data Architecture & Modeling:
- Design and implement comprehensive data models for health plan operations, including member enrollment, claims processing, provider networks, and medical management
- Develop logical and physical data models that support analytical and regulatory reporting requirements (HEDIS, Stars, MLR, risk adjustment)
- Create and maintain data lineage documentation and data dictionaries for healthcare datasets
- Establish data modeling standards and best practices across the organization

Technical Leadership:
- Lead data warehousing initiatives using modern platforms like Databricks or traditional ETL tools like Informatica
- Architect scalable data solutions that handle large volumes of healthcare transactional data
- Collaborate with data engineers to optimize data pipelines and ensure data quality

Healthcare Domain Expertise:
- Apply deep knowledge of health plan operations, medical coding (ICD-10, CPT, HCPCS), and healthcare data standards (HL7, FHIR, X12 EDI)
- Design data models that support analytical, reporting, and AI/ML needs
- Ensure compliance with healthcare regulations, including HIPAA/PHI and state insurance regulations
- Partner with business stakeholders to translate healthcare business requirements into technical data solutions

Data Governance & Quality:
- Implement data governance frameworks specific to healthcare data privacy and security requirements
- Establish data quality monitoring and validation processes for critical health plan metrics
- Lead efforts to standardize healthcare data definitions across multiple systems and data sources

Required Qualifications:

Technical Skills:
- 10+ years of experience in data modeling, with at least 4 years focused on healthcare/health plan data
- Expert-level proficiency in dimensional modeling, data vault methodology, or other enterprise data modeling approaches
- Hands-on experience with Informatica PowerCenter/IICS or the Databricks platform for large-scale data processing
- Strong SQL skills and experience with Oracle Exadata and cloud data warehouses (Databricks)
- Proficiency with data modeling tools (Hackolade, ERwin, or similar)

Healthcare Industry Knowledge:
- Deep understanding of health plan data structures, including claims, eligibility, provider data, and pharmacy data
- Experience with healthcare data standards and medical coding systems
- Knowledge of regulatory reporting requirements (HEDIS, Medicare Stars, MLR reporting, risk adjustment)
- Familiarity with healthcare interoperability standards (HL7 FHIR, X12 EDI)

Leadership & Communication:
- Proven track record of leading data modeling projects in complex healthcare environments
- Strong analytical and problem-solving skills, with the ability to work with ambiguous requirements
- Excellent communication skills, with the ability to explain technical concepts to business stakeholders
- Experience mentoring team members and establishing technical standards

Preferred Qualifications:
- Experience with Medicare Advantage, Medicaid, or Commercial health plan operations
- Cloud platform certifications (AWS, Azure, or GCP)
- Experience with real-time data streaming and modern data lake architectures
- Knowledge of machine learning applications in healthcare analytics
- Previous experience in a lead or architect role within a healthcare organization
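For illustration, a minimal sketch of the kind of claims star schema this role would design: a claim-line fact table joined to member and provider dimensions. Table and column names are invented for the example, and SQLite stands in for Exadata/Databricks purely so the sketch runs anywhere.

```python
# Illustrative dimensional model for health plan claims (not an actual
# health plan schema). SQLite is used only so the sketch is runnable.
import sqlite3

ddl = """
CREATE TABLE dim_member (
    member_key   INTEGER PRIMARY KEY,
    member_id    TEXT NOT NULL,     -- source system identifier
    plan_code    TEXT,
    birth_date   TEXT
);
CREATE TABLE dim_provider (
    provider_key INTEGER PRIMARY KEY,
    npi          TEXT NOT NULL,     -- National Provider Identifier
    specialty    TEXT
);
CREATE TABLE fact_claim_line (
    claim_id       TEXT NOT NULL,
    line_number    INTEGER NOT NULL,
    member_key     INTEGER REFERENCES dim_member(member_key),
    provider_key   INTEGER REFERENCES dim_provider(provider_key),
    service_date   TEXT,
    icd10_code     TEXT,            -- diagnosis code
    cpt_code       TEXT,            -- procedure code
    allowed_amount REAL,
    paid_amount    REAL,
    PRIMARY KEY (claim_id, line_number)
);
"""

conn = sqlite3.connect(":memory:")
conn.executescript(ddl)
print("star schema created:",
      [r[0] for r in conn.execute("SELECT name FROM sqlite_master WHERE type='table'")])
```

A grain of one row per claim line, with conformed member and provider dimensions, is a common starting point for HEDIS/Stars-style reporting rollups.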
Posted 3 weeks ago
3.0 - 8.0 years
9 - 14 Lacs
Kolkata
Work from Office
Position Overview:
We are seeking an experienced Data Modeler/Lead with deep expertise in health plan data models and enterprise data warehousing to drive our healthcare analytics and reporting initiatives. The candidate should have hands-on experience with modern data platforms and a strong understanding of healthcare industry data standards.

Key Responsibilities:

Data Architecture & Modeling:
- Design and implement comprehensive data models for health plan operations, including member enrollment, claims processing, provider networks, and medical management
- Develop logical and physical data models that support analytical and regulatory reporting requirements (HEDIS, Stars, MLR, risk adjustment)
- Create and maintain data lineage documentation and data dictionaries for healthcare datasets
- Establish data modeling standards and best practices across the organization

Technical Leadership:
- Lead data warehousing initiatives using modern platforms like Databricks or traditional ETL tools like Informatica
- Architect scalable data solutions that handle large volumes of healthcare transactional data
- Collaborate with data engineers to optimize data pipelines and ensure data quality

Healthcare Domain Expertise:
- Apply deep knowledge of health plan operations, medical coding (ICD-10, CPT, HCPCS), and healthcare data standards (HL7, FHIR, X12 EDI)
- Design data models that support analytical, reporting, and AI/ML needs
- Ensure compliance with healthcare regulations, including HIPAA/PHI and state insurance regulations
- Partner with business stakeholders to translate healthcare business requirements into technical data solutions

Data Governance & Quality:
- Implement data governance frameworks specific to healthcare data privacy and security requirements
- Establish data quality monitoring and validation processes for critical health plan metrics
- Lead efforts to standardize healthcare data definitions across multiple systems and data sources

Required Qualifications:

Technical Skills:
- 10+ years of experience in data modeling, with at least 4 years focused on healthcare/health plan data
- Expert-level proficiency in dimensional modeling, data vault methodology, or other enterprise data modeling approaches
- Hands-on experience with Informatica PowerCenter/IICS or the Databricks platform for large-scale data processing
- Strong SQL skills and experience with Oracle Exadata and cloud data warehouses (Databricks)
- Proficiency with data modeling tools (Hackolade, ERwin, or similar)

Healthcare Industry Knowledge:
- Deep understanding of health plan data structures, including claims, eligibility, provider data, and pharmacy data
- Experience with healthcare data standards and medical coding systems
- Knowledge of regulatory reporting requirements (HEDIS, Medicare Stars, MLR reporting, risk adjustment)
- Familiarity with healthcare interoperability standards (HL7 FHIR, X12 EDI)

Leadership & Communication:
- Proven track record of leading data modeling projects in complex healthcare environments
- Strong analytical and problem-solving skills, with the ability to work with ambiguous requirements
- Excellent communication skills, with the ability to explain technical concepts to business stakeholders
- Experience mentoring team members and establishing technical standards

Preferred Qualifications:
- Experience with Medicare Advantage, Medicaid, or Commercial health plan operations
- Cloud platform certifications (AWS, Azure, or GCP)
- Experience with real-time data streaming and modern data lake architectures
- Knowledge of machine learning applications in healthcare analytics
- Previous experience in a lead or architect role within a healthcare organization
Posted 3 weeks ago
8.0 - 13.0 years
14 - 24 Lacs
Gurugram
Work from Office
Role Overview:
The role holder is responsible for managing various reporting requirements such as business performance, KPIs, performance management, incentives, and CPP & PIP, and for providing insights across all hierarchy levels of the organisation, from ExCo to ground-level staff, and to a distribution network spread across bank staff and business partners. The role requires maintenance of the existing reporting application, Saral, which is built on SQL & Pentaho, as well as building and automating new dashboards on this application; extensive knowledge of various databases and ETL tools is therefore required. The objective of the role is to reduce execution time and associated risk by automating existing manual reports, and to create process and control over various aspects of the business intelligence framework: reporting automation platforms, the data dictionary, requirement documents, and traceability documents.

Key Responsibilities:
- Use PowerPoint, MS Excel, MS Access, PL/SQL, ETL, OLTP systems, macro scripting, BI tools, the data warehouse, and all available resources to create and automate reports and present them to stakeholders (see the sketch after this list)
- Monitor various sales KPIs, such as issuance target achievement, logins, pipeline cases, commission earned, new business and renewal business, branch and people activation, and productivity, for distribution partners and business development teams
- Create the sales force management policy for every level of the hierarchy for one or more distribution channels
- Create the incentive structure for every level of the hierarchy for one or more distribution channels, and manage budgets and opex
- Create CPP/PIP for every level of the hierarchy for one or more distribution channels
- Automate new reports and continuously tune and enhance already automated reports for improvement opportunities; gather requirements from in-house teams by understanding the existing reporting process and creating automated reports for the same
- Handle end-to-end delivery, from requirement understanding and development through SIT/dev testing before release for business UAT, UAT testing support, and production go-live
- Provide qualitative and timely information to decision makers on all of the aforesaid parameters in the form of various dashboards, such as the HO business update (automated), the early claims dashboard, and surrender payout monitoring
- Publish sales metrics, including timely and accurate incentive calculation for 5,000+ sales staff on a monthly basis per the approved sales incentive scheme, as part of SFM (Sales Force Management)

Requirements:
- Hands-on experience with SQL & ETL tools, for a faster learning curve and delivery
- Knowledge of the insurance industry is a must, for quick onboarding

Leadership Responsibilities:
- Develop and implement BI strategies aligned with business objectives to leverage automation in the organisation
- Ensure that automated outputs match the existing manual reports, with the objective of reducing manual effort and execution time through report streamlining and efficiency
- Provide training, guidance, and support to ensure team performance and development
- Identify potential risk items that may lead to data inconsistency and suggest mitigation plans; define and revise key measures used across the organisation and collaborate on their documentation
- Manage and maintain in-house BI tools and systems for seamless delivery of multiple dashboards
- Develop and implement automated workflows for data ingestion, transformation, loading (ETL), and maintenance of data sanity
- Manage projects to design and deploy new reports and tracking dashboards within the shortest possible turnaround, and have hands-on experience in all or most of the below-mentioned tools:
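For illustration, a compact sketch of the report-automation pattern this role centres on: replacing a manual spreadsheet step with a scripted SQL extract. The KPI query, table, and file name are invented placeholders, not the Saral schema.

```python
# Illustrative automation of a manual issuance-achievement report.
# The policy_sales table and its columns are invented for the example.
import sqlite3
import csv

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE policy_sales (channel TEXT, month TEXT, issued INTEGER, target INTEGER);
INSERT INTO policy_sales VALUES
  ('Bancassurance', '2024-01', 420, 500),
  ('Agency',        '2024-01', 610, 600);
""")

kpi_sql = """
SELECT channel, month, issued, target,
       ROUND(100.0 * issued / target, 1) AS achievement_pct
FROM policy_sales ORDER BY channel
"""

# Export the KPI extract to a file a scheduler can distribute daily.
with open("issuance_kpi.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["channel", "month", "issued", "target", "achievement_pct"])
    writer.writerows(conn.execute(kpi_sql))
print("report written to issuance_kpi.csv")
```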
Posted 3 weeks ago
3.0 - 6.0 years
8 - 10 Lacs
Bengaluru
Hybrid
Role: Data Engineer/ETL Developer - Talend/Power BI

Job Description:
1. Study, analyze, and understand business requirements in the context of business intelligence, and provide end-to-end solutions.
2. Design and implement ETL pipelines with data quality and integrity across platforms like Talend Enterprise and Informatica.
3. Load data from heterogeneous sources such as Oracle, MS SQL Server, file systems, FTP services, REST APIs, etc. (see the sketch after this posting).
4. Design and map data models to turn raw data into meaningful insights, and build a data catalog.
5. Develop strong data documentation covering algorithms, parameters, and models.
6. Analyze past and present data for better decision making.
7. Make essential technical changes to improve present business intelligence systems.
8. Optimize ETL processes for improved performance; monitor ETL jobs and troubleshoot issues.
9. Lead and oversee team deliverables, ensuring best practices are followed during development.
10. Participate in and lead requirements gathering and analysis.

Required Skillset and Experience:
1. Overall up to 3 years of working experience, preferably in SQL and ETL (Talend).
2. Must have 1+ years of experience in Talend Enterprise/Open Studio and related tools like Talend API, Talend Data Catalog, TMC, TAC, etc.
3. Must have an understanding of database design and data modeling.
4. Hands-on experience in a coding language (Java, Python, etc.).

Secondary Skillset (Good to Have):
1. Experience with a BI tool like MS Power BI.
2. Ability to use Power BI to build interactive and visually appealing dashboards and reports.

Required Personal & Interpersonal Skills:
- Strong analytical skills
- Good communication skills, both written and verbal
- Highly motivated and result-oriented
- Self-driven, independent work ethic that drives internal and external accountability
- Ability to interpret instructions for executives and technical resources
- Advanced problem-solving skills for complex distributed applications
- Experience working in a multicultural environment

Interested candidates, reach out to akram.m@acesoftlabs.com / 6387195529
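As a hedged sketch of the REST-API ingestion in item 3 above: the endpoint (a public test API) and staging schema are assumptions for illustration; a Talend job would wire the same ingest step graphically rather than in code.

```python
# Illustrative load of a heterogeneous source (a REST API) into a staging
# table. Endpoint and payload shape are assumptions, not from the posting.
import sqlite3
import requests

resp = requests.get("https://jsonplaceholder.typicode.com/users", timeout=10)
resp.raise_for_status()
rows = [(u["id"], u["name"], u["email"]) for u in resp.json()]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE stg_users (id INTEGER PRIMARY KEY, name TEXT, email TEXT)")
conn.executemany("INSERT INTO stg_users VALUES (?, ?, ?)", rows)
print("staged", conn.execute("SELECT COUNT(*) FROM stg_users").fetchone()[0], "rows")
```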
Posted 3 weeks ago
5.0 - 10.0 years
11 - 21 Lacs
Hyderabad
Hybrid
We are seeking a skilled Database Developer to design, develop, and optimize database solutions for enterprise applications. The ideal candidate should have expertise in SQL Server, database development, query optimization, and data modeling. This role requires collaboration with software engineers, data analysts, and business teams to ensure data integrity, high performance, and scalability.

Required Qualifications:
- 5+ years of experience as a Database Engineer
- Strong expertise in SQL Server, MySQL, PostgreSQL, or Oracle databases
- Advanced knowledge of SQL programming, T-SQL, and PL/SQL
- Strong understanding of database indexing, query execution plans, and performance tuning
- Experience with ETL tools (SSIS, Talend, Informatica, etc.) for data migration and transformation
- Familiarity with cloud-based databases (GCP)
- Proficiency in data modeling, normalization, and database architecture
- Experience with NoSQL databases (MongoDB, Redis) is a plus
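For illustration, a minimal, self-contained sketch of the index and execution-plan analysis this role calls for. SQLite stands in for SQL Server/Oracle so the example runs anywhere; table and column names are invented.

```python
# Compare the query plan before and after adding a covering index.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)")
cur.executemany("INSERT INTO orders (customer_id, total) VALUES (?, ?)",
                [(i % 100, i * 1.5) for i in range(10_000)])

query = "SELECT SUM(total) FROM orders WHERE customer_id = ?"

# Without an index: the plan shows a full table scan.
for row in cur.execute(f"EXPLAIN QUERY PLAN {query}", (42,)):
    print("before:", row)

# With a covering index on (customer_id, total): an index-only lookup.
cur.execute("CREATE INDEX idx_orders_customer ON orders (customer_id, total)")
for row in cur.execute(f"EXPLAIN QUERY PLAN {query}", (42,)):
    print("after:", row)
```

The same discipline (inspect the plan, then index for the access path the query actually needs) carries over to SQL Server's SHOWPLAN or Oracle's EXPLAIN PLAN.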
Posted 4 weeks ago
8.0 - 12.0 years
25 - 35 Lacs
Surat
Work from Office
Role Overview:
We are seeking a highly skilled and experienced Senior Data Architect to join our team on a contract basis. This role will be pivotal in designing and implementing robust data architectures, ensuring data governance, and driving data-driven insights. The ideal candidate will possess deep expertise in MS Dynamics, data lake architecture, ETL processes, data modeling, and data integration. You will collaborate closely with stakeholders to understand their data needs and translate them into scalable and efficient solutions.

Responsibilities:

Data Architecture Design and Development:
- Design and implement comprehensive data architectures, including data lakes, data warehouses, and data integration strategies.
- Develop and maintain conceptual, logical, and physical data models.
- Define and enforce data standards, policies, and procedures.
- Evaluate and select appropriate data technologies and tools.
- Ensure scalability, performance, and security of data architectures.

MS Dynamics and Data Lake Integration:
- Lead the integration of MS Dynamics with data lake environments.
- Design and implement data pipelines for efficient data movement between systems.
- Troubleshoot and resolve integration issues.
- Optimize data flow and performance within the integrated environment.

ETL and Data Integration:
- Design, develop, and implement ETL processes for data extraction, transformation, and loading.
- Ensure data quality and consistency throughout the integration process.
- Develop and maintain data integration documentation.
- Implement data validation and error-handling mechanisms (see the sketch after this posting).

Data Modeling and Data Governance:
- Develop and maintain data models that align with business requirements.
- Implement and enforce data governance policies and procedures.
- Ensure data security and compliance with relevant regulations.
- Establish and maintain data dictionaries and metadata repositories.

Issue Resolution and Troubleshooting:
- Proactively identify and resolve architectural issues.
- Conduct root cause analysis and implement corrective actions.
- Provide technical guidance and support to development teams.
- Communicate issues and risks proactively.

Collaboration and Communication:
- Collaborate with stakeholders to understand data requirements and translate them into technical solutions.
- Communicate effectively with technical and non-technical audiences.
- Participate in design reviews and code reviews.
- Work well both as an individual contributor and as a team player.

Qualifications:

Experience:
- 8-12 years of hands-on experience in data architecture and related fields.
- Minimum 4 years of experience in architectural design and integration.
- Experience working with cloud-based data solutions.

Technical Skills:
- Strong expertise in MS Dynamics and data lake architecture.
- Proficiency in ETL tools and techniques (e.g., Azure Data Factory, SSIS).
- Expertise in data modeling techniques (e.g., dimensional modeling, relational modeling).
- Strong understanding of data warehousing concepts and best practices.
- Proficiency in SQL and other data query languages.
- Experience with data quality assurance and data governance.
- Experience with cloud platforms such as Azure or AWS.

Soft Skills:
- Strong analytical and problem-solving skills.
- Excellent communication and interpersonal skills.
- Ability to work independently and as part of a team.
- Flexible and adaptable to changing priorities.
- Proactive and self-motivated.
- Ability to deal with ambiguity.
- Open to continuous learning.
- Self-confident and humble.
- An intelligent, rigorous thinker who can operate successfully amongst bright people.
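For illustration, a small sketch of the "data validation and error handling" responsibility above: route rows that fail quality checks to a reject table instead of the warehouse. The rules, tables, and SQLite backend are illustrative assumptions, not a definitive implementation.

```python
# Validate staged rows, loading good rows and rejecting bad ones with a reason.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE stg_orders (order_id TEXT, amount REAL);
CREATE TABLE dw_orders  (order_id TEXT PRIMARY KEY, amount REAL);
CREATE TABLE rej_orders (order_id TEXT, amount REAL, reason TEXT);
INSERT INTO stg_orders VALUES ('A1', 10.0), (NULL, 5.0), ('A3', -2.0);
""")

for order_id, amount in conn.execute("SELECT order_id, amount FROM stg_orders"):
    if order_id is None:
        conn.execute("INSERT INTO rej_orders VALUES (?, ?, 'missing key')", (order_id, amount))
    elif amount < 0:
        conn.execute("INSERT INTO rej_orders VALUES (?, ?, 'negative amount')", (order_id, amount))
    else:
        conn.execute("INSERT INTO dw_orders VALUES (?, ?)", (order_id, amount))

print("loaded:", conn.execute("SELECT COUNT(*) FROM dw_orders").fetchone()[0],
      "rejected:", conn.execute("SELECT COUNT(*) FROM rej_orders").fetchone()[0])
```

Keeping rejects queryable, with a reason per row, is what lets the root-cause analysis mentioned above happen without re-running the pipeline.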
Posted 1 month ago
8.0 - 12.0 years
19 - 22 Lacs
Pune
Remote
Job Description: Senior Data Architect (Contract)
Company: Emperen Technologies
Location: India (Remote)
Type: Contract (8-12 Months)
Experience: 8-12 Years

Role Overview:
We are seeking a highly skilled and experienced Senior Data Architect to join our team on a contract basis. This role will be pivotal in designing and implementing robust data architectures, ensuring data governance, and driving data-driven insights. The ideal candidate will possess deep expertise in MS Dynamics, data lake architecture, ETL processes, data modeling, and data integration. You will collaborate closely with stakeholders to understand their data needs and translate them into scalable and efficient solutions.

Responsibilities:

Data Architecture Design and Development:
- Design and implement comprehensive data architectures, including data lakes, data warehouses, and data integration strategies.
- Develop and maintain conceptual, logical, and physical data models.
- Define and enforce data standards, policies, and procedures.
- Evaluate and select appropriate data technologies and tools.
- Ensure scalability, performance, and security of data architectures.

MS Dynamics and Data Lake Integration:
- Lead the integration of MS Dynamics with data lake environments.
- Design and implement data pipelines for efficient data movement between systems.
- Troubleshoot and resolve integration issues.
- Optimize data flow and performance within the integrated environment.

ETL and Data Integration:
- Design, develop, and implement ETL processes for data extraction, transformation, and loading.
- Ensure data quality and consistency throughout the integration process.
- Develop and maintain data integration documentation.
- Implement data validation and error-handling mechanisms.

Data Modeling and Data Governance:
- Develop and maintain data models that align with business requirements.
- Implement and enforce data governance policies and procedures.
- Ensure data security and compliance with relevant regulations.
- Establish and maintain data dictionaries and metadata repositories.

Issue Resolution and Troubleshooting:
- Proactively identify and resolve architectural issues.
- Conduct root cause analysis and implement corrective actions.
- Provide technical guidance and support to development teams.
- Communicate issues and risks proactively.

Collaboration and Communication:
- Collaborate with stakeholders to understand data requirements and translate them into technical solutions.
- Communicate effectively with technical and non-technical audiences.
- Participate in design reviews and code reviews.
- Work well both as an individual contributor and as a team player.

Qualifications:

Experience:
- 8-12 years of hands-on experience in data architecture and related fields.
- Minimum 4 years of experience in architectural design and integration.
- Experience working with cloud-based data solutions.

Technical Skills:
- Strong expertise in MS Dynamics and data lake architecture.
- Proficiency in ETL tools and techniques (e.g., Azure Data Factory, SSIS).
- Expertise in data modeling techniques (e.g., dimensional modeling, relational modeling).
- Strong understanding of data warehousing concepts and best practices.
- Proficiency in SQL and other data query languages.
- Experience with data quality assurance and data governance.
- Experience with cloud platforms such as Azure or AWS.

Soft Skills:
- Strong analytical and problem-solving skills.
- Excellent communication and interpersonal skills.
- Ability to work independently and as part of a team.
- Flexible and adaptable to changing priorities.
- Proactive and self-motivated.
- Ability to deal with ambiguity.
- Open to continuous learning.
- Self-confident and humble.
- An intelligent, rigorous thinker who can operate successfully amongst bright people.
Posted 1 month ago
8.0 - 12.0 years
12 - 18 Lacs
Chennai
Remote
Type: Contract (8-12 Months)

Role Overview:
We are seeking a highly skilled and experienced Senior Data Architect to join our team on a contract basis. This role will be pivotal in designing and implementing robust data architectures, ensuring data governance, and driving data-driven insights. The ideal candidate will possess deep expertise in MS Dynamics, data lake architecture, ETL processes, data modeling, and data integration. You will collaborate closely with stakeholders to understand their data needs and translate them into scalable and efficient solutions.

Responsibilities:

Data Architecture Design and Development:
- Design and implement comprehensive data architectures, including data lakes, data warehouses, and data integration strategies.
- Develop and maintain conceptual, logical, and physical data models.
- Define and enforce data standards, policies, and procedures.
- Evaluate and select appropriate data technologies and tools.
- Ensure scalability, performance, and security of data architectures.

MS Dynamics and Data Lake Integration:
- Lead the integration of MS Dynamics with data lake environments.
- Design and implement data pipelines for efficient data movement between systems.
- Troubleshoot and resolve integration issues.
- Optimize data flow and performance within the integrated environment.

ETL and Data Integration:
- Design, develop, and implement ETL processes for data extraction, transformation, and loading.
- Ensure data quality and consistency throughout the integration process.
- Develop and maintain data integration documentation.
- Implement data validation and error-handling mechanisms.

Data Modeling and Data Governance:
- Develop and maintain data models that align with business requirements.
- Implement and enforce data governance policies and procedures.
- Ensure data security and compliance with relevant regulations.
- Establish and maintain data dictionaries and metadata repositories.

Issue Resolution and Troubleshooting:
- Proactively identify and resolve architectural issues.
- Conduct root cause analysis and implement corrective actions.
- Provide technical guidance and support to development teams.
- Communicate issues and risks proactively.

Collaboration and Communication:
- Collaborate with stakeholders to understand data requirements and translate them into technical solutions.
- Communicate effectively with technical and non-technical audiences.
- Participate in design reviews and code reviews.
- Work well both as an individual contributor and as a team player.

Qualifications:

Experience:
- 8-12 years of hands-on experience in data architecture and related fields.
- Minimum 4 years of experience in architectural design and integration.
- Experience working with cloud-based data solutions.

Technical Skills:
- Strong expertise in MS Dynamics and data lake architecture.
- Proficiency in ETL tools and techniques (e.g., Azure Data Factory, SSIS).
- Expertise in data modeling techniques (e.g., dimensional modeling, relational modeling).
- Strong understanding of data warehousing concepts and best practices.
- Proficiency in SQL and other data query languages.
- Experience with data quality assurance and data governance.
- Experience with cloud platforms such as Azure or AWS.

Soft Skills:
- Strong analytical and problem-solving skills.
- Excellent communication and interpersonal skills.
- Ability to work independently and as part of a team.
- Flexible and adaptable to changing priorities.
- Proactive and self-motivated.
- Ability to deal with ambiguity.
- Open to continuous learning.
- Self-confident and humble.
- An intelligent, rigorous thinker who can operate successfully amongst bright people.
Posted 1 month ago
8.0 - 12.0 years
19 - 22 Lacs
Ahmedabad
Remote
Type: Contract (8-12 Months)

Role Overview:
We are seeking a highly skilled and experienced Senior Data Architect to join our team on a contract basis. This role will be pivotal in designing and implementing robust data architectures, ensuring data governance, and driving data-driven insights. The ideal candidate will possess deep expertise in MS Dynamics, data lake architecture, ETL processes, data modeling, and data integration. You will collaborate closely with stakeholders to understand their data needs and translate them into scalable and efficient solutions.

Responsibilities:

Data Architecture Design and Development:
- Design and implement comprehensive data architectures, including data lakes, data warehouses, and data integration strategies.
- Develop and maintain conceptual, logical, and physical data models.
- Define and enforce data standards, policies, and procedures.
- Evaluate and select appropriate data technologies and tools.
- Ensure scalability, performance, and security of data architectures.

MS Dynamics and Data Lake Integration:
- Lead the integration of MS Dynamics with data lake environments.
- Design and implement data pipelines for efficient data movement between systems.
- Troubleshoot and resolve integration issues.
- Optimize data flow and performance within the integrated environment.

ETL and Data Integration:
- Design, develop, and implement ETL processes for data extraction, transformation, and loading.
- Ensure data quality and consistency throughout the integration process.
- Develop and maintain data integration documentation.
- Implement data validation and error-handling mechanisms.

Data Modeling and Data Governance:
- Develop and maintain data models that align with business requirements.
- Implement and enforce data governance policies and procedures.
- Ensure data security and compliance with relevant regulations.
- Establish and maintain data dictionaries and metadata repositories.

Issue Resolution and Troubleshooting:
- Proactively identify and resolve architectural issues.
- Conduct root cause analysis and implement corrective actions.
- Provide technical guidance and support to development teams.
- Communicate issues and risks proactively.

Collaboration and Communication:
- Collaborate with stakeholders to understand data requirements and translate them into technical solutions.
- Communicate effectively with technical and non-technical audiences.
- Participate in design reviews and code reviews.
- Work well both as an individual contributor and as a team player.

Qualifications:

Experience:
- 8-12 years of hands-on experience in data architecture and related fields.
- Minimum 4 years of experience in architectural design and integration.
- Experience working with cloud-based data solutions.

Technical Skills:
- Strong expertise in MS Dynamics and data lake architecture.
- Proficiency in ETL tools and techniques (e.g., Azure Data Factory, SSIS).
- Expertise in data modeling techniques (e.g., dimensional modeling, relational modeling).
- Strong understanding of data warehousing concepts and best practices.
- Proficiency in SQL and other data query languages.
- Experience with data quality assurance and data governance.
- Experience with cloud platforms such as Azure or AWS.

Soft Skills:
- Strong analytical and problem-solving skills.
- Excellent communication and interpersonal skills.
- Ability to work independently and as part of a team.
- Flexible and adaptable to changing priorities.
- Proactive and self-motivated.
- Ability to deal with ambiguity.
- Open to continuous learning.
- Self-confident and humble.
- An intelligent, rigorous thinker who can operate successfully amongst bright people.
Posted 1 month ago
5.0 - 10.0 years
7 - 12 Lacs
Pune
Work from Office
Job Specification:
- Maintenance and support of existing VB.NET applications
- Maintenance and support of new and existing Python solutions
- Maintenance and support of new and existing scheduling and ETL solutions
- Support existing reports
- Clearly communicate technical ideas to both technical stakeholders and business end users
- Work with the product owner to deliver solutions with a smooth transition through to the production environment
- Adhere to change management and release management procedures
- Work closely with other team members and end users

Skills Required:
- 5+ years of design and support experience
- Solid knowledge of and experience in VB.NET, Python, and JSON
- Experience in PL/SQL
- Comfortable with Git source control
- Knowledge and experience of the software engineering life cycle
- Experience working with automation/scheduling tools, such as VisualCron or Control-M
- Experience working with ETL tools, such as Alphafuse or X-Gen
- Experience in the area of data warehouse and reporting solutions, such as OpenText
- Great attention to detail and a passion to drive projects forward
- Able to produce technical documentation
- Excellent time management and decision-making skills
- Excellent communication skills in English, both written and verbal
- Good understanding of financial solutions and processes
- Background in the financial industry preferable

Disclaimer: Unsolicited CVs sent to Apex (Talent Acquisition Team or Hiring Managers) by recruitment agencies will not be accepted for this position. Apex operates a direct sourcing model; where agency assistance is required, the Talent Acquisition team will engage directly with our exclusive recruitment partners.
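As a hedged illustration of the Python, JSON, and scheduling work listed above: a small job skeleton that a scheduler such as VisualCron or Control-M could invoke, driven by a JSON configuration. File names, keys, and the exit-code convention are invented for the example.

```python
# Illustrative scheduler-friendly job driven by JSON configuration.
import json
import sys

config = json.loads('{"source": "trades.csv", "required_fields": ["trade_id", "amount"]}')

def run_job(cfg: dict) -> int:
    # A real job would extract, transform, and load here; this stub just
    # validates its configuration and returns an exit code the scheduler
    # can use to decide whether to alert or retry.
    missing = [k for k in ("source", "required_fields") if k not in cfg]
    if missing:
        print(f"bad config, missing: {missing}")
        return 1
    print(f"processing {cfg['source']} with fields {cfg['required_fields']}")
    return 0

sys.exit(run_job(config))
```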
Posted 1 month ago
8.0 - 12.0 years
19 - 22 Lacs
Bengaluru
Work from Office
Job Description: Senior Data Architect (Contract)
Company: Emperen Technologies
Location: India (Remote)
Type: Contract (8-12 Months)
Experience: 8-12 Years

About Emperen Technologies:
Emperen Technologies is a dynamic consulting firm founded in 2010, dedicated to delivering impactful results for clients through strong, collaborative partnerships. We specialize in implementing strategic visions for Fortune 500 companies, non-profit organizations, and innovative startups. Our client-centric, values-driven approach, coupled with a scalable business model, empowers clients to navigate and thrive in the ever-evolving technology landscape. We are committed to building a team of top-tier talent who are inspired by our vision and driven to excellence.

Role Overview:
We are seeking a highly skilled and experienced Senior Data Architect to join our team on a contract basis. This role will be pivotal in designing and implementing robust data architectures, ensuring data governance, and driving data-driven insights. The ideal candidate will possess deep expertise in MS Dynamics, data lake architecture, ETL processes, data modeling, and data integration. You will collaborate closely with stakeholders to understand their data needs and translate them into scalable and efficient solutions.

Responsibilities:

Data Architecture Design and Development:
- Design and implement comprehensive data architectures, including data lakes, data warehouses, and data integration strategies.
- Develop and maintain conceptual, logical, and physical data models.
- Define and enforce data standards, policies, and procedures.
- Evaluate and select appropriate data technologies and tools.
- Ensure scalability, performance, and security of data architectures.

MS Dynamics and Data Lake Integration:
- Lead the integration of MS Dynamics with data lake environments.
- Design and implement data pipelines for efficient data movement between systems.
- Troubleshoot and resolve integration issues.
- Optimize data flow and performance within the integrated environment.

ETL and Data Integration:
- Design, develop, and implement ETL processes for data extraction, transformation, and loading.
- Ensure data quality and consistency throughout the integration process.
- Develop and maintain data integration documentation.
- Implement data validation and error-handling mechanisms.

Data Modeling and Data Governance:
- Develop and maintain data models that align with business requirements.
- Implement and enforce data governance policies and procedures.
- Ensure data security and compliance with relevant regulations.
- Establish and maintain data dictionaries and metadata repositories.

Issue Resolution and Troubleshooting:
- Proactively identify and resolve architectural issues.
- Conduct root cause analysis and implement corrective actions.
- Provide technical guidance and support to development teams.
- Communicate issues and risks proactively.

Collaboration and Communication:
- Collaborate with stakeholders to understand data requirements and translate them into technical solutions.
- Communicate effectively with technical and non-technical audiences.
- Participate in design reviews and code reviews.
- Work well both as an individual contributor and as a team player.

Qualifications:

Experience:
- 8-12 years of hands-on experience in data architecture and related fields.
- Minimum 4 years of experience in architectural design and integration.
- Experience working with cloud-based data solutions.

Technical Skills:
- Strong expertise in MS Dynamics and data lake architecture.
- Proficiency in ETL tools and techniques (e.g., Azure Data Factory, SSIS).
- Expertise in data modeling techniques (e.g., dimensional modeling, relational modeling).
- Strong understanding of data warehousing concepts and best practices.
- Proficiency in SQL and other data query languages.
- Experience with data quality assurance and data governance.
- Experience with cloud platforms such as Azure or AWS.

Soft Skills:
- Strong analytical and problem-solving skills.
- Excellent communication and interpersonal skills.
- Ability to work independently and as part of a team.
- Flexible and adaptable to changing priorities.
- Proactive and self-motivated.
- Ability to deal with ambiguity.
- Open to continuous learning.
- Self-confident and humble.
- An intelligent, rigorous thinker who can operate successfully amongst bright people.
Posted 1 month ago
8.0 - 12.0 years
19 - 22 Lacs
Jaipur
Remote
Role Overview:
We are seeking a highly skilled and experienced Senior Data Architect to join our team on a contract basis. This role will be pivotal in designing and implementing robust data architectures, ensuring data governance, and driving data-driven insights. The ideal candidate will possess deep expertise in MS Dynamics, data lake architecture, ETL processes, data modeling, and data integration. You will collaborate closely with stakeholders to understand their data needs and translate them into scalable and efficient solutions.

Responsibilities:

Data Architecture Design and Development:
- Design and implement comprehensive data architectures, including data lakes, data warehouses, and data integration strategies.
- Develop and maintain conceptual, logical, and physical data models.
- Define and enforce data standards, policies, and procedures.
- Evaluate and select appropriate data technologies and tools.
- Ensure scalability, performance, and security of data architectures.

MS Dynamics and Data Lake Integration:
- Lead the integration of MS Dynamics with data lake environments.
- Design and implement data pipelines for efficient data movement between systems.
- Troubleshoot and resolve integration issues.
- Optimize data flow and performance within the integrated environment.

ETL and Data Integration:
- Design, develop, and implement ETL processes for data extraction, transformation, and loading.
- Ensure data quality and consistency throughout the integration process.
- Develop and maintain data integration documentation.
- Implement data validation and error-handling mechanisms.

Data Modeling and Data Governance:
- Develop and maintain data models that align with business requirements.
- Implement and enforce data governance policies and procedures.
- Ensure data security and compliance with relevant regulations.
- Establish and maintain data dictionaries and metadata repositories.

Issue Resolution and Troubleshooting:
- Proactively identify and resolve architectural issues.
- Conduct root cause analysis and implement corrective actions.
- Provide technical guidance and support to development teams.
- Communicate issues and risks proactively.

Collaboration and Communication:
- Collaborate with stakeholders to understand data requirements and translate them into technical solutions.
- Communicate effectively with technical and non-technical audiences.
- Participate in design reviews and code reviews.
- Work well both as an individual contributor and as a team player.

Qualifications:

Experience:
- 8-12 years of hands-on experience in data architecture and related fields.
- Minimum 4 years of experience in architectural design and integration.
- Experience working with cloud-based data solutions.

Technical Skills:
- Strong expertise in MS Dynamics and data lake architecture.
- Proficiency in ETL tools and techniques (e.g., Azure Data Factory, SSIS).
- Expertise in data modeling techniques (e.g., dimensional modeling, relational modeling).
- Strong understanding of data warehousing concepts and best practices.
- Proficiency in SQL and other data query languages.
- Experience with data quality assurance and data governance.
- Experience with cloud platforms such as Azure or AWS.

Soft Skills:
- Strong analytical and problem-solving skills.
- Excellent communication and interpersonal skills.
- Ability to work independently and as part of a team.
- Flexible and adaptable to changing priorities.
- Proactive and self-motivated.
- Ability to deal with ambiguity.
- Open to continuous learning.
- Self-confident and humble.
- An intelligent, rigorous thinker who can operate successfully amongst bright people.
Posted 1 month ago
8.0 - 12.0 years
10 - 14 Lacs
Hyderabad
Work from Office
Role Overview:
We are seeking a highly skilled and experienced Senior Data Architect to join our team on a contract basis. This role will be pivotal in designing and implementing robust data architectures, ensuring data governance, and driving data-driven insights. The ideal candidate will possess deep expertise in MS Dynamics, data lake architecture, ETL processes, data modeling, and data integration. You will collaborate closely with stakeholders to understand their data needs and translate them into scalable and efficient solutions.

Responsibilities:

Data Architecture Design and Development:
- Design and implement comprehensive data architectures, including data lakes, data warehouses, and data integration strategies.
- Develop and maintain conceptual, logical, and physical data models.
- Define and enforce data standards, policies, and procedures.
- Evaluate and select appropriate data technologies and tools.
- Ensure scalability, performance, and security of data architectures.

MS Dynamics and Data Lake Integration:
- Lead the integration of MS Dynamics with data lake environments.
- Design and implement data pipelines for efficient data movement between systems.
- Troubleshoot and resolve integration issues.
- Optimize data flow and performance within the integrated environment.

ETL and Data Integration:
- Design, develop, and implement ETL processes for data extraction, transformation, and loading.
- Ensure data quality and consistency throughout the integration process.
- Develop and maintain data integration documentation.
- Implement data validation and error-handling mechanisms.

Data Modeling and Data Governance:
- Develop and maintain data models that align with business requirements.
- Implement and enforce data governance policies and procedures.
- Ensure data security and compliance with relevant regulations.
- Establish and maintain data dictionaries and metadata repositories.

Issue Resolution and Troubleshooting:
- Proactively identify and resolve architectural issues.
- Conduct root cause analysis and implement corrective actions.
- Provide technical guidance and support to development teams.
- Communicate issues and risks proactively.

Collaboration and Communication:
- Collaborate with stakeholders to understand data requirements and translate them into technical solutions.
- Communicate effectively with technical and non-technical audiences.
- Participate in design reviews and code reviews.
- Work well both as an individual contributor and as a team player.

Qualifications:

Experience:
- 8-12 years of hands-on experience in data architecture and related fields.
- Minimum 4 years of experience in architectural design and integration.
- Experience working with cloud-based data solutions.

Technical Skills:
- Strong expertise in MS Dynamics and data lake architecture.
- Proficiency in ETL tools and techniques (e.g., Azure Data Factory, SSIS).
- Expertise in data modeling techniques (e.g., dimensional modeling, relational modeling).
- Strong understanding of data warehousing concepts and best practices.
- Proficiency in SQL and other data query languages.
- Experience with data quality assurance and data governance.
- Experience with cloud platforms such as Azure or AWS.

Soft Skills:
- Strong analytical and problem-solving skills.
- Excellent communication and interpersonal skills.
- Ability to work independently and as part of a team.
- Flexible and adaptable to changing priorities.
- Proactive and self-motivated.
- Ability to deal with ambiguity.
- Open to continuous learning.
- Self-confident and humble.
- An intelligent, rigorous thinker who can operate successfully amongst bright people.
Posted 1 month ago
8.0 - 12.0 years
10 - 14 Lacs
Lucknow
Work from Office
Role Overview:
We are seeking a highly skilled and experienced Senior Data Architect to join our team on a contract basis. This role will be pivotal in designing and implementing robust data architectures, ensuring data governance, and driving data-driven insights. The ideal candidate will possess deep expertise in MS Dynamics, data lake architecture, ETL processes, data modeling, and data integration. You will collaborate closely with stakeholders to understand their data needs and translate them into scalable and efficient solutions.

Responsibilities:

Data Architecture Design and Development:
- Design and implement comprehensive data architectures, including data lakes, data warehouses, and data integration strategies.
- Develop and maintain conceptual, logical, and physical data models.
- Define and enforce data standards, policies, and procedures.
- Evaluate and select appropriate data technologies and tools.
- Ensure scalability, performance, and security of data architectures.

MS Dynamics and Data Lake Integration:
- Lead the integration of MS Dynamics with data lake environments.
- Design and implement data pipelines for efficient data movement between systems.
- Troubleshoot and resolve integration issues.
- Optimize data flow and performance within the integrated environment.

ETL and Data Integration:
- Design, develop, and implement ETL processes for data extraction, transformation, and loading.
- Ensure data quality and consistency throughout the integration process.
- Develop and maintain data integration documentation.
- Implement data validation and error-handling mechanisms.

Data Modeling and Data Governance:
- Develop and maintain data models that align with business requirements.
- Implement and enforce data governance policies and procedures.
- Ensure data security and compliance with relevant regulations.
- Establish and maintain data dictionaries and metadata repositories.

Issue Resolution and Troubleshooting:
- Proactively identify and resolve architectural issues.
- Conduct root cause analysis and implement corrective actions.
- Provide technical guidance and support to development teams.
- Communicate issues and risks proactively.

Collaboration and Communication:
- Collaborate with stakeholders to understand data requirements and translate them into technical solutions.
- Communicate effectively with technical and non-technical audiences.
- Participate in design reviews and code reviews.
- Work well both as an individual contributor and as a team player.

Qualifications:

Experience:
- 8-12 years of hands-on experience in data architecture and related fields.
- Minimum 4 years of experience in architectural design and integration.
- Experience working with cloud-based data solutions.

Technical Skills:
- Strong expertise in MS Dynamics and data lake architecture.
- Proficiency in ETL tools and techniques (e.g., Azure Data Factory, SSIS).
- Expertise in data modeling techniques (e.g., dimensional modeling, relational modeling).
- Strong understanding of data warehousing concepts and best practices.
- Proficiency in SQL and other data query languages.
- Experience with data quality assurance and data governance.
- Experience with cloud platforms such as Azure or AWS.

Soft Skills:
- Strong analytical and problem-solving skills.
- Excellent communication and interpersonal skills.
- Ability to work independently and as part of a team.
- Flexible and adaptable to changing priorities.
- Proactive and self-motivated.
- Ability to deal with ambiguity.
- Open to continuous learning.
- Self-confident and humble.
- An intelligent, rigorous thinker who can operate successfully amongst bright people.
Posted 1 month ago
4.0 - 7.0 years
3 - 7 Lacs
Hyderabad
Work from Office
Notice Period: Immediate joiners only
Mandatory Skills: ETL testing, ETL tools, SQL, data warehouse, Automic, Control-M, test scenarios, test case preparation, test execution, Agile teams
Interview Availability: 6 & 7 Feb 2025, face to face

Job Description:

Primary Skills:
- Strong understanding of data warehouse concepts and ETL testing
- Experience automating ETL testing and data testing using any ETL tool and scripting
- Expertise building configurable ETL test automation suites to increase efficiency
- Expertise building granular, reusable test cases to be used across different modules in a software line
- Plans and defines the testing approach, and prioritizes testing activities to address risks in project timelines or test scenarios
- Excellent analytical and problem-solving skills
- Excellent requirement gathering and analysis skills to identify gaps
- Responsible for validating data sources, extracting data, applying transformation logic, and loading data into target tables (see the sketch after this posting)
- Experience creating test scenarios, preparing test cases, executing tests, and reporting defects and status
- Responsible for writing complex SQL queries, stored procedures, and functions
- Responsible for troubleshooting defects, logging defects, retesting, and bug closure
- Experience working in an Agile team
- Experience with scheduling tools like Automic and Control-M
- Works with multiple teams to define stories and acceptance criteria
- Experience with JIRA and qTest

Secondary Skills:
- Conceptual knowledge of cloud computing services like AWS
- Conceptual knowledge of using APIs such as REST
- Unix scripting knowledge is an add-on
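For illustration, a minimal sketch of source-to-target reconciliation, the core ETL-test activity listed above: compare row counts and per-column totals between a source and a loaded target. Tables and columns are invented, and SQLite stands in for the warehouse so the example runs anywhere.

```python
# Reconcile a source table against its loaded target by count and sum.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE src_claims (claim_id TEXT, amount REAL);
CREATE TABLE tgt_claims (claim_id TEXT, amount REAL);
INSERT INTO src_claims VALUES ('C1', 100.0), ('C2', 250.5);
INSERT INTO tgt_claims VALUES ('C1', 100.0), ('C2', 250.5);
""")

def table_profile(table: str):
    count, total = conn.execute(
        f"SELECT COUNT(*), COALESCE(SUM(amount), 0) FROM {table}").fetchone()
    return count, round(total, 2)

src, tgt = table_profile("src_claims"), table_profile("tgt_claims")
assert src == tgt, f"reconciliation failed: source={src} target={tgt}"
print("row counts and amount totals match:", tgt)
```

The same count/checksum comparison, parameterized per table, is what makes a test suite "configurable and reusable across modules" as the posting asks.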
Posted 1 month ago
6.0 - 10.0 years
20 - 30 Lacs
Pune, Gurugram, Bengaluru
Work from Office
Lead o9 platform implementations, configure solutions, build Python plugins, integrate via SSIS/T-SQL, support demand/supply planning modules, perform fit-gap analysis, and drive end-to-end delivery in a hybrid work setup.

Required Candidate Profile: 6–8 years of total experience, with 4+ years in o9/SCM; strong in configuration, Python plugins, SSIS, T-SQL, IBPL, and SCM modules; excellent communication and client interaction skills.
Posted 1 month ago
7.0 - 12.0 years
15 - 19 Lacs
Mumbai, Delhi / NCR, Bengaluru
Work from Office
Job Title: SF Lead Developer
Experience: 7-15 Years
Location: Remote, Chennai, Hyderabad, Kolkata, Pune, Ahmedabad

iSource Services is hiring for one of their clients for the position of SF Lead Developer.

Job Description:
We are looking for a skilled and experienced SF Lead Developer to join our team. The ideal candidate will have proficiency in Salesforce platform setup, configuration, and customization. The role will involve working on Salesforce declarative tools, data management, automation, and scripting, alongside Salesforce integration and data migration.

Key Responsibilities:

1. Salesforce Platform Expertise:
- Proficient in Salesforce setup, configuration, and declarative customization
- Strong knowledge of Salesforce declarative tools, including UI configuration, Flow, and validation rules
- Experience with Salesforce data modeling, object relationships, custom metadata types, and field types
- Understanding of Salesforce security concepts such as the data visibility hierarchy, profiles, roles, and sharing rules

2. Data Management:
- Knowledgeable in data migration and integration tools (Data Loader, ETL tools)
- Ability to understand and apply data cleansing, de-duplication, and validation processes (see the sketch after this posting)
- Proficient in creating and managing reports and dashboards in Salesforce

3. Automation and Scripting (for the Developer Consultant role):
- Experience with Salesforce automation tools (Apex triggers, Lightning Web Components, Batch Apex, Apex classes, etc.)
- Familiarity with scripting languages such as Apex and JavaScript

Preferred Certifications:
- Salesforce Certified Administrator
- Salesforce Certified Platform App Builder
- Salesforce Certified Platform Developer I

Desired Candidate Profile:
- 7-15 years of experience in Salesforce development and platform expertise
- Strong knowledge of Salesforce declarative tools and development skills in Apex and JavaScript
- Strong understanding of data migration and integration processes
- Excellent problem-solving, communication, and collaboration skills
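As a hedged sketch of the scripted data-management work above (the Data Loader style tasks), here is a de-duplication pass using the third-party simple-salesforce library — an assumption about tooling, not a requirement from the posting. Credentials are placeholders.

```python
# Illustrative Salesforce query + de-duplication using simple-salesforce.
# Username, password, and token are placeholders; do not hard-code real ones.
from simple_salesforce import Salesforce

sf = Salesforce(username="user@example.com",
                password="***",
                security_token="***")

# Pull contacts via SOQL, then collapse duplicates by normalized email.
result = sf.query("SELECT Id, Email FROM Contact WHERE Email != null LIMIT 200")
seen: dict[str, str] = {}
for rec in result["records"]:
    seen.setdefault(rec["Email"].lower(), rec["Id"])  # keep first Id per email

print(f"{len(result['records'])} contacts, {len(seen)} unique emails")
# A real cleanse would merge or delete the surplus records via the Bulk API.
```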
Posted 1 month ago
10.0 - 15.0 years
25 - 30 Lacs
Hyderabad
Hybrid
We are seeking a Senior Manager - Pricing Analytics for the pricing team at Thomson Reuters. The central pricing team works with pricing managers, business units, product marketing managers, finance, and sales on price execution for new product launches, maintenance of existing ones, and the creation and maintenance of data products for reporting and analytics. The team is responsible for providing product and pricing information globally to all internal stakeholders and for collaborating with upstream and downstream teams to ensure offer pricing readiness. Beyond BAU, the team works on various automation and pricing transformation projects and pricing analytics initiatives.

About the Role:
In this role as a Senior Manager - Pricing Analytics, you will:
- Lead and mentor a team of pricing analysts, data engineers, and BI developers
- Drive operational excellence by fostering a culture of data quality, accountability, and continuous improvement
- Manage team capacity, project prioritization, and cross-functional coordination with Segment Pricing, Finance, Sales, and Analytics teams
- Partner closely with the pricing team to translate business objectives into actionable analytics deliverables
- Drive insights on pricing performance, discounting trends, segmentation, and monetization opportunities (see the sketch after this posting)
- Oversee the design and execution of robust ETL pipelines that consolidate data from multiple sources (e.g., Salesforce, EMS, UNISON, SAP, Pendo, product usage platforms)
- Ensure delivery of intuitive, self-service dashboards and reports that track key pricing KPIs, sales performance, and customer behaviour
- Strategize, deploy, and promote a scalable analytics architecture and best practices in data governance, modelling, and visualization
- Act as a trusted advisor to pricing leadership by delivering timely, relevant, and accurate data insights
- Collaborate with analytics, finance, segment pricing, and data platform teams to align on data availability, definitions, and architecture

Shift Timings: 2 PM to 11 PM (IST)
Work from office 2 days a week (mandatory)

About You:
You're a fit for the role of Senior Manager - Pricing Analytics if your background includes:
- 10+ years of experience in analytics, data science, or business intelligence, with 3+ years in a people leadership or managerial role
- Proficiency in SQL, ETL tools (e.g., Alteryx, dbt, Airflow), and BI platforms (e.g., Tableau, Power BI, Looker)
- Knowledge of Python, R, or other statistical tools is a plus
- Experience with data from Salesforce, SAP, and other CRM, ERP, or CPQ tools
- Ability to translate complex data into actionable insights and communicate effectively with senior stakeholders
- Strong understanding of data analytics, monetization metrics, and SaaS pricing practices
- Proven experience working in a B2B SaaS or software product company, preferred
- MBA or Master's in Analytics, Engineering, or a quantitative field, preferred
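For illustration, a minimal pandas sketch of the discounting-trend analysis described above. The DataFrame columns (segment, quarter, list_price, net_price) are invented for the example, not drawn from any Thomson Reuters system.

```python
# Average realized discount by segment and quarter, from per-deal prices.
import pandas as pd

deals = pd.DataFrame({
    "segment":    ["SMB", "SMB", "Enterprise", "Enterprise", "SMB", "Enterprise"],
    "quarter":    ["2024Q1", "2024Q2", "2024Q1", "2024Q2", "2024Q2", "2024Q2"],
    "list_price": [1000, 1200, 50000, 48000, 900, 52000],
    "net_price":  [900, 1020, 42500, 38400, 855, 41600],
})

# Realized discount per deal, then the mean pivoted into a trend table.
deals["discount_pct"] = 1 - deals["net_price"] / deals["list_price"]
trend = (deals.groupby(["segment", "quarter"])["discount_pct"]
              .mean()
              .unstack("quarter")
              .round(3))
print(trend)
```

A table like this, refreshed by the ETL pipeline and surfaced in a BI dashboard, is the typical shape of the self-service discounting KPI the posting describes.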
Posted 1 month ago
6.0 - 11.0 years
8 - 12 Lacs
Chennai
Hybrid
Work Mode: Hybrid
Interview Mode: Virtual (2 Rounds)
Type: Contract-to-Hire (C2H)

Job Summary:
We are looking for a skilled PySpark Developer with hands-on experience building scalable data pipelines and processing large datasets. The ideal candidate will have deep expertise in Apache Spark, Python, and modern data engineering tools in cloud environments such as AWS.

Key Skills & Responsibilities:
- Strong expertise in PySpark and Apache Spark for batch and real-time data processing
- Experience designing and implementing ETL pipelines, including data ingestion, transformation, and validation (see the sketch after this posting)
- Proficiency in Python for scripting, automation, and building reusable components
- Hands-on experience with scheduling tools like Airflow or Control-M to orchestrate workflows
- Familiarity with the AWS ecosystem, especially S3 and related file system operations
- Strong understanding of Unix/Linux environments and shell scripting
- Experience with Hadoop, Hive, and platforms like Cloudera or Hortonworks
- Ability to handle CDC (Change Data Capture) operations on large datasets
- Experience in performance tuning, optimizing Spark jobs, and troubleshooting
- Strong knowledge of data modeling, data validation, and writing unit test cases
- Exposure to real-time and batch integration with downstream/upstream systems
- Working knowledge of Jupyter Notebook, Zeppelin, or PyCharm for development and debugging
- Understanding of Agile methodologies, with experience in CI/CD tools (e.g., Jenkins, Git)

Preferred Skills:
- Experience building or integrating APIs for data provisioning
- Exposure to ETL or reporting tools such as Informatica, Tableau, Jasper, or QlikView
- Familiarity with AI/ML model development using PySpark in cloud environments

Skills: PySpark, Apache Spark, Python, SQL, ETL pipelines, ETL tools (Informatica), Airflow, Control-M, AWS S3, Hadoop, Hive, Cloudera, Hortonworks, Unix/Linux, shell scripting, CDC, data modeling, data validation, performance tuning, unit test cases, batch and real-time integration, API integration, Agile methodologies, CI/CD (Jenkins, Git), Jupyter Notebook, Zeppelin, PyCharm, Tableau, QlikView, Jasper, AI/ML model development
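For illustration, a minimal PySpark sketch of the ingest-transform-validate-load pipeline this posting describes. Paths, column names, and the quality rules are illustrative assumptions, not a definitive implementation.

```python
# Illustrative PySpark ETL: ingest CSV, normalize types, validate, write Parquet.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

# Ingest: read raw CSV. An S3 path such as s3a://bucket/raw/orders/ works
# the same way, given the Hadoop AWS connector on the classpath.
raw = spark.read.csv("raw/orders.csv", header=True, inferSchema=True)

# Transform: normalize types and derive a load date for partitioning.
orders = (raw
          .withColumn("order_ts", F.to_timestamp("order_ts"))
          .withColumn("amount", F.col("amount").cast("double"))
          .withColumn("load_date", F.current_date()))

# Validate: keep rows passing basic quality rules; count what was rejected.
valid = orders.filter(F.col("order_id").isNotNull() & (F.col("amount") >= 0))
rejected = orders.subtract(valid)
print(f"valid={valid.count()} rejected={rejected.count()}")

# Load: write partitioned Parquet for downstream consumers.
valid.write.mode("overwrite").partitionBy("load_date").parquet("curated/orders")
```

In production the same script would typically be an Airflow or Control-M task, with the reject counts driving alerting.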
Posted 1 month ago
5.0 - 10.0 years
20 - 35 Lacs
Pune, Gurugram
Work from Office
In one sentence
We are seeking a skilled Database Migration Specialist with deep expertise in mainframe modernization and data migration to cloud platforms such as AWS, Azure, or GCP. The ideal candidate will have hands-on experience migrating legacy systems (COBOL, DB2, IMS, VSAM, etc.) to modern cloud-native databases such as PostgreSQL, Oracle, or NoSQL.
What will your job look like?
- Lead and execute end-to-end mainframe-to-cloud database migration projects
- Analyze legacy systems (z/OS, Unisys) and design modern data architectures
- Extract, transform, and load (ETL) complex datasets, ensuring data integrity and taxonomy alignment (a reconciliation sketch follows this listing)
- Collaborate with cloud architects and application teams to ensure seamless integration
- Optimize performance and scalability of migrated databases
- Document migration processes, tools, and best practices
Required Skills & Experience
- 5+ years in mainframe systems (COBOL, CICS, DB2, IMS, JCL, VSAM, Datacom)
- Proven experience in cloud migration (AWS DMS, Azure Data Factory, GCP Dataflow, etc.)
- Strong knowledge of ETL tools, data modeling, and schema conversion
- Experience with PostgreSQL, Oracle, or other cloud-native databases
- Familiarity with data governance, security, and compliance in cloud environments
- Excellent problem-solving and communication skills
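One common data-integrity step in such migrations is reconciling row counts between the legacy source and the cloud target. Below is a minimal sketch, assuming PostgreSQL as the target and source counts exported from the mainframe as a batch report; the table names, counts, and connection string are illustrative.

```python
import psycopg2  # target side: PostgreSQL

# Hypothetical reconciliation input: table -> row count reported by the
# legacy system's extract job.
source_counts = {"POLICY": 1_204_332, "CLAIM": 8_977_410}

conn = psycopg2.connect("dbname=target user=etl")  # illustrative DSN
mismatches = []
with conn, conn.cursor() as cur:
    for table, expected in source_counts.items():
        cur.execute(f'SELECT COUNT(*) FROM "{table}"')
        actual = cur.fetchone()[0]
        if actual != expected:
            mismatches.append((table, expected, actual))

# Surface discrepancies instead of silently accepting the load.
for table, expected, actual in mismatches:
    print(f"{table}: source={expected} target={actual}")
```

Row counts are only a first-pass check; real migrations usually add checksums or column-level profiling on top.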
Posted 1 month ago
6.0 - 8.0 years
5 - 8 Lacs
Mumbai
Hybrid
Work Mode: Hybrid
Interview Mode: Virtual (2 Rounds)
Type: Contract-to-Hire (C2H)
Job Summary
We are looking for a skilled PySpark Developer with hands-on experience building scalable data pipelines and processing large datasets. The ideal candidate will have deep expertise in Apache Spark and Python, and experience with modern data engineering tools in cloud environments such as AWS.
Key Skills & Responsibilities
- Strong expertise in PySpark and Apache Spark for batch and real-time data processing
- Experience designing and implementing ETL pipelines, including data ingestion, transformation, and validation
- Proficiency in Python for scripting, automation, and building reusable components
- Hands-on experience with scheduling tools such as Airflow or Control-M to orchestrate workflows
- Familiarity with the AWS ecosystem, especially S3 and related file system operations
- Strong understanding of Unix/Linux environments and shell scripting
- Experience with Hadoop, Hive, and platforms such as Cloudera or Hortonworks
- Ability to handle CDC (Change Data Capture) operations on large datasets
- Experience in performance tuning, optimizing Spark jobs, and troubleshooting
- Strong knowledge of data modeling, data validation, and writing unit test cases
- Exposure to real-time and batch integration with downstream/upstream systems
- Working knowledge of Jupyter Notebook, Zeppelin, or PyCharm for development and debugging
- Understanding of Agile methodologies, with experience in CI/CD tools (e.g., Jenkins, Git)
Preferred Skills
- Experience building or integrating APIs for data provisioning
- Exposure to ETL or reporting tools such as Informatica, Tableau, Jasper, or QlikView
- Familiarity with AI/ML model development using PySpark in cloud environments
Skills: PySpark, Apache Spark, Python, ETL pipelines, CDC, AWS, S3, Airflow, Control-M, SQL, Unix/Linux, shell scripting, Hadoop, Hive, Cloudera, Hortonworks, data modeling, data validation, performance tuning, unit test cases, real-time and batch integration, API integration, Jupyter Notebook, Zeppelin, PyCharm, Agile methodologies, CI/CD, Jenkins, Git, Informatica, Tableau, Jasper, QlikView, AI/ML model development
Posted 1 month ago
4.0 - 9.0 years
14 - 24 Lacs
Chennai, Bengaluru, Delhi / NCR
Work from Office
ETL + GCP or SQL + GCP
• Minimum 3+ years of application development experience
• Should be strong in any ETL tool
• Should be strong in shell scripting
• Should be very strong in SQL
• Should have good hands-on experience in GCP (Google Cloud Platform); a minimal ELT sketch follows this listing
• Exposure to an ELT framework is an added advantage
Healthcare knowledge preferred. Should be flexible to work during overlap hours up to 8:30 PM IST based on business requirements.
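As a rough illustration of the ELT pattern on GCP, here is a minimal sketch using the google-cloud-bigquery Python client. The bucket, project, dataset, and table names are placeholders, not a real environment.

```python
from google.cloud import bigquery

client = bigquery.Client()

# Extract/Load: land raw CSV files from Cloud Storage into a staging table.
load_job = client.load_table_from_uri(
    "gs://example-bucket/raw/claims/*.csv",
    "example_project.staging.claims_raw",
    job_config=bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,
        autodetect=True,
    ),
)
load_job.result()  # wait for the load to finish

# Transform inside the warehouse (the "T" of ELT).
client.query(
    """
    CREATE OR REPLACE TABLE example_project.curated.claims AS
    SELECT claim_id, member_id, CAST(paid_amount AS NUMERIC) AS paid_amount
    FROM example_project.staging.claims_raw
    WHERE claim_id IS NOT NULL
    """
).result()
```

The key design choice in ELT is that the transformation runs as SQL inside BigQuery rather than in an external ETL engine.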
Posted 1 month ago
7.0 - 12.0 years
7 - 17 Lacs
Bengaluru
Work from Office
In this role, you will:
- Act as an advisor to leadership to develop or influence applications, network, information security, database, operating systems, or web technologies for highly complex business and technical needs across multiple groups
- Lead the strategy and resolution of highly complex and unique challenges requiring in-depth evaluation across multiple areas or the enterprise, delivering long-term, large-scale solutions that require vision, creativity, innovation, and advanced analytical and inductive thinking
- Translate advanced technology experience, in-depth knowledge of the organization's tactical and strategic business objectives, the enterprise technological environment, the organization structure, and strategic technological opportunities and requirements into technical engineering solutions
- Provide vision, direction, and expertise to leadership on implementing innovative and significant business solutions
- Maintain knowledge of industry best practices and new technologies, and recommend innovations that enhance operations or provide a competitive advantage to the organization
- Strategically engage with all levels of professionals and managers across the enterprise and serve as an expert advisor to leadership
Required Qualifications:
- 7+ years of Engineering experience, or equivalent demonstrated through one or a combination of the following: work experience, training, military experience, education
Desired Qualifications:
- Strong experience working in GCP cloud transformation and implementation
- Experience working with Ab Initio, Informatica ETL tools or equivalent, Teradata, Google BigQuery
- Well versed in data warehousing methodologies, dimensional modelling, and tuning
- Strong communication and interpersonal skills
Cloud Strategy/Transition
- Lead the Google Cloud Platform (GCP) transformation and implementation initiative for Commercial Bank
- Partner with architects and data leaders on the cloud migration path
- Design and run POCs on cloud platforms for various use cases
- Evaluate and share progress on the cloud transition on a regular, periodic basis
- Stay on top of the enterprise cloud strategy
Data Environment Transformation/Simplification
- Lead application transformation and rationalization initiatives
- Analyze performance trends and recommend process improvements; assess changes for risk to production systems and assure quality, security, and compliance requirements
- Evaluate, incubate, and adopt modern technologies and engineering practices, and recommend innovations that provide a competitive advantage to the organization
- Develop strategies to improve developer productivity and reduce technology debt
- Develop a strategy to improve data quality and the maintenance of data lineage
Architecture Oversight
- Develop a consistent architecture strategy and deliver safe, secure, and consistent architecture solutions
- Reduce technology risk by working closely with architects to design solutions that align with the architecture roadmap and enterprise principles, policies, and standards
- Partner with the enterprise cloud platform, CI/CD pipeline, and platform teams, architects, engineering managers, and the developer community
Job Expectations:
- Act as a liaison between business and technical organizations by planning, conducting, and directing the analysis of highly complex business problems to be solved with automated systems
- Provide technical assistance in identifying, evaluating, and developing systems and procedures that are cost effective and meet business requirements
- Act as an internal consultant within technology and business groups by using quality tools and process definition/improvement to re-engineer technical processes for greater efficiency
- Candidate must be based out of the posted location (Hyderabad/Bangalore) and will be required to work in the office per the organization's In Office Adherence / Return To Office (RTO) policy
Posted 1 month ago
1.0 - 3.0 years
3 - 5 Lacs
Hyderabad
Work from Office
What you will do
In this vital role you will be responsible for designing, building, maintaining, analyzing, and interpreting data to provide actionable insights that drive business decisions. The role involves working with large datasets, developing reports, supporting and performing data governance initiatives, and visualizing data to ensure data is accessible, reliable, and efficiently managed. The ideal candidate has deep technical skills, experience with big data technologies, and a deep understanding of data architecture and ETL processes.
Roles & Responsibilities:
- Design, develop, and maintain data solutions for data generation, collection, and processing
- Be a crucial team member assisting in the design and development of the data pipeline
- Build data pipelines and ensure data quality by implementing ETL processes to migrate and deploy data across systems (a validation sketch follows this listing)
- Contribute to the design, development, and implementation of data pipelines, ETL/ELT processes, and data integration solutions
- Take ownership of data pipeline projects from inception to deployment; manage scope, timelines, and risks
- Collaborate with cross-functional teams to understand data requirements and design solutions that meet business needs
- Develop and maintain data models, data dictionaries, and other documentation to ensure data accuracy and consistency
- Implement data security and privacy measures to protect sensitive data
- Leverage cloud platforms (AWS preferred) to build scalable and efficient data solutions
- Collaborate and communicate effectively with product teams
- Collaborate with Data Architects, Business SMEs, and Data Scientists to design and develop end-to-end data pipelines that meet fast-paced business needs across geographic regions
- Identify and resolve complex data-related challenges
- Adhere to best practices for coding, testing, and designing reusable code/components
- Explore new tools and technologies that will help improve ETL platform performance
- Participate in sprint planning meetings and provide estimates for technical implementation
Basic Qualifications:
- Master's degree and 1 to 3 years of Computer Science, IT, or related field experience, OR
- Bachelor's degree and 3 to 5 years of Computer Science, IT, or related field experience, OR
- Diploma and 7 to 9 years of Computer Science, IT, or related field experience
Preferred Qualifications:
Must-Have Skills:
- Hands-on experience with big data technologies and platforms such as Databricks and Apache Spark (PySpark, SparkSQL), workflow orchestration, and performance tuning for big data processing
- Proficiency in data analysis tools (e.g., SQL) and experience with data visualization tools
- Excellent problem-solving skills and the ability to work with large, complex datasets
- Solid understanding of data governance frameworks, tools, and best practices; knowledge of data protection regulations and compliance requirements
Good-to-Have Skills:
- Experience with ETL tools such as Apache Spark and various Python packages for data processing and machine learning model development
- Good understanding of data modeling, data warehousing, and data integration concepts
- Knowledge of Python/R, Databricks, SageMaker, and cloud data platforms
Professional Certifications:
- Certified Data Engineer / Data Analyst (preferably on Databricks or cloud environments)
Soft Skills:
- Excellent critical-thinking and problem-solving skills
- Good communication and collaboration skills
- Demonstrated awareness of how to function in a team setting
- Demonstrated presentation skills
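To illustrate the data-quality side of such pipelines, here is a minimal PySpark sketch that splits rows into passed and quarantined sets instead of silently dropping failures. The input path, columns, and rules are assumptions for illustration.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("dq_checks").getOrCreate()

# Illustrative input path and columns.
df = spark.read.parquet("/mnt/example/curated/members/")

# Each rule is a boolean predicate; rows violating any rule are quarantined
# rather than dropped, so quality issues stay visible and auditable.
valid = (
    F.col("member_id").isNotNull()
    & F.col("birth_date").isNotNull()
    & (F.col("birth_date") <= F.current_date())
)

passed = df.filter(valid)
quarantined = df.filter(~valid)

print(f"passed={passed.count()} quarantined={quarantined.count()}")
```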
Posted 1 month ago
7.0 - 10.0 years
10 - 20 Lacs
Bengaluru
Work from Office
Job location: Bangalore
Experience: 7-10 years
Job Description
- Must have hands-on experience (minimum 6-8 years) in SnapLogic pipeline development with good debugging skills
- Experience migrating ETL jobs to SnapLogic, platform moderation, and cloud exposure on AWS
- SnapLogic developer certification is good to have; hands-on experience in Snowflake is a plus (a post-load check sketch follows this listing)
- Should be strong in SQL, PL/SQL, and RDBMS
- Should be strong in ETL tools such as DataStage or Informatica, with a focus on data quality
- Proficiency in configuring SnapLogic components, including snaps, pipelines, and transformations
- Design and develop data integration pipelines using the SnapLogic platform to connect various systems, applications, and data sources
- Build and configure SnapLogic components such as snaps, pipelines, and transformations to handle data transformation, cleansing, and mapping
- Experience designing, developing, and deploying reliable solutions
- Ability to work with business partners and provide long-lasting solutions
- Stay updated with the latest SnapLogic features, enhancements, and best practices to leverage the platform effectively
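SnapLogic pipelines themselves are configured in the platform's designer rather than written as code, but a downstream check in Snowflake can be scripted. A minimal sketch using the snowflake-connector-python package; the account, credentials, and table names are placeholders.

```python
import snowflake.connector  # assumes snowflake-connector-python is installed

# Placeholder credentials; in practice these come from a secrets manager.
conn = snowflake.connector.connect(
    account="example_account",
    user="etl_user",
    password="***",
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="STAGING",
)

cur = conn.cursor()
try:
    # A post-load data quality check of the kind a SnapLogic pipeline
    # might trigger after landing data into Snowflake.
    cur.execute("SELECT COUNT(*) FROM ORDERS WHERE ORDER_ID IS NULL")
    null_keys = cur.fetchone()[0]
    if null_keys:
        raise ValueError(f"{null_keys} rows loaded with NULL ORDER_ID")
finally:
    cur.close()
    conn.close()
```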
Posted 1 month ago