
1781 Data Architecture Jobs - Page 37

JobPe aggregates results for easy application access, but you actually apply on the job portal directly.

10.0 - 15.0 years

25 - 30 Lacs

Bengaluru

Work from Office

Req ID: 328244 NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking an APAC Presales Solution Architect to join our team in Bangalore, Karnataka (IN-KA), India (IN). Job Duties: Senior Data and AI Architect - Presales Grade 11 Seeking a senior data solution architect to work closely with India and APAC sales teams on technical solutioning and presales work. The Consultant is a seasoned expert who is responsible for participating in the delivery of multi-technology consulting services to clients by providing strategies and solutions on all aspects of infrastructure and related technology components. This role collaborates with other stakeholders on the development of the architectural approach for one or more layers of a solution. The primary objective of this role is to work on strategic projects that ensure the optimal functioning of the client's technology infrastructure. Key Responsibilities: Ability and experience to hold conversations with CEOs, business owners, and CTOs/CDOs. Break down intricate business challenges, devise effective solutions, and focus on client needs. Craft high-level, innovative solution approaches for complex business problems. Utilize best practices and creativity to address challenges. Leverage market research, formulate perspectives, and communicate insights to clients. Establish strong client relationships. Interact at appropriate levels to ensure client satisfaction. Minimum Skills Required: Knowledge and Attributes: Ability to focus on detail with an understanding of how it impacts the business strategically. Excellent client service orientation. Ability to work in high-pressure situations. Ability to establish and manage processes and practices through collaboration and the understanding of business. Ability to create new and repeat business for the organization. 
Ability to contribute information on relevant vertical markets. Ability to contribute to the improvement of internal effectiveness by improving current methodologies, processes and tools. Academic Qualifications and Certifications: BE/BTech or equivalent in Information Technology and/or Business Management or a related field. Scaled Agile certification desirable. Relevant consulting and technical certifications preferred, for example TOGAF. Required Experience: 12-15 years of seasoned, demonstrable experience in a similar role within a large-scale (preferably multi-national) technology services environment. Very good understanding of Data, AI, GenAI and Agentic AI. Must have Data Architecture and Solutioning experience. Capable of E2E Data Architecture and GenAI solution design. Must be able to work on Data & AI RFP responses as Solution Architect. 10+ years of experience in solution architecting of Data & Analytics, AI/ML and GenAI as a Technical Architect. Develop on-prem and cloud-native technical approaches and proposal plans, identifying best-practice solutions that meet the requirements for a successful proposal. Create, edit, and review documents, diagrams, and other artifacts in response to RFPs and RFQs, and contribute to and participate in presentations to customers regarding proposed solutions. Experience with large-scale consulting and program execution engagements in AI and data. Seasoned multi-technology infrastructure design experience. A seasoned, demonstrable level of expertise coupled with consulting and client engagement experience, demonstrating good experience in client needs assessment and change management. Additional Job Description Knowledge and application: Seasoned, experienced professional; has complete knowledge and understanding of the area of specialization. Uses evaluation, judgment, and interpretation to select the right course of action. 
Problem solving: Works on problems of diverse scope where analysis of information requires evaluation of identifiable factors. Resolves and assesses a wide range of issues in creative ways and suggests variations in approach. Interaction: Enhances relationships and networks with senior internal/external partners who are not familiar with the subject matter.

Posted 1 month ago

Apply

8.0 - 12.0 years

19 - 22 Lacs

Kolkata

Remote

Type: Contract (8-12 Months) Role Overview : We are seeking a highly skilled and experienced Senior Data Architect to join our team on a contract basis. This role will be pivotal in designing and implementing robust data architectures, ensuring data governance, and driving data-driven insights. The ideal candidate will possess deep expertise in MS Dynamics, data lake architecture, ETL processes, data modeling, and data integration. You will collaborate closely with stakeholders to understand their data needs and translate them into scalable and efficient solutions. Responsibilities : Data Architecture Design and Development: - Design and implement comprehensive data architectures, including data lakes, data warehouses, and data integration strategies. - Develop and maintain conceptual, logical, and physical data models. - Define and enforce data standards, policies, and procedures. - Evaluate and select appropriate data technologies and tools. - Ensure scalability, performance, and security of data architectures. - MS Dynamics and Data Lake Integration : - Lead the integration of MS Dynamics with data lake environments. - Design and implement data pipelines for efficient data movement between systems. - Troubleshoot and resolve integration issues. - Optimize data flow and performance within the integrated environment. ETL and Data Integration : - Design, develop, and implement ETL processes for data extraction, transformation, and loading. - Ensure data quality and consistency throughout the integration process. - Develop and maintain data integration documentation. - Implement data validation and error handling mechanisms. Data Modeling and Data Governance : - Develop and maintain data models that align with business requirements. - Implement and enforce data governance policies and procedures. - Ensure data security and compliance with relevant regulations. - Establish and maintain data dictionaries and metadata repositories. 
Issue Resolution and Troubleshooting : - Proactively identify and resolve architectural issues. - Conduct root cause analysis and implement corrective actions. - Provide technical guidance and support to development teams. - Communicate issues and risks proactively. Collaboration and Communication : - Collaborate with stakeholders to understand data requirements and translate them into technical solutions. - Communicate effectively with technical and non-technical audiences. - Participate in design reviews and code reviews. - Work well both as an individual contributor and as a team player. Qualifications : Experience : - 8-12 years of hands-on experience in data architecture and related fields. - Minimum 4 years of experience in architectural design and integration. - Experience working with cloud-based data solutions. Technical Skills : - Strong expertise in MS Dynamics and data lake architecture. - Proficiency in ETL tools and techniques (e.g., Azure Data Factory, SSIS). - Expertise in data modeling techniques (e.g., dimensional modeling, relational modeling). - Strong understanding of data warehousing concepts and best practices. - Proficiency in SQL and other data query languages. - Experience with data quality assurance and data governance. - Experience with cloud platforms such as Azure or AWS. Soft Skills : - Strong analytical and problem-solving skills. - Excellent communication and interpersonal skills. - Ability to work independently and as part of a team. - Flexible and adaptable to changing priorities. - Proactive and self-motivated. - Ability to deal with ambiguity. - Open to continuous learning. - Self-confident and humble. - An intelligent, rigorous thinker who can operate successfully among bright people.
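The "data validation and error handling mechanisms" this posting asks for usually come down to checking each extracted record against a set of rules and quarantining failures instead of loading them. A minimal Python sketch of that pattern; the field names and rules here are illustrative assumptions, not taken from any specific system:

```python
# Minimal sketch of row-level validation with a quarantine for failed records,
# the kind of error-handling step an ETL pipeline applies before loading.
# Field names ("customer_id", "amount") and rules are illustrative assumptions.

def validate_row(row):
    """Return a list of validation errors for one extracted record."""
    errors = []
    if not row.get("customer_id"):
        errors.append("missing customer_id")
    if not isinstance(row.get("amount"), (int, float)) or row["amount"] < 0:
        errors.append("amount must be a non-negative number")
    return errors

def load_with_quarantine(rows):
    """Split records into loadable rows and a quarantine of failures."""
    loaded, quarantined = [], []
    for row in rows:
        errors = validate_row(row)
        if errors:
            quarantined.append({"row": row, "errors": errors})  # keep for triage
        else:
            loaded.append(row)
    return loaded, quarantined

loaded, quarantined = load_with_quarantine([
    {"customer_id": "C1", "amount": 120.0},
    {"customer_id": "", "amount": -5},
])
```

Keeping the failed rows with their error messages, rather than silently dropping them, is what makes the later "root cause analysis" responsibility tractable.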

Posted 1 month ago

Apply

8.0 - 12.0 years

19 - 22 Lacs

Mumbai

Remote

Type: Contract (8-12 Months) Role Overview : We are seeking a highly skilled and experienced Senior Data Architect to join our team on a contract basis. This role will be pivotal in designing and implementing robust data architectures, ensuring data governance, and driving data-driven insights. The ideal candidate will possess deep expertise in MS Dynamics, data lake architecture, ETL processes, data modeling, and data integration. You will collaborate closely with stakeholders to understand their data needs and translate them into scalable and efficient solutions. Responsibilities : Data Architecture Design and Development: - Design and implement comprehensive data architectures, including data lakes, data warehouses, and data integration strategies. - Develop and maintain conceptual, logical, and physical data models. - Define and enforce data standards, policies, and procedures. - Evaluate and select appropriate data technologies and tools. - Ensure scalability, performance, and security of data architectures. - MS Dynamics and Data Lake Integration : - Lead the integration of MS Dynamics with data lake environments. - Design and implement data pipelines for efficient data movement between systems. - Troubleshoot and resolve integration issues. - Optimize data flow and performance within the integrated environment. ETL and Data Integration : - Design, develop, and implement ETL processes for data extraction, transformation, and loading. - Ensure data quality and consistency throughout the integration process. - Develop and maintain data integration documentation. - Implement data validation and error handling mechanisms. Data Modeling and Data Governance : - Develop and maintain data models that align with business requirements. - Implement and enforce data governance policies and procedures. - Ensure data security and compliance with relevant regulations. - Establish and maintain data dictionaries and metadata repositories. 
Issue Resolution and Troubleshooting : - Proactively identify and resolve architectural issues. - Conduct root cause analysis and implement corrective actions. - Provide technical guidance and support to development teams. - Communicate issues and risks proactively. Collaboration and Communication : - Collaborate with stakeholders to understand data requirements and translate them into technical solutions. - Communicate effectively with technical and non-technical audiences. - Participate in design reviews and code reviews. - Work well both as an individual contributor and as a team player. Qualifications : Experience : - 8-12 years of hands-on experience in data architecture and related fields. - Minimum 4 years of experience in architectural design and integration. - Experience working with cloud-based data solutions. Technical Skills : - Strong expertise in MS Dynamics and data lake architecture. - Proficiency in ETL tools and techniques (e.g., Azure Data Factory, SSIS). - Expertise in data modeling techniques (e.g., dimensional modeling, relational modeling). - Strong understanding of data warehousing concepts and best practices. - Proficiency in SQL and other data query languages. - Experience with data quality assurance and data governance. - Experience with cloud platforms such as Azure or AWS. Soft Skills : - Strong analytical and problem-solving skills. - Excellent communication and interpersonal skills. - Ability to work independently and as part of a team. - Flexible and adaptable to changing priorities. - Proactive and self-motivated. - Ability to deal with ambiguity. - Open to continuous learning. - Self-confident and humble. - An intelligent, rigorous thinker who can operate successfully among bright people.

Posted 1 month ago

Apply

3.0 - 6.0 years

7 - 11 Lacs

Hyderabad

Work from Office

We are looking to hire Snowflake professionals in the following areas : Senior Snowflake Developer : Responsible for designing and implementing data pipelines, ETL processes, and data modeling in Snowflake. Responsible for translating business requirements into ELT pipelines using data replication tools and data transformation tools (such as DBT) or advanced SQL scripting (views, Snowflake stored procedures, UDFs). Deep understanding of Snowflake architecture and processing. Experience with performance tuning of Snowflake data warehouses. Experience working with Snowflake functions; hands-on experience with Snowflake utilities, stage and file upload features, Time Travel, and Fail-safe. Responsible for development, deployment, code reviews, and production support. Maintain and implement best practices for Snowflake infrastructure. Hands-on in complex SQL and parsing complex data sets. Primary Skills: Must have 4 to 6 years in IT, 3+ years working as a Snowflake Developer, and 5+ years in data warehouse, ETL, and BI projects. Must have experience in at least one complex implementation of a Snowflake data warehouse, plus hands-on DBT experience. Expertise in Snowflake data modeling, ELT using Snowflake SQL or modern data replication tools, Snowflake stored procedures / UDFs / advanced SQL scripting, and standard data lake / data warehouse concepts. Expertise in Snowflake advanced concepts such as setting up resource monitors, RBAC controls, virtual warehouse sizing, query performance tuning, zero-copy cloning, and Time Travel. Expertise in deploying Snowflake features such as data sharing, events, and lakehouse patterns. Hands-on experience with Snowflake utilities, SnowSQL, Snowpipe, and big data modeling techniques. Deep understanding of relational data stores, methods, and approaches (star and snowflake schemas, dimensional modeling). Hands-on experience with DBT Core or DBT Cloud, including dev and prod deployment using CI/CD (Bitbucket), is a plus. 
Should be able to develop and maintain documentation of the data architecture, data flow, and data models of the data warehouse. Good communication skills. Python and API experience is a plus. Our Hyperlearning workplace is grounded upon four principles: flexible work arrangements, free spirit, and emotional positivity; agile self-determination, trust, transparency, and open collaboration; all support needed for the realization of business goals; and stable employment with a great atmosphere and ethical corporate culture.
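The ELT work described above (DBT incremental models, Snowpipe-staged loads) generally reduces to a high-water-mark filter: only load source rows newer than the newest row already in the target. A minimal Python sketch of that idea; the in-memory "tables" and the `updated_at` cursor column are illustrative assumptions:

```python
# Minimal sketch of the high-water-mark logic behind incremental ELT loads,
# the pattern DBT incremental models express in SQL. The in-memory "tables"
# and the cursor column name are illustrative assumptions.

def incremental_load(source_rows, target_rows, cursor_column="updated_at"):
    """Append only source rows newer than the target's current high-water mark."""
    high_water = max((r[cursor_column] for r in target_rows), default=None)
    new_rows = [
        r for r in source_rows
        if high_water is None or r[cursor_column] > high_water
    ]
    return target_rows + new_rows

target = [{"id": 1, "updated_at": "2024-01-01"}]
source = [
    {"id": 1, "updated_at": "2024-01-01"},  # at the high-water mark: skipped
    {"id": 2, "updated_at": "2024-02-15"},  # newer: appended
]
result = incremental_load(source, target)
```

The point of the pattern is that each run only touches new data, which is what makes warehouse-side transformations cheap enough to schedule frequently.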

Posted 1 month ago

Apply

8.0 - 12.0 years

16 - 25 Lacs

Bengaluru

Hybrid

Your tasks As a team member of the Group Sector Central Function Technology of ContiTech, you will be globally responsible for driving the success of applied data science and its effective usage in all areas of the organization. Design and implement innovative data-driven solutions to solve complex business problems, using cutting-edge techniques to build customer-centric platforms that reduce complexity, ensure standards, and maximize cost savings, as well as developing innovative material-driven solutions for a carbon-neutral and 100% circular product portfolio. You Consult on effectively applying Data Science methodology, technology, and good practices, complementing AI/DS platform owners, AI/DS sector owners, and AI/DS project owners. Create and execute a global Data Science/Data Architecture strategy based on the overall CT Technology strategy together with the CFO IT Data Science team. Portfolio Management: define the portfolio process and bring it into execution. Ideation Facilitator: explore with business areas to keep the use-case pipeline continuously filled. Develop best practices, guidelines, and standards for data science in collaboration with AI/DS stakeholders to build solutions. Maintain a deep understanding of the business and industry to identify emerging trends and research to actively drive technology opportunities. Work closely with cross-functional teams to identify opportunities for improving business performance and delivering impactful results. Develop and implement models and algorithms using machine learning, deep learning, natural language processing, and statistical analysis. Lead the development and implementation of data pipelines and infrastructure to support data analysis and modeling. Establish and maintain partnerships with internal and external partners. Optimize data-driven decision-making processes to enhance operational efficiency and effectiveness. Ensure data quality and integrity through rigorous validation and testing procedures. 
Mentor and train junior data scientists to build a strong and capable team. Your Qualifications University degree in Computer Science, Data Science, Mathematics, Statistics, or related fields. Extensive knowledge of data science techniques including descriptive and inferential statistics, big data and data mining technologies, and machine learning. Experience with programming languages such as Python or R. More than 8 years of dedicated experience in various data science-related roles in the industry. Experience in agile project management with internal and external partners. Strong project management skills with the ability to manage multiple projects simultaneously. Excellent communication and leadership skills. Critical thinking, organizational skills, and a problem-solving mindset. Experience in a manufacturing industry is a plus. Inspire your stakeholders and contact partners with effective communication and high intercultural sensitivity. Collaborative, team-player attitude, convinced that networking and knowledge sharing are key drivers for success. Up-to-date on the latest trends in the industry concerning classical data science, data mining, advanced analytics, etc. Proven ability to translate complex data into actionable insights for non-technical stakeholders. Demonstrated ability to work in a fast-paced, dynamic environment and adapt to changing priorities.

Posted 1 month ago

Apply

10.0 - 15.0 years

55 - 60 Lacs

Mumbai, Delhi / NCR, Bengaluru

Work from Office

Position Overview We are seeking an experienced Data Modeler/Lead with deep expertise in health plan data models and enterprise data warehousing to drive our healthcare analytics and reporting initiatives. The candidate should have hands-on experience with modern data platforms and a strong understanding of healthcare industry data standards. Key Responsibilities Data Architecture & Modeling Design and implement comprehensive data models for health plan operations, including member enrollment, claims processing, provider networks, and medical management Develop logical and physical data models that support analytical and regulatory reporting requirements (HEDIS, Stars, MLR, risk adjustment) Create and maintain data lineage documentation and data dictionaries for healthcare datasets Establish data modeling standards and best practices across the organization Technical Leadership Lead data warehousing initiatives using modern platforms like Databricks or traditional ETL tools like Informatica Architect scalable data solutions that handle large volumes of healthcare transactional data Collaborate with data engineers to optimize data pipelines and ensure data quality Healthcare Domain Expertise Apply deep knowledge of health plan operations, medical coding (ICD-10, CPT, HCPCS), and healthcare data standards (HL7, FHIR, X12 EDI) Design data models that support analytical, reporting, and AI/ML needs Ensure compliance with healthcare regulations including HIPAA/PHI, and state insurance regulations Partner with business stakeholders to translate healthcare business requirements into technical data solutions Data Governance & Quality Implement data governance frameworks specific to healthcare data privacy and security requirements Establish data quality monitoring and validation processes for critical health plan metrics Lead efforts to standardize healthcare data definitions across multiple systems and data sources. 
Required Qualifications Technical Skills 10+ years of experience in data modeling with at least 4 years focused on healthcare/health plan data Expert-level proficiency in dimensional modeling, data vault methodology, or other enterprise data modeling approaches Hands-on experience with Informatica PowerCenter/IICS or Databricks platform for large-scale data processing Strong SQL skills and experience with Oracle Exadata and cloud data warehouses (Databricks) Proficiency with data modeling tools (Hackolade, ERwin, or similar) Healthcare Industry Knowledge Deep understanding of health plan data structures including claims, eligibility, provider data, and pharmacy data Experience with healthcare data standards and medical coding systems Knowledge of regulatory reporting requirements (HEDIS, Medicare Stars, MLR reporting, risk adjustment) Familiarity with healthcare interoperability standards (HL7 FHIR, X12 EDI) Leadership & Communication Proven track record of leading data modeling projects in complex healthcare environments Strong analytical and problem-solving skills with ability to work with ambiguous requirements Excellent communication skills with ability to explain technical concepts to business stakeholders Experience mentoring team members and establishing technical standards. Preferred Qualifications Experience with Medicare Advantage, Medicaid, or Commercial health plan operations Cloud platform certifications (AWS, Azure, or GCP) Experience with real-time data streaming and modern data lake architectures Knowledge of machine learning applications in healthcare analytics Previous experience in a lead or architect role within healthcare organizations. Locations : Mumbai, Delhi / NCR, Bengaluru , Kolkata, Chennai, Hyderabad, Ahmedabad, Pune, Remote

Posted 1 month ago

Apply

8.0 - 12.0 years

10 - 14 Lacs

Mumbai, New Delhi, Bengaluru

Work from Office

We are seeking an experienced Data Architect to join our team, working with our UAE-based client. The ideal candidate should have 8-12 years of hands-on experience in data architecture, with at least 4 years in architectural design and integration. Strong expertise in MS Dynamics and data lake architecture is required, along with proficiency in ETL, data modeling, data integration, and data quality assurance. The candidate should have a strong problem-solving mindset, the ability to handle architectural issues, and experience in troubleshooting. They should also be a proactive contributor and a team player with a flexible attitude. The role requires immediate availability and the ability to work as per UAE timings. Location: Chennai, Hyderabad, Kolkata, Pune, Ahmedabad, Remote

Posted 1 month ago

Apply

8.0 - 12.0 years

12 - 22 Lacs

Hyderabad

Work from Office

We are seeking a highly experienced and self-driven Senior Data Engineer to design, build, and optimize modern data pipelines and infrastructure. This role requires deep expertise in Snowflake, DBT, Python, and cloud data ecosystems. You will play a critical role in enabling data-driven decision-making across the organization by ensuring the availability, quality, and integrity of data. Key Responsibilities: Design and implement robust, scalable, and efficient data pipelines using ETL/ELT frameworks. Develop and manage data models and data warehouse architecture within Snowflake. Create and maintain DBT models for transformation, lineage tracking, and documentation. Write modular, reusable, and optimized Python scripts for data ingestion, transformation, and automation. Collaborate closely with data analysts, data scientists, and business teams to gather and fulfill data requirements. Ensure data integrity, consistency, and governance across all stages of the data lifecycle. Monitor pipeline performance and implement optimization strategies for queries and storage. Follow best practices for data engineering including version control (Git), testing, and CI/CD integration. Required Skills and Qualifications: 8+ years of experience in Data Engineering or related roles. Deep expertise in Snowflake: schema design, performance tuning, security, and access controls. Proficiency in Python, particularly for scripting, data transformation, and workflow automation. Strong understanding of data modeling techniques (e.g., star/snowflake schema, normalization). Proven experience with DBT for building modular, tested, and documented data pipelines. Familiarity with ETL/ELT tools and orchestration platforms like Apache Airflow or Prefect. Advanced SQL skills with experience handling large and complex data sets. Exposure to cloud platforms such as AWS, Azure, or GCP and their data services. 
Preferred Qualifications: Experience implementing data quality checks and governance frameworks. Understanding of modern data stack and CI/CD pipelines for data workflows. Contributions to data engineering best practices, open-source projects, or thought leadership.
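The "data quality checks" this posting prefers usually mirror DBT's built-in `not_null` and `unique` tests: a check either passes or returns the offending rows. A minimal plain-Python sketch of both; the sample data and column names are illustrative assumptions:

```python
# Minimal sketch of two standard data quality checks, mirroring DBT's
# built-in not_null and unique tests. Sample data and column names are
# illustrative assumptions.

def check_not_null(rows, column):
    """Return offending rows where the column is missing or None."""
    return [r for r in rows if r.get(column) is None]

def check_unique(rows, column):
    """Return values that appear more than once in the column."""
    seen, dupes = set(), set()
    for r in rows:
        v = r.get(column)
        if v in seen:
            dupes.add(v)
        seen.add(v)
    return sorted(dupes)

orders = [
    {"order_id": 1, "customer": "a"},
    {"order_id": 1, "customer": None},  # duplicate id and a null customer
]
null_failures = check_not_null(orders, "customer")
dupe_ids = check_unique(orders, "order_id")
```

Wiring checks like these into CI/CD (the posting's other preference) means a pipeline run fails loudly when a model's output violates its contract, instead of propagating bad data downstream.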

Posted 1 month ago

Apply

8.0 - 12.0 years

19 - 22 Lacs

Ahmedabad

Remote

Type: Contract (8-12 Months) Role Overview : We are seeking a highly skilled and experienced Senior Data Architect to join our team on a contract basis. This role will be pivotal in designing and implementing robust data architectures, ensuring data governance, and driving data-driven insights. The ideal candidate will possess deep expertise in MS Dynamics, data lake architecture, ETL processes, data modeling, and data integration. You will collaborate closely with stakeholders to understand their data needs and translate them into scalable and efficient solutions. Responsibilities : Data Architecture Design and Development: - Design and implement comprehensive data architectures, including data lakes, data warehouses, and data integration strategies. - Develop and maintain conceptual, logical, and physical data models. - Define and enforce data standards, policies, and procedures. - Evaluate and select appropriate data technologies and tools. - Ensure scalability, performance, and security of data architectures. - MS Dynamics and Data Lake Integration : - Lead the integration of MS Dynamics with data lake environments. - Design and implement data pipelines for efficient data movement between systems. - Troubleshoot and resolve integration issues. - Optimize data flow and performance within the integrated environment. ETL and Data Integration : - Design, develop, and implement ETL processes for data extraction, transformation, and loading. - Ensure data quality and consistency throughout the integration process. - Develop and maintain data integration documentation. - Implement data validation and error handling mechanisms. Data Modeling and Data Governance : - Develop and maintain data models that align with business requirements. - Implement and enforce data governance policies and procedures. - Ensure data security and compliance with relevant regulations. - Establish and maintain data dictionaries and metadata repositories. 
Issue Resolution and Troubleshooting : - Proactively identify and resolve architectural issues. - Conduct root cause analysis and implement corrective actions. - Provide technical guidance and support to development teams. - Communicate issues and risks proactively. Collaboration and Communication : - Collaborate with stakeholders to understand data requirements and translate them into technical solutions. - Communicate effectively with technical and non-technical audiences. - Participate in design reviews and code reviews. - Work well both as an individual contributor and as a team player. Qualifications : Experience : - 8-12 years of hands-on experience in data architecture and related fields. - Minimum 4 years of experience in architectural design and integration. - Experience working with cloud-based data solutions. Technical Skills : - Strong expertise in MS Dynamics and data lake architecture. - Proficiency in ETL tools and techniques (e.g., Azure Data Factory, SSIS). - Expertise in data modeling techniques (e.g., dimensional modeling, relational modeling). - Strong understanding of data warehousing concepts and best practices. - Proficiency in SQL and other data query languages. - Experience with data quality assurance and data governance. - Experience with cloud platforms such as Azure or AWS. Soft Skills : - Strong analytical and problem-solving skills. - Excellent communication and interpersonal skills. - Ability to work independently and as part of a team. - Flexible and adaptable to changing priorities. - Proactive and self-motivated. - Ability to deal with ambiguity. - Open to continuous learning. - Self-confident and humble. - An intelligent, rigorous thinker who can operate successfully among bright people.

Posted 1 month ago

Apply

8.0 - 13.0 years

25 - 40 Lacs

Pune, Bengaluru

Hybrid

Job Title: Data Modeller Architect About Us Capco, a Wipro company, is a global technology and management consulting firm. We were awarded Consultancy of the Year in the British Bank Awards and ranked among the Top 100 Best Companies for Women in India 2022 by Avtar & Seramount. With a presence in 32 cities across the globe, we support 100+ clients across the banking, financial services, and energy sectors. We are recognized for our deep transformation execution and delivery. WHY JOIN CAPCO? You will work on engaging projects with the largest international and local banks, insurance companies, payment service providers and other key players in the industry. These projects will transform the financial services industry. MAKE AN IMPACT Innovative thinking, delivery excellence and thought leadership to help our clients transform their business. Together with our clients and industry partners, we deliver disruptive work that is changing energy and financial services. #BEYOURSELFATWORK Capco has a tolerant, open culture that values diversity, inclusivity, and creativity. CAREER ADVANCEMENT With no forced hierarchy at Capco, everyone can grow as we grow, taking their career into their own hands. DIVERSITY & INCLUSION We believe that diversity of people and perspective gives us a competitive advantage. Job Description: Data Modeler Requirement The business architecture team supports the delivery of key front-office-to-back-office (F2B) transformation priorities of the management board. The data architecture team plays the central role of defining the data model that will align the business processes, ensure data lineage and effective control, and implement client strategy and reporting solutions. This will require building strong relationships with key stakeholders and helping to deliver tangible value. Key responsibilities: - Define and manage data models used to implement solutions to automate business processes and controls. 
- Ensure the models follow the bank's data modelling standards and principles, and influence them as necessary. - Actively partner with various functional leads & teams to socialize the data models towards adoption and execution of front-to-back solutions. Skills and experience: - 8+ years in financial services, preferably strategy and solutions in the Corporate and Investment Banking domain. - Strong knowledge of transaction banking domain processes and controls for banking & trading businesses, to drive conversations with business SMEs. Experience in developing models for transaction banking products is preferable. - Knowledge of loans or cash/deposits lifecycles and/or the customer lifecycle, and of the related business data required to manage operations and analytics, is desirable. - Well-developed business requirements analysis skills, including good communication abilities (both spoken and listening) and stakeholder management at all levels.

Posted 1 month ago

Apply

2.0 - 5.0 years

7 - 11 Lacs

Bengaluru

Work from Office

Way of working - Remote: Employees will have the freedom to work remotely all through the year. These employees, who form a large majority, will come together in their base location for a week, once every quarter. Position Overview: As a Data Engineer at Swiggy, you will be at the heart of our data-driven approach, collaborating with cross-functional teams to transform raw data into actionable insights. Join us as a Data Engineer at Swiggy to contribute significantly to our data ecosystem, drive operational efficiency, and be an integral part of our data-driven journey. Your expertise will play a pivotal role in influencing our strategic decisions and reshaping the food delivery landscape. What will you get to do here? Join hands with our Data Engineering team to ensure efficient data collection, storage, and processing. Collaborate in designing and optimizing data pipelines for seamless data movement. Work jointly on data architecture decisions to enhance analytics capabilities. Dive into large, complex datasets to create efficient and optimized queries for analysis. Identify bottlenecks and optimize data processing pipelines for improved performance. Implement best practices for query optimization, ensuring swift data retrieval. Contribute to the DataOps framework, automating data processes and enhancing data quality. Implement monitoring and alerting systems to ensure smooth data operations. Collaborate with the team to develop self-serve platforms for recurring analysis. What qualities are we looking for? Bachelor's or Master's degree in Engineering, Mathematics, Statistics, or a related quantitative field. 2-5 years of data engineering experience. Proficiency (2-5 years) in SQL, R, Python, Excel, etc., for effective data manipulation. Hands-on experience with Snowflake and Spark/Databricks, adept at Query Profiles and bottleneck identification. Apply creative thinking to solve real-world problems using data-driven insights. 
Embrace a "fail fast, learn faster" approach in a dynamic, fast-paced environment. Exhibit proficient verbal and written communication skills. Thrive in an unstructured environment, demonstrating attention to detail and self-direction. Foster collaboration and partnerships across functions.
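The query-optimization and bottleneck-identification work this role describes can be illustrated with a minimal, self-contained sketch. This uses Python's built-in sqlite3 as a stand-in for a warehouse engine such as Snowflake, and the `orders` table and its columns are hypothetical examples, not anything from the posting:

```python
import sqlite3

# Sketch of index-driven query optimization; sqlite3 stands in for a
# warehouse engine, and the 'orders' table is a hypothetical example.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, city TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [(i, "Bengaluru" if i % 2 else "Gurugram", i * 1.5) for i in range(1000)],
)

query = "SELECT SUM(amount) FROM orders WHERE city = 'Bengaluru'"

# Without an index, the planner scans the whole table.
plan_before = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()
print(plan_before)  # expect a full scan of 'orders'

# An index on the filter column lets the planner seek instead of scan,
# the same idea a Snowflake Query Profile would surface as a bottleneck.
conn.execute("CREATE INDEX idx_orders_city ON orders (city)")
plan_after = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()
print(plan_after)  # expect a search using idx_orders_city
```

Inspecting the plan before and after a schema change, rather than only timing the query, is the habit the posting's "Query Profiles and bottleneck identification" line points at.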

Posted 1 month ago

Apply

4.0 - 9.0 years

5 - 9 Lacs

Gurugram

Work from Office

Key Responsibilities: ETL Development and Maintenance Design, develop, and implement ETL processes using SSIS to support data integration and warehousing requirements. Maintain and enhance existing ETL workflows to ensure data accuracy and integrity. Collaborate with data analysts, data architects, and other stakeholders to understand data requirements and translate them into technical specifications. Extract, transform, and load data from various source systems into the data warehouse. Perform data profiling, validation, and cleansing to ensure high data quality. Monitor ETL processes to ensure timely and accurate data loads. Write and optimize complex SQL queries to extract and manipulate data. Work with SQL Server to manage database objects, indexes, and performance tuning. Ensure data security and compliance with industry standards and regulations. Business Intelligence and Reporting: Develop and maintain interactive dashboards and reports using Power BI or SSRS. Collaborate with business users to gather requirements and create visualizations that provide actionable insights. Integrate Power BI with other data sources and platforms for comprehensive reporting. Scripting and Automation: Utilize Python for data manipulation, automation, and integration tasks. Develop scripts to automate repetitive tasks and improve efficiency. Insurance Domain Expertise: Leverage knowledge of insurance industry processes and terminology to effectively manage and interpret insurance data. Work closely with business users and stakeholders within the insurance domain to understand their data needs and provide solutions. Qualifications Required Skills and Qualifications: Technical Skills: Proficient in SQL and experience with SQL Server. Strong experience with SSIS for ETL development and data integration. Proficiency in Python for data manipulation and scripting. Experience with Power BI/SSRS for developing interactive dashboards and reports. 
Knowledge of data warehousing concepts and best practices. Domain Knowledge: Solid understanding of insurance industry processes, terminology, and data structures. Experience working with insurance-related data, such as policies, claims, underwriting, and actuarial data. Additional Skills: Strong problem-solving and analytical skills. Excellent communication and collaboration abilities. Job Location
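The data profiling, validation, and cleansing duties described above can be sketched with the standard library alone. The `policies` columns here are hypothetical examples, not a schema from the posting:

```python
import csv
import io

# Hedged sketch of profiling and cleansing a small extract; in practice
# this step would sit inside an SSIS package or a Python ETL task.
raw = io.StringIO(
    "policy_id,premium,state\n"
    "P1,1200,KA\n"
    "P2,,HR\n"
    "P1,1200,KA\n"  # exact duplicate row
    "P3,950,\n"
)

rows = list(csv.DictReader(raw))

# Profile: count missing values per column.
missing = {col: sum(1 for r in rows if not r[col]) for col in rows[0]}

# Cleanse: drop exact duplicates while preserving order.
seen, deduped = set(), []
for r in rows:
    key = tuple(r.items())
    if key not in seen:
        seen.add(key)
        deduped.append(r)

print(missing)       # {'policy_id': 0, 'premium': 1, 'state': 1}
print(len(deduped))  # 3
```

Profiling first (the `missing` counts) tells you which cleansing rules are worth writing before any data is loaded into the warehouse.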

Posted 1 month ago

Apply

3.0 - 7.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Senior SAP SAC Consultant - Job Responsibilities Solution Design & Architecture Lead the design and architecture of SAP Analytics Cloud (SAC) solutions aligned with business objectives. Translate complex business requirements into technical SAC models and dashboards. Define data architecture, models (live/acquired), and connectivity with SAP and non-SAP systems (e.g., BW/4HANA, S/4HANA, HANA, SQL, etc.). Dashboard & Story Development Develop interactive and visually compelling SAC stories and dashboards using advanced scripting and calculation capabilities. Customize UI/UX using SAC features like widgets, charts, filters, and responsive pages. Data Modeling & Integration Design and build data models within SAC and integrate external datasets as needed. Ensure high performance and accuracy through optimized data transformations and blending. Configure and manage data import/export jobs and schedules. Advanced Analytics & Planning Utilize SAC's predictive capabilities, Smart Insights, and Smart Discovery to provide actionable insights. Implement and manage planning scenarios, input forms, allocation logic, and forecast models (if applicable). Stakeholder Collaboration Act as the key point of contact between business users and IT, gathering requirements and providing best-practice solutions. Conduct workshops, training sessions, and end-user support activities. Performance Optimization & Governance Optimize SAC reports and stories for performance and usability. Enforce data governance, security roles, and access controls within SAC. Project Management & Leadership Lead end-to-end project lifecycle for SAC implementations, upgrades, and enhancements. Mentor junior consultants and provide technical guidance to cross-functional teams. Documentation & Compliance Prepare technical documentation, user guides, and test scripts. Ensure compliance with internal data security and external regulatory standards. 
Innovation & Continuous Improvement Stay current with SAC updates, roadmap features, and SAP BTP innovations. Proactively suggest improvements to enhance analytics maturity and value delivery.

Posted 1 month ago

Apply

15.0 - 22.0 years

50 - 60 Lacs

Hyderabad

Work from Office

We are seeking an experienced Data Solution Architect to lead the design and implementation of scalable, secure, and high-performing data solutions across cloud and hybrid environments. The ideal candidate will bring deep expertise in Data Engineering, APIs, Python, Spark/PySpark, and enterprise cloud platforms such as AWS and Azure. This is a strategic, client-facing role that involves working closely with stakeholders, engineering teams, and business leaders to architect and deliver robust data platforms. Key Responsibilities: Architect end-to-end data solutions across cloud (AWS/Azure) and on-premises environments Develop and integrate RESTful APIs for data ingestion, transformation, and distribution Define data architecture standards, best practices, and governance frameworks Work with DevOps and cloud teams to deploy solutions using CI/CD and infrastructure-as-code Guide and mentor data engineering teams in solution implementation and performance optimization Ensure high availability, scalability, and data security compliance across platforms Collaborate with product owners and stakeholders to translate business needs into technical specifications Conduct architecture reviews, risk assessments, and solution validation Requirements Required Skills & Experience: 15 to 22 years of total experience in IT, with at least 5+ years in data architecture roles Strong experience in data processing frameworks and building ETL solutions Proven expertise in designing and deploying solutions on AWS and Azure cloud platforms Hands-on experience with data integration, real-time streaming, and API-based data access Proficient in data modeling (structured, semi-structured, unstructured data) Deep understanding of data lakes, data warehouses, and modern data mesh/architecture patterns Experience with tools such as Airflow, Glue, Databricks, Synapse, Redshift, or similar Knowledge of security, compliance, and governance practices in large-scale data platforms Strong communication, 
leadership, and client-facing skills Benefits: Standard Company Benefits
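The semi-structured data modeling the role calls for often comes down to flattening nested records into tabular rows for a warehouse. A minimal sketch, where the event shape, field names, and `flatten` helper are all hypothetical examples rather than anything from the posting:

```python
import json

# Flatten a nested JSON event into one row per line item, repeating the
# order-level fields; this is the usual semi-structured-to-relational step.
event = json.loads("""
{
  "order_id": "O-17",
  "customer": {"id": "C-3", "region": "south"},
  "items": [
    {"sku": "A1", "qty": 2},
    {"sku": "B9", "qty": 1}
  ]
}
""")

def flatten(evt):
    """Emit one flat row per line item, carrying order-level fields along."""
    for item in evt["items"]:
        yield {
            "order_id": evt["order_id"],
            "customer_id": evt["customer"]["id"],
            "region": evt["customer"]["region"],
            "sku": item["sku"],
            "qty": item["qty"],
        }

rows = list(flatten(event))
print(rows[0]["sku"], rows[1]["qty"])  # A1 1
```

The same transform is what tools like Glue or Databricks apply at scale; deciding which nesting levels become their own tables is the data-modeling decision the posting refers to.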

Posted 1 month ago

Apply

3.0 - 8.0 years

8 - 12 Lacs

Kolkata, Hyderabad, Pune

Work from Office

Salesforce CRMA Consultant Job Title: Salesforce CRMA Consultant Location: Offshore Duration: 6 Months We are seeking a highly skilled Salesforce CRM Analytics (CRMA/Tableau CRM) Consultant to join our offshore team on a 6-month FTE engagement. The successful candidate will be responsible for designing and developing interactive dashboards, transforming Salesforce data into actionable insights, and collaborating closely with business stakeholders to meet reporting and analytics needs. Key Responsibilities: Design, develop, and deploy dashboards and visualizations using Salesforce CRMA/Tableau CRM Create and manage dataflows and recipes for data preparation and transformation Utilize SAQL, SOQL, and JSON-based dashboards to build customized analytical solutions Ensure alignment with the Salesforce data model and security protocols Optimize dashboard performance for scalability and responsiveness Partner with business and technical teams to understand requirements and translate them into technical solutions Maintain best practices for CRMA development and documentation Required Skills: Hands-on experience with Salesforce CRMA (Tableau CRM) Proficiency in SAQL, SOQL, and JSON Strong understanding of dataflows, recipes, and Salesforce data architecture Experience with dashboard performance tuning and optimization Familiarity with Salesforce data security model (sharing rules, field-level security, etc.) Location - Pune, Hyderabad, Kolkata, Jaipur, Chandigarh

Posted 1 month ago

Apply

10.0 - 15.0 years

25 - 30 Lacs

Bengaluru

Work from Office

Join us as a Data & Analytics Analyst This is an opportunity to take on a purpose-led role in a cutting-edge Data & Analytics team You'll be consulting with our stakeholders to understand their needs and identify suitable data and analytics solutions to meet them, along with business challenges, in line with our purpose You'll bring advanced analytics to life through visualisation to tell powerful stories and influence important decisions for key stakeholders, giving you excellent recognition for your work We're offering this role at associate vice president level What you'll do As a Data & Analytics Analyst, you'll be driving the use of advanced analytics in your team to develop business solutions which increase the understanding of our business, including its customers, processes, channels and products. You'll be working closely with business stakeholders to define detailed, often complex and ambiguous business problems or opportunities which can be supported through advanced analytics, making sure that new and existing processes are designed to be efficient, simple and automated where possible. As well as this, you'll be: Leading and coaching your colleagues to plan and deliver strategic project and scrum outcomes Planning and delivering data and analytics resource, expertise and solutions which bring commercial and customer value to business challenges Communicating data and analytics opportunities and bringing them to life in a way that business stakeholders can understand and engage with Adopting and embedding new tools, technologies and methodologies to carry out advanced analytics Developing strong stakeholder relationships to bring together advanced analytics, data science and data engineering work that is easily understandable and links back clearly to our business needs The skills you'll need We're looking for someone with a passion for data and analytics together with knowledge of data architecture, key tooling and relevant coding languages. 
Along with advanced analytics knowledge, you'll bring an ability to simplify data into clear data visualisations and compelling insight using appropriate systems and tooling. You'll also demonstrate: Strong knowledge of data management practices and principles Experience of translating data and insights for key stakeholders Good knowledge of data engineering, data science and decisioning disciplines Strong communication skills with the ability to engage with a wide range of stakeholders Coaching and leadership experience with an ability to support and motivate colleagues Hours 45 Job Posting Closing Date: 06/07/2025

Posted 1 month ago

Apply

8.0 - 13.0 years

32 - 45 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

Job Title: Data Scientist Architect Location: Pan India - hybrid Experience: 8+ years Position Overview: We are seeking a Data Scientist Architect to lead and drive data science and architecture initiatives within Brillio. The ideal candidate will have a deep understanding of data science, data engineering, and architecture, and be highly proficient in implementing cutting-edge solutions using tools like Databricks, AWS, and Bedrock/Mistral. The role requires an individual with extensive experience in designing, building, and deploying large-scale data systems and machine learning models, along with the ability to lead and mentor cross-functional teams. As a Data Scientist Architect, you will have the opportunity to innovate and make a lasting impact across our diverse client base, providing them with tailored solutions that drive their data strategy forward. Key Responsibilities: Lead Data Architecture & Science Initiatives: Design and implement advanced data architectures and solutions to support complex data science and machine learning workflows. Build and deploy scalable, production-grade data pipelines and models leveraging cloud platforms like AWS and tools like Databricks. Architect solutions involving large-scale data ingestion, transformation, and storage, focusing on performance, scalability, and reliability. Platform Development & Integration: Implement and manage cloud-based infrastructure for data engineering, analytics, and machine learning on platforms like AWS, leveraging services like S3, Lambda, EC2, etc. Work with Bedrock/Mistral to deploy and manage machine learning models at scale, ensuring continuous optimization and improvement. Skills and Qualifications: Experience: 8+ years of experience in Data Science and Data Architecture with a focus on large-scale data systems and cloud platforms. Proven track record of leading data science architecture projects from inception to deployment. 
Technical Skills: Proficiency in Databricks, AWS (S3, EC2, Lambda, Redshift, SageMaker, etc.), and Bedrock/Mistral.

Posted 1 month ago

Apply

6.0 - 11.0 years

5 - 5 Lacs

Mumbai, Navi Mumbai, Mumbai (All Areas)

Work from Office

Role & responsibilities: Hands-on experience in the relevant field (within the essential requirement of post-basic-qualification experience) in: Proven track record in Database Performance Tuning, Database Security, Promoting Process Improvement, Problem Solving, Presenting Technical Information, Quality Focus, Database Management, Data Maintenance, Operating Systems, Attention to Detail, Information Security Policies Proven enterprise database administration experience in Oracle/Microsoft SQL Server Proficient in design and development of relational (and optionally hierarchical) databases, including the ability to design for performance, scalability, availability, flexibility and extensibility while meeting security requirements. Strong knowledge of database engineering activities, including installation and configuration of RDBMS software. Work experience as a Database Administrator (DBA): Oracle Database 19c / RAC configuration / Oracle GoldenGate configuration. Experience with any Application Performance Monitoring tool such as Dynatrace / AppDynamics BASIC QUALIFICATION (AS ON 31.03.2025): B.Tech / B.E. in Computer Science/ Computer Science & Engineering/ Information Technology/ Electronics/ Electronics & Communications Engineering or Equivalent Degree in the above specified disciplines with minimum 60% score. Or MCA or equivalent Or M.Tech / M.Sc. in Computer Science/ Computer Science & Engineering/ Information Technology/ Electronics/ Electronics & Communications Engineering or Equivalent Degree in the above specified disciplines. (From a University/ Institution/ Board recognised by Govt. of India/ approved by Govt. Regulatory Bodies) OTHER QUALIFICATION (AS ON 31.03.2025) Preferred Certifications (valid as on 31.03.2025): Oracle Certified Associate (OCA) Database Administrator Oracle Certified Professional (OCP) SQL Queries and PL/SQL WORK EXPERIENCE: Minimum 6 years post-qualification experience in the IT industry.

Posted 1 month ago

Apply

5.0 - 10.0 years

20 - 25 Lacs

Bengaluru

Work from Office

Do you want to help create the future of healthcare? Our name, Siemens Healthineers, was selected to honor our people who dedicate their energy and passion to this cause. It reflects their pioneering spirit combined with our long history of engineering in the ever-evolving healthcare industry. We offer you a flexible and dynamic environment with opportunities to go beyond your comfort zone in order to grow personally and professionally. Sound interesting? Then come and join our global team as an Enterprise Architect (f/m/d) in IT to design the enterprise architecture for a large business unit or the entire company and to be responsible for the application landscape as well as for the technologies and development tools used. Your tasks and responsibilities: You will be responsible for enterprise architecture management (including business IT alignment and analysis of the application portfolio) of a large business unit or process domain and derive IT strategies from business requirements, ensuring alignment with the overall enterprise architecture You will drive the architecture roadmap and the application and data architecture for the business unit with a focus on security, scalability, and reliability of the IT landscape You will prepare decisions on the use of new technologies and platforms You will model IT architecture and processes and promote consistent design, planning and implementation of IT solutions You will be responsible for the coordination of communication with all important decision makers and relevant stakeholders and advise them on the development of the IT landscape You will drive the composition of the IT landscape and balance organizational needs with enterprise architecture decisions and objectives You will identify digitalization opportunities and synergies within the system landscape and represent system interrelationships holistically To find out more about the specific business, have a look at https://www.siemens-healthineers.com/products-services Your qualifications and experience: You have a degree in computer science, Industrial Engineering or a comparable qualification You have 10+ years of experience in global IT organizations, ideally in a mix of operational and architecture roles You have 5+ years of experience as a solution / application or enterprise architect Based on your very good understanding of complex IT processes and your openness to new technologies, you have acquired in-depth knowledge of software development, application management, enterprise architecture, enterprise architecture methodologies, governance structures and frameworks (e.g. TOGAF) You also have deep technological expertise and several years of experience in complex technology landscapes Functional or IT implementation experience across all key IT functions with a focus on PLM, SCM, Order-to-Cash and Accounting In-depth knowledge in at least one business domain such as CRM/Sales, R&D/PLM or SCM You have experience in business process analysis and modelling You bring several years of proven experience with working on different business process management models with Enterprise Architecture tools such as LeanIX or BizzDesign Further, you have a very good understanding of the interrelationships between functional business and technical IT structures Your attributes and skills: For working with specialist departments at home and abroad, we require very good English language skills, both spoken and written. 
Ideally you also have very good German language skills You are an organizational talent and impress with good communication and presentation skills - at very different levels in the organizational hierarchy You are a team player with a high level of social competence who can operate confidently in a global environment We don't compromise on quality - you work results- and quality-oriented with high commitment and possess good analytical and conceptual skills You are flexible in thought and action, have a quick grasp and constructive assertiveness Our global team: Siemens Healthineers is a leading global medical technology company. 50,000 dedicated colleagues in over 70 countries are driven to shape the future of healthcare. An estimated 5 million patients across the globe benefit every day from our innovative technologies and services in the areas of diagnostic and therapeutic imaging, laboratory diagnostics and molecular medicine, as well as digital health and enterprise services. Our culture: Our culture embraces different perspectives, open debate, and the will to challenge convention. Change is a constant aspect of our work. We aspire to lead the change in our industry rather than just react to it. That's why we invite you to take on new challenges, test your ideas, and celebrate success. Check our Careers Site at https://www.siemens-healthineers.com/de/careers As an equal opportunity employer, we welcome applications from individuals with disabilities.

Posted 1 month ago

Apply

15.0 - 20.0 years

20 - 25 Lacs

Noida

Work from Office

Experience: 15+ years. Requires knowledge of Talend, along with knowledge of other data-related tools such as Databricks or Snowflake. The Senior Talend Developer/Architect role is responsible for leading the design and development of, and managing, the INSEAD data infrastructure for the CRM ecosystem, developing Talend jobs & flows, and acting as a mentor for the other 3-4 Talend Developers. This role will be instrumental in driving data pipeline architecture and ensuring data integrity, performance, and scalability using the Talend platform. This role is a key part of the HARMONIA project team while the engagement is active. The role will also be part of the Harmonia Data Quality project and Data Operations scrum teams. It will contribute to additional activities such as data modelling & design, architecture, and integration, and propose technology strategy. The position holder must organize and plan her/his work with special consideration for a frictionless information flow within Digital Solutions and the relevant business department. He/She will collaborate closely with cross-functional teams to deliver high-quality data solutions that support strategic business objectives. Job Requirements Details Design, develop, and deploy scalable ETL/ELT solutions using Talend (e.g. Data Stewardship, Management Console, Studio). Architect end-to-end data integration workflows. Establish development best practices, reusable components, and job templates to optimize performance and maintainability. Responsible for delivering robust data architecture and tested, validated, deployable jobs/flows to production environments, following Talend best practices and the JIRA-based development framework. Translate business requirements into efficient and scalable Talend solutions. Assist with developer input/feedback for those requirements wherever deemed necessary, by actively leading brainstorming sessions arranged by the project manager. 
Work closely with the Manager of Data Operations and Quality, the project manager, business analysts, data analysts, developers and other subject matter experts to align technical solutions with operational needs. Ensure alignment with data governance, security, and compliance standards. Ensure that new developments follow the standard styles already present in the current Talend jobs & flows developed by the current integrator and INSEAD teams. Actively participate in project-related activities and ensure the SDLC process is followed. Participate in the implementation and execution of data cleansing, normalization, deduplication and transformation projects. Conduct performance tuning, error handling, monitoring, and troubleshooting of Talend jobs and environments. Contribute to sprint planning and agile ceremonies with the Harmonia Project Team and Data Operations Team. Document technical solutions, data flows, and design decisions to support operational transparency. Stay current with Talend product enhancements and industry trends, recommending upgrades or changes where appropriate. No budget responsibility
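The cleansing, normalization, and deduplication work the posting describes can be sketched in plain Python rather than Talend; the field names and normalization rules below are hypothetical examples, not INSEAD's actual rules:

```python
import re

# Normalize records, then deduplicate on a business key (the email),
# keeping the first occurrence; a Talend job would do the same with
# tMap / tUniqRow components.
records = [
    {"email": " Alice@Example.COM ", "name": "alice  smith"},
    {"email": "alice@example.com",   "name": "Alice Smith"},
    {"email": "bob@example.com",     "name": "  Bob   Jones "},
]

def normalize(rec):
    """Trim whitespace, lowercase the email, collapse runs of spaces."""
    return {
        "email": rec["email"].strip().lower(),
        "name": re.sub(r"\s+", " ", rec["name"].strip()).title(),
    }

merged = {}
for rec in map(normalize, records):
    merged.setdefault(rec["email"], rec)  # first occurrence wins

print(sorted(merged))  # ['alice@example.com', 'bob@example.com']
```

Normalizing before deduplicating matters: the first two records only collapse into one because the email was canonicalized first.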

Posted 1 month ago

Apply

1.0 - 6.0 years

15 - 19 Lacs

Bengaluru

Work from Office

In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions. At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us. Responsibilities Build, train, and deploy ML models using Python on Azure/AWS 1+ years of experience in building Machine Learning and Deep Learning models in Python Experience working with AzureML/AWS SageMaker Ability to deploy ML models with REST-based APIs Proficient in distributed computing environments / big data platforms (Hadoop, Elasticsearch, etc.) as well as common database systems and value stores (SQL, Hive, HBase, etc.) Ability to work directly with customers, with good communication skills Ability to analyze datasets using SQL and Pandas Experience working with Azure Data Factory and Power BI Experience with PySpark, Airflow, etc. 
Experience working with Docker/Kubernetes Mandatory skill sets Data Science, Machine Learning Preferred skill sets Data Science, Machine Learning Education qualification B.Tech / M.Tech / MBA / MCA Education Degrees/Field of Study required Master of Business Administration, Bachelor of Engineering, Master of Engineering Degrees/Field of Study preferred Required Skills Data Science Accepting Feedback, Active Listening, Agile Scalability, Amazon Web Services (AWS), Analytical Thinking, Apache Airflow, Apache Hadoop, Azure Data Factory, Communication, Creativity, Data Anonymization, Data Architecture, Database Administration, Database Management System (DBMS), Database Optimization, Database Security Best Practices, Databricks Unified Data Analytics Platform, Data Engineering, Data Engineering Platforms, Data Infrastructure, Data Integration, Data Lake, Data Modeling, Data Pipeline {+ 27 more} Travel Requirements Government Clearance Required?
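Deploying an ML model behind a REST-based API, as the responsibilities above mention, can be sketched with only the standard library. The "model" here is a hypothetical linear scorer standing in for a real trained model, and the feature names are invented for illustration:

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

# Toy stand-in for a trained model; in practice this would be loaded
# from an AzureML / SageMaker artifact.
WEIGHTS = {"orders": 0.8, "tenure": 0.2}  # assumed example coefficients

def predict(features):
    return sum(WEIGHTS[k] * features.get(k, 0.0) for k in WEIGHTS)

class PredictHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        body = self.rfile.read(int(self.headers["Content-Length"]))
        score = predict(json.loads(body))
        payload = json.dumps({"score": score}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(payload)))
        self.end_headers()
        self.wfile.write(payload)

    def log_message(self, *args):  # keep the demo quiet
        pass

server = HTTPServer(("127.0.0.1", 0), PredictHandler)  # port 0 = any free port
threading.Thread(target=server.serve_forever, daemon=True).start()
print("serving on port", server.server_address[1])
```

A client would POST a JSON feature map such as `{"orders": 10, "tenure": 5}` and read back `{"score": 9.0}`; a production deployment would use a proper framework and container, but the request/response contract is the same idea.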

Posted 1 month ago

Apply


4.0 - 7.0 years

6 - 10 Lacs

Kolkata

Work from Office

In business intelligence at PwC, you will focus on leveraging data and analytics to provide strategic insights and drive informed decision-making for clients. You will develop and implement innovative solutions to optimise business performance and enhance competitive advantage. Why PwC Learn more about us. Experience of 4 to 7 years with adequate knowledge of Scala's object-oriented programming. Scala code written in the backend is the basis of the finance module reports, which are accessed via QuickSight. Assess the Scala code written for the Finance module reports, identify the issues, and fix them. Mandatory skill sets Scala and OOP Preferred skill sets Scala and OOP Education qualification B.Tech/MBA/MCA Education Degrees/Field of Study required Bachelor of Engineering, Master of Business Administration Degrees/Field of Study preferred Required Skills Scala (Programming Language) Accepting Feedback, Active Listening, Analytical Thinking, Business Case Development, Business Data Analytics, Business Intelligence and Reporting Tools (BIRT), Business Intelligence Development Studio, Communication, Competitive Advantage, Continuous Process Improvement, Creativity, Data Analysis and Interpretation, Data Architecture, Database Management System (DBMS), Data Collection, Data Pipeline, Data Quality, Data Science, Data Visualization, Embracing Change, Emotional Regulation, Empathy, Inclusion, Industry Trend Analysis {+ 16 more} No

Posted 1 month ago

Apply

5.0 - 8.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Founded in 1976, CGI is among the world's largest independent IT and business consulting services firms. With 94,000 consultants and professionals globally, CGI delivers an end-to-end portfolio of capabilities, from strategic IT and business consulting to systems integration, managed IT and business process services, and intellectual property solutions. CGI works with clients through a local relationship model complemented by a global delivery network that helps clients digitally transform their organizations and accelerate results. CGI Fiscal 2024 reported revenue is CA$14.68 billion, and CGI shares are listed on the TSX (GIB.A) and the NYSE (GIB). Learn more at cgi.com.
Job Title: Perl Scripting Developer / Lead
Position: SSE / LA
Experience: 5 to 8 years
Category: Software Development / Engineering
Job location: Bangalore or Chennai
Position ID: J0425-1535
Work Type: Hybrid
Employment Type: Full Time / Permanent
Education: Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
The Technical Expert is responsible for performing the following tasks:
Handling assistance requests, corrections, and enhancements
Drafting technical specifications
Producing estimates for minor and major enhancements
Carrying out unit and integration testing
Anticipating system failures or degradations as part of preventive maintenance
Updating documentation
Supervising the application
Improving platform reliability
Technical skills:
Excellent command of scripting: Perl, Shell, preferably in an AIS environment
Strong expertise in Java/J2EE (JDBC, JSP, Servlet)
Excellent command of SQL / Oracle / PostgreSQL
Good command of development tools: Git, an IDE, Linux system commands
Proficiency with Tomcat, SOAP and REST web services
Mastery of DevOps tools: Git, GitHub, CI/CD pipelines (DevOps Pipeline, GitHub Actions, Jenkins Pipeline)
Experience working in Agile Scrum
Skills: Data Analysis, Data Architecture, Perl

Posted 1 month ago

Apply

5.0 - 10.0 years

25 - 30 Lacs

Kolkata

Work from Office

Some careers have more impact than others. If you're looking for a career where you can make a real impression, join HSBC and discover how valued you'll be. HSBC is one of the largest banking and financial services organizations in the world, with operations in 62 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realize their ambitions.
We are currently seeking an experienced professional to join our team in the role of Product Owner, Change and Digital Enablement, Regulatory Compliance.
Principal responsibilities:
Report the RC AI & Analytics scorecard and key performance indicators in a timely and accurate manner.
Promote a culture of data-driven decision making, aligning short-term decisions and investments with the longer-term vision and objectives.
Help the business manage regulatory risk in a more effective, efficient, and commercial way through the adoption of data science (AI/ML and advanced analytics).
Support communication and engagement with stakeholders and partners to increase understanding and adoption of data science products and services, and research opportunities.
Collaborate with other analytics teams across the bank to share insight and best practice.
Foster a collaborative, open and agile delivery culture.
Build positive momentum for change across the organization with the active support and buy-in of all stakeholders.
Communicate often complex analytical solutions to the wider department, ensuring a strong transfer of key findings and intelligence.
Requirements:
University degree in technology, data analytics or a related discipline, or relevant work experience in computer or data science.
Understanding of Regulatory Compliance risks, and direct experience of deploying controls and analytics to manage those risks.
Experience in Financial Services (experience within a tier-one bank) or a related industry; knowledge of the HSBC Group structure, its business and personnel, and HSBC's corporate culture.
Experience of agile development and active contribution to strategy and innovation.
Solid understanding of applied mathematics, statistics, data science principles and advanced computing (machine learning, modelling, NLP and Generative AI).
Experience working within the Hadoop ecosystem, in addition to strong technical skills in analytical languages such as Python and SQL. Specific knowledge of GCP, AWS, Azure, Spark and/or graph theory is an advantage.
Experience of visualization tools and techniques, including Qlik and Tableau.
Solid understanding of data and architecture concepts including cloud, ETL, ontology, and data modelling.
Experience of using JIRA, Git, Confluence, Teams, Slack, and advanced Excel.
You'll achieve more at HSBC.

Posted 1 month ago

Apply