12.0 - 16.0 years
0 Lacs
Karnataka
On-site
As an experienced Senior Data Engineer at Adobe, you will use Big Data and Google Cloud technologies to develop large-scale, cloud-based data processing pipelines and data warehouses. You will consult with customers worldwide on their data engineering needs around Adobe's Customer Data Platform, support pre-sales discussions on complex, large-scale cloud data engineering solutions, design custom cloud solutions that integrate Adobe's products in a scalable and performant manner, and deliver complex, enterprise-grade, on-cloud data engineering and integration solutions in a hands-on manner.

To be successful in this role, you should have 12 to 15 years of total experience, including 3 to 4 years leading Data Engineer teams in developing enterprise-grade data processing pipelines on Google Cloud. You must have led at least one project of medium to high complexity migrating ETL pipelines and data warehouses to the cloud, and your most recent 3 to 5 years of experience should be with premium consulting companies. Deep hands-on expertise with Google Cloud Platform services, especially BigQuery, Dataform, and Dataplex, is essential, as are exceptional communication skills for engaging with Data Engineers and with Technology and Business leadership. The ability to apply your GCP knowledge to other cloud environments is highly desirable. Experience consulting with customers in India and multi-cloud expertise covering AWS and GCP would be advantageous.

At Adobe, creativity, curiosity, and continuous learning are valued qualities that contribute to your career growth journey. To pursue a new opportunity at Adobe, be sure to update your Resume/CV and Workday profile, including your unique Adobe experiences and volunteer work.
Familiarize yourself with the Internal Mobility page on Inside Adobe to understand the process, and set up job alerts for roles that interest you. Prepare for interviews by following the provided tips. After you apply for a role via Workday, the Talent Team will contact you within two weeks. If you progress to the official interview process with the hiring team, inform your manager so they can support your career growth.

At Adobe, you will experience an exceptional, globally recognized work environment and collaborate with colleagues dedicated to mutual growth through the Check-In approach, where ongoing feedback is encouraged. If you seek to make an impact, Adobe is the ideal place for you. Explore employee career experiences on the Adobe Life blog and discover the meaningful benefits offered. For individuals with disabilities or special needs requiring accommodation to navigate the Adobe.com website or complete the application process, contact accommodations@adobe.com or call (408) 536-3015.
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
Karnataka
On-site
At PwC, the infrastructure team focuses on designing and implementing secure IT systems that support business operations. The primary goal is to ensure the smooth functioning of networks, servers, and data centers to enhance performance and minimize downtime. In the infrastructure engineering role at PwC, you will create robust and scalable technology infrastructure solutions for clients, working on network architecture, server management, and cloud computing.

For the Data Modeler role, we are seeking candidates with a solid background in data modeling, metadata management, and data system optimization. Your responsibilities will include analyzing business requirements, developing long-term data models, and ensuring the efficiency and consistency of our data systems. Key areas of expertise for this role include:
- Analyzing and translating business needs into long-term data model solutions.
- Evaluating existing data systems and suggesting enhancements.
- Defining rules for translating and transforming data across various models.
- Collaborating with the development team to create conceptual data models and data flows.
- Establishing best practices for data coding to maintain consistency within the system.
- Reviewing modifications of existing systems for cross-compatibility.
- Implementing data strategies and developing physical data models.
- Updating and optimizing local and metadata models.
- Utilizing canonical data modeling techniques to improve data system efficiency.
- Evaluating implemented data systems for variances, discrepancies, and efficiency.
- Troubleshooting and optimizing data systems for optimal performance.
- Strong expertise in relational and dimensional modeling (OLTP, OLAP).
- Effective use of data modeling tools like Erwin, ER/Studio, Visio, and PowerDesigner.
- Proficiency in SQL and database management systems (Oracle, SQL Server, MySQL, PostgreSQL).
- Understanding of NoSQL databases (MongoDB, Cassandra) and their data structures.
- Experience with data warehouses and BI tools (Snowflake, Redshift, BigQuery, Tableau, Power BI).
- Familiarity with ETL processes, data integration, and data governance frameworks.
- Strong analytical, problem-solving, and communication skills.

Qualifications for this position include:
- A Bachelor's degree in Engineering or a related field.
- 5 to 9 years of experience in data modeling or a related field.
- 4+ years of hands-on experience with dimensional and relational data modeling.
- Expert knowledge of metadata management and related tools.
- Proficiency with data modeling tools such as Erwin, PowerDesigner, or Lucid.
- Knowledge of transactional databases and data warehouses.

Preferred Skills:
- Experience with cloud-based data solutions (AWS, Azure, GCP).
- Knowledge of big data technologies (Hadoop, Spark, Kafka).
- Understanding of graph databases and real-time data processing.
- Certifications in data management, modeling, or cloud data engineering.
- Excellent communication and presentation skills.
- Strong interpersonal skills to collaborate effectively with various teams.
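As an aside for candidates, the relational and dimensional modeling skills listed above can be illustrated with a minimal star schema: descriptive attributes in a dimension table, measures in a fact table, and an OLAP-style rollup joining the two. This is only a sketch using SQLite with hypothetical table and column names, not anything from the posting:

```python
# Minimal star schema sketch in SQLite; all names are illustrative.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Dimension table: descriptive attributes.
cur.execute("CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, name TEXT, category TEXT)")
# Fact table: measures keyed to the dimension.
cur.execute("""CREATE TABLE fact_sales (
    sale_id INTEGER PRIMARY KEY,
    product_id INTEGER REFERENCES dim_product(product_id),
    quantity INTEGER,
    amount REAL)""")

cur.executemany("INSERT INTO dim_product VALUES (?, ?, ?)",
                [(1, "Widget", "Hardware"), (2, "Gadget", "Hardware")])
cur.executemany("INSERT INTO fact_sales VALUES (?, ?, ?, ?)",
                [(1, 1, 3, 30.0), (2, 2, 1, 25.0), (3, 1, 2, 20.0)])

# A typical OLAP-style rollup: revenue by product.
rows = cur.execute("""
    SELECT p.name, SUM(f.amount) AS revenue
    FROM fact_sales f JOIN dim_product p USING (product_id)
    GROUP BY p.name ORDER BY p.name
""").fetchall()
print(rows)  # [('Gadget', 25.0), ('Widget', 50.0)]
```

The same separation of facts from dimensions underlies the OLTP-versus-OLAP distinction the listing calls out: normalized transactional schemas optimize writes, while star schemas optimize analytical reads.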
Posted 2 weeks ago
10.0 - 14.0 years
0 Lacs
Haryana
On-site
You are a Senior BI Platform Engineer with over 10 years of experience and specialized knowledge in Tableau, Power BI, Alteryx, and MicroStrategy (MSTR). In this role, you will act as a technical lead and platform administrator for our BI platforms, ensuring consistent performance, providing advanced user support (L3), and engaging with stakeholders. Your responsibilities will also include establishing and managing CI/CD pipelines for BI assets to guarantee scalable, automated, and governed deployment processes.

As the platform administrator, you will oversee Tableau, Power BI, Alteryx, and MSTR, managing permissions, data sources, server performance, and upgrades. You will provide Level 3 (L3) support for BI platforms, handling complex technical issues, performing root cause analysis, and troubleshooting at the platform level. You will also design, implement, and maintain CI/CD pipelines for BI dashboards, dataflows, and platform configurations to facilitate agile development and deployment.

Collaboration with cross-functional teams to gather requirements and ensure the proper implementation of dashboards and analytics solutions is essential, as is monitoring and optimizing BI platform performance, usage, and adoption. Working closely with data engineering teams to ensure data quality and availability for reporting needs is another critical aspect of the role.

Your duties will include creating and maintaining documentation for governance, support processes, and best practices. You will train and mentor users and junior team members on BI tools and reporting standards, and act as a liaison between business stakeholders and technical teams to ensure alignment and timely issue resolution.
Furthermore, you will manage all BI upgrades, optimize the capacity of the Power BI gateway, Tableau Bridge, Alteryx Server, and other BI platforms, and enable new features in each platform. You will also manage licenses optimally, including automated assignments and off-boarding of users, and manage RBAC for all BI platforms.

The qualifications for this role include a minimum of 10 years of experience in a BI support or engineering role. Advanced proficiency in Tableau, Power BI, Alteryx, and MSTR, encompassing administrative functions, troubleshooting, and user support, is required, as is demonstrated experience providing L3 support and managing CI/CD pipelines for BI platforms. Strong knowledge of BI architecture, data visualization best practices, and data modeling concepts is essential, along with excellent problem-solving and communication skills and the ability to interact confidently with senior business leaders. Experience with SQL, data warehouses, and cloud platforms (e.g., Azure, Snowflake) is preferred. A Bachelor's degree in Computer Science, Information Systems, or a related field is mandatory.

Preferred qualifications include experience with Tableau Server/Cloud, Power BI Service, and MSTR administration; familiarity with enterprise data governance and access control policies; and certifications in Tableau, Power BI, Alteryx, or MSTR.
Posted 2 weeks ago
5.0 - 15.0 years
0 Lacs
Noida, Uttar Pradesh
On-site
HCLTech is seeking a Data and AI Principal / Senior Manager (Generative AI) for their Noida location. A global technology company with a workforce of over 218,000 employees in 59 countries, HCLTech specializes in digital, engineering, cloud, and AI solutions. The company collaborates with clients across industries such as Financial Services, Manufacturing, Life Sciences, Healthcare, Technology, Telecom, Retail, and Public Services, offering innovative technology services and products. With consolidated revenues of $13.7 billion for the 12 months ending September 2024, HCLTech aims to drive progress and transformation for its clients globally.

Key Responsibilities:
In this role, you will provide hands-on technical leadership and oversight, including leading the design of AI and GenAI solutions, machine learning pipelines, and data architectures. You will actively contribute to coding, solution design, and troubleshooting of critical components, collaborating with Account Teams, Client Partners, and Domain SMEs to ensure technical solutions align with business needs. Additionally, you will mentor and guide engineers across various functions to foster a collaborative and high-performance team environment.

You will design and implement system and API architectures, integrating microservices, RESTful APIs, cloud-based services, and machine learning models seamlessly into GenAI and data platforms, and will lead the integration of AI, GenAI, and Agentic applications, NLP models, and large language models into scalable production systems. You will also architect ETL pipelines, data lakes, and data warehouses using tools like Apache Spark, Airflow, and Google BigQuery, and drive deployment on cloud platforms such as AWS, Azure, and GCP. Furthermore, you will lead the design and deployment of machine learning models using frameworks like PyTorch, TensorFlow, and scikit-learn, ensuring accurate and reliable outputs.
You will develop prompt engineering techniques for GenAI models and implement best practices for ML model performance monitoring and continuous training. The role also calls for expertise in CI/CD pipelines, Infrastructure-as-Code, cloud management, stakeholder communication, agile development, performance optimization, and scalability strategies.

Required Qualifications:
- 15+ years of hands-on technical experience in software engineering, with at least 5+ years in a leadership role managing cross-functional teams in AI, GenAI, machine learning, data engineering, and cloud infrastructure.
- Proficiency in Python and experience with Flask, Django, or FastAPI for API development.
- Extensive experience building and deploying ML models using TensorFlow, PyTorch, scikit-learn, and spaCy, and integrating them into AI frameworks.
- Familiarity with ETL pipelines, data lakes, data warehouses, and data processing tools like Apache Spark, Airflow, and Kafka.
- Strong expertise in CI/CD pipelines, containerization, Infrastructure-as-Code, and API security for high-traffic systems.

If you are interested in this position, please share your profile with the required details, including overall experience, skills, current and preferred location, current and expected CTC, and notice period, to paridhnya_dhawankar@hcltech.com.
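For readers unfamiliar with the prompt engineering mentioned above, a common technique is few-shot prompting: assembling a task instruction, worked examples, and the new query into one structured prompt. The helper below is a hypothetical illustration in plain Python; nothing in it comes from the posting or from any specific GenAI framework:

```python
# Hypothetical few-shot prompt builder; function and variable names are illustrative.
def build_prompt(task: str, examples: list[tuple[str, str]], query: str) -> str:
    """Assemble a prompt: task instruction, worked Q/A examples, then the query."""
    lines = [f"Task: {task}", ""]
    for question, answer in examples:
        lines.append(f"Q: {question}")
        lines.append(f"A: {answer}")
        lines.append("")
    # End with an open answer slot for the model to complete.
    lines.append(f"Q: {query}")
    lines.append("A:")
    return "\n".join(lines)

prompt = build_prompt(
    "Classify the sentiment of each sentence as positive or negative.",
    [("I love this product.", "positive"), ("This is terrible.", "negative")],
    "The service was wonderful.",
)
print(prompt)
```

In practice the resulting string would be sent to an LLM API; the value of templating it this way is that examples and instructions can be versioned and tested like any other code artifact.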
Posted 2 weeks ago
4.0 - 8.0 years
0 Lacs
Karnataka
On-site
At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture, and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

EY Consulting - Kinaxis Rapid Response Planning Senior Consultant (4-8 years)

The opportunity
EY GDS is a global major in value-added Digital Supply Chain services for its clients. As part of this rapidly growing business segment, you will play a critical role in developing solutions, implementations, and performance improvement of Kinaxis Rapid Response. This is a fantastic opportunity to be part of a leading firm while being instrumental in the growth of a new service offering.

Your key responsibilities
- Provide solution proficiency to analyze and identify gaps, lead the solution design, and implement the Rapid Response application to meet business requirements.
- Lead implementation, configuration, testing, training, knowledge transfer, and documentation activities.
- Conduct workshops to understand end-to-end business process requirements and propose the best possible solutions.
- Deliver high-quality client solutions that meet and exceed client/EY expectations and are delivered on time and on budget.
- Manage client solution delivery, including defining the project approach, motivating project teams, monitoring and managing project risks, managing client and EY key stakeholders, and successfully delivering client solutions.
- Identify new business opportunities, including building strong client relations, understanding client needs and EY solution offerings, communicating client opportunities to EY leadership, and helping develop client opportunities.

Skills and attributes for success
- Gather business requirements and lead design discussions with customer and business teams.
- Work on proposals and RFPs.
- Analyze business requirements and perform fit-gap analysis.
- Develop detailed solution designs based on business requirements.
- Strong expertise in detailed configuration and testing of the Kinaxis Rapid Response planning tool.
- Assist customer/business teams during the UAT phase.
- Prepare and review project documentation.

To qualify for the role, you must have
- Functional: In-depth knowledge of demand planning and forecasting, exposure to various forecasting techniques, and concepts like promotion planning and consensus demand planning.
- Technical: Workbook development (table-based, composite, data modification), alerts and monitoring, hierarchies and filters, scenario hierarchy setup, control table configuration, planning engine knowledge, and data model modification including custom fields and custom tables. Knowledge of integrating Kinaxis with host ERP systems through data warehouses for both inbound and outbound interfaces, workflows, query development, and preparation of detailed functional specifications for enhancements, layouts, reports, etc.
- 4 to 8 years of experience in a supply chain consulting or operations role, with proven experience in Kinaxis Rapid Response.
- Prior implementation experience on end-to-end demand planning projects using Kinaxis Rapid Response.
- A good understanding of functional and technical architecture to support data integration with multiple source and target systems.

Ideally, you'll also have
- Overall 4 to 8 years of experience as an SCM planner delivering projects in the Supply Chain Management, Planning & Logistics domain.
- Working experience with an onsite and offshore delivery model.
- Experience engaging with business partners and IT to understand requirements from various parts of an organization and to drive the design, programming execution, and UAT for future-state capabilities within the platform.
- Experience working in a fast-paced, dynamic environment while managing multiple projects and strict deadlines.
- A good understanding of outsourcing and offshoring, and of building win/win strategies and contracts with suppliers.

What we look for
- Consulting experience, including assessments and implementations.
- Functional and technical experience in SCM planning.
- Documenting requirements and processes (e.g., process flows).
- Working collaboratively in a team environment.
- Excellent oral and written communication skills.
- Kinaxis Rapid Response Author or Contributor certification will be an added advantage.

What working at EY offers
At EY, we're dedicated to helping our clients, from startups to Fortune 500 companies, and the work we do with them is as varied as they are. You get to work on inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees, and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer:
- Support, coaching, and feedback from some of the most engaging colleagues around.
- Opportunities to develop new skills and progress your career.
- The freedom and flexibility to handle your role in a way that's right for you.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people, and society and to build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform, and operate. Working across assurance, consulting, law, strategy, tax, and transactions, EY teams ask better questions to find new answers to the complex issues facing our world today.
Posted 2 weeks ago
5.0 - 15.0 years
0 Lacs
Noida, Uttar Pradesh
On-site
HCLTech is looking for a Data and AI Principal / Senior Manager (Generative AI) to join their team in Noida. A global technology company with a strong presence in 59 countries and over 218,000 employees, HCLTech is a leader in digital, engineering, cloud, and AI services. They collaborate with clients in industries such as Financial Services, Manufacturing, Life Sciences, Healthcare, Technology, Telecom, Media, Retail, and Public Services. With consolidated revenues of $13.7 billion, HCLTech aims to provide industry-leading capabilities to drive progress for their clients.

In this role, you will provide hands-on technical leadership and oversight. This includes leading the design of AI and GenAI solutions, machine learning pipelines, and data architectures to ensure performance, scalability, and resilience. You will actively contribute to coding, code reviews, and solution design, while working closely with Account Teams, Client Partners, and Domain SMEs to align technical solutions with business needs. Mentoring and guiding engineers across various functions will be an essential aspect of this role, fostering a collaborative and high-performance team environment.

Your role will also involve designing and implementing system and API architectures, integrating AI, GenAI, and Agentic applications into production systems, and architecting ETL pipelines, data lakes, and data warehouses using industry-leading tools. You will drive the deployment and scaling of solutions on cloud platforms like AWS, Azure, and GCP, while leading the integration of machine learning models into end-to-end production workflows. Additionally, you will lead CI/CD pipeline efforts and infrastructure automation, ensure robust integration with cloud platforms, communicate with stakeholders, promote Agile methodologies, and optimize the performance and scalability of applications.
The ideal candidate will have at least 15 years of hands-on technical experience in software engineering, with a focus on AI, GenAI, machine learning, data engineering, and cloud infrastructure. If you meet the qualifications and are passionate about driving innovation in AI and data technologies, please email your profile to paridhnya_dhawankar@hcltech.com, including your overall experience, skills, current and preferred location, current and expected CTC, and notice period. We look forward to hearing from you and exploring the opportunity to work together at HCLTech.
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
Karnataka
On-site
At PwC, our infrastructure team is dedicated to designing and implementing secure and robust IT systems that facilitate business operations. We focus on ensuring the smooth functioning of networks, servers, and data centers to enhance performance and reduce downtime. As part of the infrastructure engineering team at PwC, you will create and implement scalable technology infrastructure solutions for our clients, encompassing network architecture, server management, and cloud computing.

We are currently seeking a Data Modeler with a solid background in data modeling, metadata management, and data system optimization. In this role, you will be responsible for analyzing business requirements, developing long-term data models, and maintaining the efficiency and consistency of our data systems.

Key Responsibilities:
- Analyze business needs and translate them into long-term data model solutions.
- Evaluate existing data systems and suggest enhancements.
- Define rules for data translation and transformation across different models.
- Collaborate with the development team to design conceptual data models and data flows.
- Establish best practices for data coding to ensure system consistency.
- Review modifications to existing systems to ensure cross-compatibility.
- Implement data strategies and create physical data models.
- Update and optimize local and metadata models.
- Utilize canonical data modeling techniques to improve system efficiency.
- Evaluate implemented data systems for discrepancies, variances, and efficiency.
- Troubleshoot and optimize data systems to achieve optimal performance.

Key Requirements:
- Proficiency in relational and dimensional modeling (OLTP, OLAP).
- Experience with data modeling tools such as Erwin, ER/Studio, Visio, and PowerDesigner.
- Strong skills in SQL and database management systems like Oracle, SQL Server, MySQL, and PostgreSQL.
- Familiarity with NoSQL databases such as MongoDB and Cassandra, including their data structures.
- Hands-on experience with data warehouses and BI tools like Snowflake, Redshift, BigQuery, Tableau, and Power BI.
- Knowledge of ETL processes, data integration, and data governance frameworks.
- Excellent analytical, problem-solving, and communication skills.

Qualifications:
- Bachelor's degree in Engineering or a related field.
- 5 to 9 years of experience in data modeling or related areas.
- 4+ years of practical experience in dimensional and relational data modeling.
- Expertise in metadata management and relevant tools.
- Proficiency in data modeling tools like Erwin, PowerDesigner, or Lucid.
- Understanding of transactional databases and data warehouses.

Preferred Skills:
- Experience with cloud-based data solutions such as AWS, Azure, and GCP.
- Knowledge of big data technologies like Hadoop, Spark, and Kafka.
- Understanding of graph databases and real-time data processing.
- Certifications in data management, modeling, or cloud data engineering.
- Strong communication and presentation skills.
- Excellent interpersonal skills to collaborate effectively with diverse teams.
Posted 2 weeks ago
0.0 - 4.0 years
0 Lacs
Karnataka
On-site
As an Analytics Engineer Intern at Workato, you will play a crucial role in conceptualizing and building data models to support internal stakeholders. Your responsibilities will include designing visually compelling reporting solutions, optimizing analytical workflows, and developing frontend analytics to facilitate data-driven decision-making. Additionally, you will actively contribute to Data Governance initiatives and collaborate with business stakeholders to address data challenges effectively.

Your primary duties will involve building efficient data models, optimizing analytical workflows using macros and packages, and developing frontend analytics tools. By working closely with internal business units, you will identify data pain points and propose solutions to enhance operational efficiency. Your role will also require you to support various operational needs across the organization.

To excel in this role, you should be currently pursuing a degree in Business Analytics, Information Systems, or a related field, and be able to commit for a minimum of 6 months. Proficiency in database concepts, intermediate SQL skills, and experience using business intelligence tools such as Tableau or Power BI are essential. Familiarity with Extract-Load-Transform (ELT) scripts and data warehouses like Snowflake is advantageous.

Beyond technical skills, you should possess strong soft skills and personal characteristics: a fast mover who thrives in a fast-paced environment, with excellent organizational skills and the ability to manage multiple projects simultaneously. Effective analytical, verbal, and written communication skills are necessary to convey actionable insights to business leaders. A collaborative team player with a growth mindset and a proactive attitude toward learning will thrive in this role.
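For context on the ELT scripts mentioned above: unlike ETL, ELT loads raw data into the warehouse first and performs the transformation inside the database with SQL. The sketch below illustrates that shape using SQLite as a stand-in warehouse; the table names and data are hypothetical, not Workato internals:

```python
# Illustrative ELT sketch: extract raw records, load them untransformed,
# then transform with SQL inside the database (the "T after L" of ELT).
import sqlite3

raw_events = [  # extract: rows as they arrive from a source system
    {"user": "a", "amount": "10.5"},
    {"user": "b", "amount": "3.0"},
    {"user": "a", "amount": "2.5"},
]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_events (user TEXT, amount TEXT)")
conn.executemany("INSERT INTO raw_events VALUES (:user, :amount)", raw_events)  # load

# transform: cast and aggregate inside the warehouse
conn.execute("""
    CREATE TABLE user_totals AS
    SELECT user, SUM(CAST(amount AS REAL)) AS total
    FROM raw_events GROUP BY user
""")
totals = dict(conn.execute("SELECT user, total FROM user_totals ORDER BY user"))
print(totals)  # {'a': 13.0, 'b': 3.0}
```

Keeping the raw table untouched and deriving reporting tables from it is what makes ELT workflows reproducible: the transformation can be rerun or revised without re-extracting from the source.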
If you are passionate about analytics, data modeling, and driving operational efficiency through insights, we invite you to apply for this exciting opportunity at Workato. Join us in our mission to empower organizations of all sizes and contribute to shaping the future of enterprise orchestration. We look forward to welcoming you to our dynamic and innovative team!
Posted 3 weeks ago
10.0 - 14.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
We are looking for a highly motivated and experienced Data and Analytics Senior Architect to lead our Master Data Management (MDM) and Data Analytics team. As the architect lead, you will define and implement the overall data architecture strategy to ensure alignment with business goals and support data-driven decision-making. Your role will involve designing scalable, secure, and efficient data systems, including databases, data lakes, and data warehouses. You will evaluate and recommend tools and technologies for data integration, processing, storage, and analytics while staying updated on industry trends. Additionally, you will lead a high-performing team, foster a collaborative and innovative culture, and ensure data integrity, consistency, and availability across the organization.

Our existing MDM solution is based on Microsoft Data Lake Gen2, with Snowflake as the DWH and Power BI managing data from most of our core applications. You will manage the existing solution and drive further development to handle additional data and capabilities, as well as supporting our AI journey. The ideal candidate will possess strong leadership skills, a deep understanding of data management and technology principles, and the ability to collaborate effectively across different departments and functions.

**Principal Duties and Responsibilities:**

**Team Leadership:**
- Lead, mentor, and develop a high-performing team of data analysts and MDM specialists.
- Foster a collaborative and innovative team culture that encourages continuous improvement and efficiency.
- Provide technical leadership and guidance to the development teams and oversee the implementation of IT solutions.

**Architect:**
- Define the overall data architecture strategy, aligning it with business goals and ensuring it supports data-driven decision-making.
- Identify, evaluate, and establish shared enabling technical capabilities for the division in collaboration with IT to ensure consistency, quality, and business value.
- Design and oversee the implementation of data systems, including databases, data lakes, and data warehouses, ensuring they are scalable, secure, efficient, and cost-effective.
- Evaluate and recommend tools and technologies for data integration, processing, storage, and analytics, staying updated on industry trends.

**Strategic Planning:**
- Take part in developing and implementing the MDM and analytics strategy aligned with the overall team and organizational goals.
- Collaborate with the enterprise architect to align on the overall strategy and application landscape, ensuring that MDM and data analytics fit into the overall ecosystem.
- Identify opportunities to enhance data quality, governance, and analytics capabilities.

**Project Management:**
- Oversee project planning, execution, and delivery to ensure timely and successful completion of initiatives and support.
- Monitor project progress and cost, identify risks, and implement mitigation strategies.

**Stakeholder Engagement:**
- Collaborate with cross-functional teams to understand data needs and deliver solutions that support business objectives.
- Serve as a key point of contact for data-related inquiries and support requests.
- Actively develop business cases and proposals for IT investments and present them to senior management, executives, and stakeholders.

**Data/Information Governance:**
- Establish and enforce data/information governance policies and standards to ensure compliance and data integrity.
- Champion best practices in data management and analytics across the organization.

**Reporting and Analysis:**
- Utilize data analytics to derive insights and support decision-making processes.
- Document and present findings and recommendations to senior management and stakeholders.
**Knowledge, Skills and Abilities Required:**
- Bachelor's degree in Computer Science, Data Science, Information Management, or a related field; master's degree preferred.
- 10+ years of experience in data management, analytics, or a related field, with at least 2 years in a leadership role.
- Management advisory skills, such as strategic thinking, problem-solving, business acumen, stakeholder management, and change management.
- Strong knowledge of master data management concepts, data governance, data technology, data modeling, ETL processes, database management, big data technologies, and data integration techniques.
- Excellent project management skills with a proven track record of delivering complex projects on time and within budget.
- Strong analytical, problem-solving, and decision-making abilities.
- Exceptional communication and interpersonal skills, with the ability to engage and influence stakeholders at all levels.
- Team player who is result-oriented and structured, with attention to detail, a drive for accuracy, and a strong work ethic.

**Special Competencies Required:**
- Proven leader with excellent structural skills, good at documenting as well as presenting.
- Strong executional skills: making things happen and delivering value for the entire organization, not just generating ideas.
- Proven experience with analytics tools as well as data ingestion and platforms such as Power BI, Azure Data Lake, and Snowflake.
- Experience working with an MDM solution, preferably TIBCO EBX.
- Experience working with Jira/Confluence.

**Additional Information:**
- Office, remote, or hybrid working.
- Ability to function within variable time zones.
- International travel may be required.

Join us at the ASSA ABLOY Group, where our innovations make spaces, physical and virtual, safer, more secure, and easier to access. As an employer, we value results and empower our people to build their careers around their aspirations and our ambitions.
We foster diverse, inclusive teams and welcome different perspectives and experiences.
Posted 3 weeks ago
9.0 - 13.0 years
0 Lacs
karnataka
On-site
At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture, and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. As a Lead Data Engineer at EY, you will play a crucial role in leading large-scale solution architecture design and optimization to provide streamlined insights to partners throughout the business. You will lead a team of mid-level and senior data engineers and collaborate with the visualization team on data quality and troubleshooting needs. Your key responsibilities will include implementing data processes for the data warehouse and internal systems; leading a team of junior and senior data engineers in executing data processes; managing data architecture; designing ETL processes; and cleaning, aggregating, and organizing data from various sources and transferring it to data warehouses. You will be responsible for leading the development, testing, and maintenance of data pipelines and platforms to enable data quality utilization within business dashboards and tools. Additionally, you will support team members and direct reports in refining and validating data sets; create, maintain, and support the data platform and infrastructure; and collaborate with various teams to understand data requirements and design solutions that enable advanced analytics, machine learning, and predictive modeling. To qualify for this role, you must have a Bachelor's degree in Engineering, Computer Science, Data Science, or a related field, along with 9+ years of experience in software development, data engineering, ETL, and analytics reporting development.
You should possess expertise in building and maintaining data and system integrations using dimensional data modeling and optimized ETL pipelines, as well as experience with modern data architecture and frameworks like data mesh, data fabric, and data product design. Other essential skillsets include proficiency in data engineering programming languages such as Python, distributed data technologies like PySpark, cloud platforms and tools like Kubernetes and AWS services, relational SQL databases, DevOps, continuous integration, and more. You should have a deep understanding of database architecture and administration, excellent written and verbal communication skills, strong organizational skills, problem-solving abilities, and the capacity to work in a fast-paced environment while adapting to changing business priorities. Desired skillsets for this role include a Master's degree in Engineering, Computer Science, Data Science, or a related field, as well as experience in a global working environment. Travel requirements may include access to transportation to attend meetings and the ability to travel regionally and globally. Join EY in building a better working world, where diverse teams in over 150 countries provide trust through assurance and help clients grow, transform, and operate across various sectors.
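The cleaning, aggregating, and organizing work described above can be sketched as a minimal extract-transform-load pipeline. This is an illustrative sketch in plain Python, not EY's actual stack (the posting implies PySpark jobs against real connectors); the source rows, the `region`/`amount` fields, and the list standing in for the warehouse are all hypothetical.

```python
from collections import defaultdict

def extract(rows):
    """Pull raw records from a source (an in-memory list stands in for a real connector)."""
    return [dict(r) for r in rows]

def transform(rows):
    """Clean and aggregate: drop rows missing key fields, then total amounts per region."""
    totals = defaultdict(float)
    for r in rows:
        if r.get("region") and r.get("amount") is not None:
            totals[r["region"]] += float(r["amount"])
    return [{"region": k, "total_amount": v} for k, v in sorted(totals.items())]

def load(rows, warehouse):
    """Append transformed rows to the target table (a list stands in for a warehouse)."""
    warehouse.extend(rows)
    return warehouse

source = [
    {"region": "APAC", "amount": 120.0},
    {"region": "APAC", "amount": 80.0},
    {"region": "EMEA", "amount": 50.0},
    {"region": None, "amount": 10.0},  # rejected by the cleaning step
]
warehouse = load(transform(extract(source)), [])
```

In a real PySpark job the same steps would be a `spark.read`, a filter plus `groupBy().sum()`, and a `write` to the warehouse; keeping the transform a pure function makes it unit-testable either way.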
Posted 3 weeks ago
6.0 - 10.0 years
0 Lacs
hyderabad, telangana
On-site
You will be joining as a Data Analytics Engineer at our office located in DLF Cyber City, Gachibowli, Hyderabad. Your primary responsibility will be to apply your technical knowledge of data systems to solve business problems effectively. You should have a proven track record of developing optimal data models that drive valuable business insights, with a preference for experience using DBT. The ideal candidate will possess expert proficiency in at least one leading BI platform such as DOMO, Tableau, or Power BI. You should have demonstrated experience working in various business subject areas such as marketing, finance, sales, product, customer success, customer support, engineering, or people. Experience in extracting and transforming data from cloud-native sources like Salesforce and NetSuite will be an added advantage. To qualify for this role, you must have a minimum of 6 years of experience in the data space as an analyst, engineer, scientist, or in a similar role, and over 2 years of experience maintaining the same data model system and evolving it to meet new business requirements. The role requires you to have successfully led at least four analytics projects from inception to operationalization. You should also have a demonstrated ability to design and socialize Entity Relationship Diagrams and reference SQL scripts to enhance data understanding and adoption across the organization. Furthermore, the ideal candidate will have experience working with multiple commercial data warehouses, ETL processes, and data visualization tools. Extensive experience in two or more major data subject areas, such as marketing, sales, finance, product, or people, will be highly valued for this position. If you are a proactive problem-solver with a passion for leveraging data to drive business success, we encourage you to apply for this exciting opportunity.
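The DBT-style modeling mentioned above treats each model as a SQL SELECT that the tool materializes as a view or table. A minimal sketch of that staging-then-mart pattern, using an in-memory SQLite database as a stand-in for a real warehouse; the `raw_orders` table and its columns are invented for illustration.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE raw_orders (id INTEGER, customer TEXT, amount REAL, status TEXT);
INSERT INTO raw_orders VALUES
  (1, 'acme',   100.0, 'complete'),
  (2, 'acme',    40.0, 'complete'),
  (3, 'globex',  75.0, 'cancelled');
""")

# In dbt, each model file is just the SELECT; dbt wraps it in the CREATE and
# manages dependencies. Here we materialize both models as views by hand.
conn.execute("""
CREATE VIEW stg_orders AS
SELECT id, customer, amount FROM raw_orders WHERE status = 'complete'
""")
conn.execute("""
CREATE VIEW customer_revenue AS
SELECT customer, SUM(amount) AS revenue FROM stg_orders GROUP BY customer
""")
rows = conn.execute(
    "SELECT customer, revenue FROM customer_revenue ORDER BY customer"
).fetchall()
```

The staging layer filters and renames; the mart layer aggregates. Cancelled orders never reach `customer_revenue` because the staging model already excluded them.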
Posted 3 weeks ago
8.0 - 12.0 years
0 Lacs
pune, maharashtra
On-site
As the Director/Head of Data Engineering for India, you will be responsible for developing and maintaining the data strategy for the Singapore implementation. Your primary goal will be to create a model implementation that can be replicated across the wider PBWM organisation for compliance in other jurisdictions. You will define and execute the data engineering strategy in alignment with business goals and technology roadmaps. Collaborating with the Chief Data Officer/Chief Operating Officer, you will understand the Critical Data Elements (CDEs) and establish controls around them. Your role will involve designing data models and efficient data pipelines, ensuring data quality and integrity, collaborating with data science and analytics teams, and scaling data solutions. Additionally, you will oversee data security and compliance, continuously learn and implement the latest technologies, manage and train the data engineering team, and implement cloud migration for data with appropriate hydrations. Budgeting, resource allocation, implementing data products, ensuring data reconciliation, and upholding high standards and quality in data are also key aspects of this role. In this strategic and senior leadership position, you will oversee data strategy, data engineering, data infrastructure, and data management practices within Private Banking and Wealth Management. Your responsibilities will include managing and developing the data team, delivering outstanding customer-focused service, ensuring quality and quantity are equally prioritized, adhering to policies and procedures, and advocating Barclays values and principles. You will lead effective data management, compliance, and analytics to support business goals, enhance customer experiences, and improve operational efficiencies. Recruiting, training, and developing the data engineering team, fostering collaboration and innovation, providing strategic guidance, and defining KPIs aligned with PBWM goals will be part of your duties.
Collaborating with executive leadership, you will ensure data initiatives support the bank's growth, profitability, and risk management. You will oversee budgeting for data-related initiatives, allocate resources efficiently, and track performance indicators for the data engineering team and infrastructure to drive continuous improvement. The purpose of your role is to build and maintain systems that collect, store, process, and analyze data to ensure accuracy, accessibility, and security. Your accountabilities will include building and maintaining data architectures and pipelines, designing and implementing data warehouses and data lakes, developing processing and analysis algorithms, and collaborating with data scientists to deploy machine learning models. As a Director, you are expected to manage a business function, contribute to strategic initiatives, provide expert advice, manage resourcing and budgeting, ensure compliance, and monitor external environments. Demonstrating leadership behaviours such as listening, inspiring, aligning, and developing others, along with upholding Barclays Values and Mindset, will be key to excelling in this role.
Posted 3 weeks ago
9.0 - 13.0 years
0 Lacs
hyderabad, telangana
On-site
As a Technical Solutions Consultant, you will be the primary point of contact for our largest advertising clients and product partners, overseeing the technical side of the relationship. You will collaborate with cross-functional teams in Engineering, Sales, and Product Management to implement cutting-edge technologies for external clients. Your responsibilities will range from conceptual design and testing to data analysis and support, ensuring the technical execution and business operations of Google's online advertising platforms and product partnerships. Your role will involve striking a balance between business requirements and technical constraints, developing innovative solutions, and serving as a trusted consultant to your collaborators. You will lead the development of tools and automation products, manage the technical aspects of Google's partnerships, drive product strategy, and prioritize projects and resources effectively. The Google Pixel team is dedicated to enhancing the mobile experience by designing and delivering world-class products. By leveraging advanced designs, technologies, and experiences in consumer electronics, the team aims to shape the future of Pixel devices and services. This involves integrating Google's artificial intelligence, software, and hardware to create transformative experiences for users globally.

Your key responsibilities will include:
- Developing Extract, Transform, and Load (ETL) processes to automate data collection and reporting tasks using various large-scale distributed data systems.
- Creating and reviewing technical and end-user documentation, such as requirements and design documents, data standards, and policies.
- Designing, optimizing, and managing data warehouses and reporting systems.
- Leading a team of Data Engineers and Business Intelligence Engineers, collaborating closely with Software Engineering teams to design and acquire log-level data.
- Designing dashboards, optimizing statistical models, and presenting accurate findings to stakeholders.

Your background should ideally include a Bachelor's degree in Science, Technology, Engineering, Mathematics, or related fields, along with at least 9 years of experience in data engineering or data science. A Master's degree and proficiency in programming languages like Python or Java would be advantageous. Experience with source control systems, statistical analysis, and modeling will also be beneficial for this role. Additionally, your ability to drive the development of business solutions and infrastructure, translate analysis results into actionable insights, and prioritize tasks effectively will be critical to your success in this position.
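ETL over log-level data, as in the responsibilities above, usually starts by parsing raw lines into structured records and then aggregating them into reporting tables. A hedged sketch in plain Python; the log format and event fields here are hypothetical, not any actual Google log schema.

```python
import re
from collections import Counter

# Assumed log shape: a severity level followed by an event name.
LOG_LINE = re.compile(r'(?P<level>INFO|WARN|ERROR) (?P<event>\w+)')

def parse_events(log_text):
    """Extract structured (level, event) records from raw log lines; skip unparseable lines."""
    records = []
    for line in log_text.splitlines():
        m = LOG_LINE.search(line)
        if m:
            records.append(m.groupdict())
    return records

def event_counts(records):
    """Aggregate into a reporting-table shape: number of events per severity level."""
    return Counter(r["level"] for r in records)

logs = """\
INFO start
WARN retry
INFO start
ERROR crash
"""
counts = event_counts(parse_events(logs))
```

At scale the same parse-then-aggregate shape would run as a distributed job, but the per-record logic stays identical.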
Posted 3 weeks ago
12.0 - 15.0 years
35 - 40 Lacs
Gurugram, Bengaluru, Mumbai (All Areas)
Work from Office
Role: Data Solution Architect
Location: Bangalore, Chennai, Hyderabad, Kochi, Mumbai, Kolkata, Noida, Gurgaon (any office, but candidates must be based locally)
Experience: 12-15 years
Budget: 35-40 LPA
Only immediate joiners.

Role Objectives:
- Design and maintain enterprise data architecture to support the analysis and interpretation of complex datasets from diverse sources, transforming them into actionable business insights.
- Lead deep technical investigations into complex data challenges to define solution architecture, propose target-state models, and identify key benefits.
- Apply industry-standard templates, methodologies, and best practices for data architecture, integration, and governance.
- Implement enterprise data management capabilities, including data catalogues, business glossaries, metadata management, data lineage, and reference/master data frameworks.
- Drive adoption of cloud-native platforms such as the Azure Data Platform, Informatica, and automation tools for scalable data solutions.
- Architect automated data processing workflows using tools like Power Automate, Python, macros, and custom scripts to streamline data operations.
- Identify and implement data architecture optimizations for efficiency, performance, and scalability.
- Collaborate with cross-functional teams to enable data-driven decision-making through architectural guidance and scalable data solutions.
- Enforce and promote data management best practices and architectural governance standards.

Stakeholder Management:
- Partner with Product Owners and Data Governance Leaders to promote adoption and alignment of data architecture with business objectives.
- Ensure data solutions are understandable and usable by technical and non-technical stakeholders.
- Act as a data architecture evangelist, capturing feedback and driving continuous improvement.

Governance and Compliance:
- Operate in alignment with organizational policies, data standards, and regulatory requirements.
- Ensure adherence to enterprise data governance frameworks, policies, and audit controls.

Essential Skills:
- Strong experience as a Data Solution Architect in financial services (banking, superannuation, insurance, etc.) and expertise in customer data architecture.
- Proficiency in data architecture, data modeling, data warehouses, data lake design, and advanced analytics platforms.
- Experience in data transformation logic, data ingestion patterns, and data quality frameworks.
- Expertise in documenting and aligning business data requirements with technical solutions.
- Strategic thinker with strong collaboration and leadership capabilities.
- Excellent communication skills, translating business objectives into scalable data solutions.
- Project and stakeholder management, ensuring architectural deliverables stay on track.
- Proven ability to manage multiple concurrent data architecture initiatives.

Specific skills we are seeking candidates to possess and demonstrate:
- Ability to lead and operate independently, while collaborating with business users, architects, data engineers, testers, and analysts.
- Strong understanding of business workflows and enterprise systems, and of gathering technical and business data requirements.
- Proficiency in creating data architecture artifacts, including ERDs, interface specs, data dictionaries, transformation logic, and lineage diagrams.
- Hands-on experience with complex SQL, ETL pipelines, and cloud-based data platforms.

Additionally, it would be advantageous if candidates can:
- Develop Power BI semantic models and design visual analytics dashboards for key business metrics.
- Evaluate and optimize data warehouse architecture and cloud migration strategies.
- Create data quality frameworks and reconciliation strategies across enterprise data sources.
- Lead or contribute to data governance initiatives and architectural enhancements.
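A reconciliation strategy of the kind listed under the advantageous skills typically compares row counts and measure totals between a source extract and the loaded target. A minimal sketch, assuming dict-shaped rows and a single numeric measure; the `amt` field and tolerance are invented for illustration.

```python
def reconcile(source, target, measure):
    """Compare row counts and measure totals between source rows and loaded target rows."""
    src_total = sum(r[measure] for r in source)
    tgt_total = sum(r[measure] for r in target)
    return {
        "row_delta": len(source) - len(target),
        "measure_delta": round(src_total - tgt_total, 2),
        # Balanced only when both counts match and totals agree within tolerance.
        "in_balance": len(source) == len(target) and abs(src_total - tgt_total) < 1e-9,
    }

balanced = reconcile([{"amt": 10.0}, {"amt": 5.0}], [{"amt": 10.0}, {"amt": 5.0}], "amt")
drifted = reconcile([{"amt": 10.0}, {"amt": 5.0}], [{"amt": 10.0}], "amt")
```

In practice the same check runs as paired aggregate queries against the source system and the warehouse, with the deltas logged to a reconciliation report rather than computed in memory.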
Posted 3 weeks ago
6.0 - 10.0 years
0 Lacs
pune, maharashtra
On-site
We are seeking a talented Data Engineer with a strong background in data engineering to design, build, and maintain data pipelines using various technologies, with a focus on the Microsoft Azure cloud platform. Your responsibilities will include designing, developing, and implementing data pipelines using Azure Data Factory (ADF) or other orchestration tools. You will write efficient SQL queries for data extraction, transformation, and loading (ETL) into Azure Synapse Analytics, and use PySpark and Python to handle complex data processing tasks on large datasets within Azure Databricks. Collaboration with data analysts to understand data requirements and ensure data quality is a key aspect of the role. You will also design and develop data lakes and warehouses, implement data governance practices for security and compliance, monitor and maintain data pipelines for optimal performance, develop unit tests for data pipeline code, and work collaboratively with other engineers and data professionals in an Agile development environment.

Preferred Skills & Experience:
- Good knowledge of PySpark and working knowledge of Python
- Full-stack Azure data engineering skills (Azure Data Factory, Databricks, and Synapse Analytics)
- Experience handling large datasets
- Hands-on experience designing and developing data lakes and warehouses

Job Types: Full-time, Permanent
Schedule: Day shift, Monday to Friday
Application Questions:
- When can you join? (mention in days)
- Are you serving a notice period? (Yes/No)
- What is your current and expected CTC?
Education: Bachelor's (Preferred)
Experience: Total work: 6 years (Required); Data engineering on Azure: 6 years (Required)
Location: Pune, Maharashtra (Required)
Work Location: In person
Only immediate joiners are preferred.
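Developing unit tests for data pipeline code, as this role asks, is easiest when transformations are factored into pure functions that run without a cluster. An illustrative sketch: a keep-latest-record-per-key dedupe step of the kind often used when loading into Synapse or Databricks; the `id`/`updated_at` field names are assumptions for the example.

```python
def dedupe_latest(rows, key="id", version="updated_at"):
    """Keep only the most recent record per key, a common step before a warehouse load."""
    latest = {}
    for r in rows:
        k = r[key]
        # Later versions (lexicographically comparable timestamps here) win.
        if k not in latest or r[version] > latest[k][version]:
            latest[k] = r
    return sorted(latest.values(), key=lambda r: r[key])

rows = [
    {"id": 1, "updated_at": "2024-01-01"},
    {"id": 1, "updated_at": "2024-02-01"},
    {"id": 2, "updated_at": "2024-01-15"},
]
result = dedupe_latest(rows)
```

The same logic in PySpark would be a window over `id` ordered by `updated_at` descending with a `row_number() == 1` filter; keeping a pure-Python twin of the rule makes the unit test trivial.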
Posted 3 weeks ago
4.0 - 8.0 years
0 Lacs
chennai, tamil nadu
On-site
The Company: Our beliefs are the foundation for how we conduct business every day. We live each day guided by our core values of Inclusion, Innovation, Collaboration, and Wellness. Together, our values ensure that we work as one global team with our customers at the center of everything we do, and they push us to take care of ourselves, each other, and our communities.

Job Description Summary: What you need to know about the role: a Business Systems Analyst passionate about delivering quality deliverables in a fast-paced environment with an undivided customer focus.

Meet our team: The Finance Technology team consists of a diverse group of talented, driven, hive-minded subject matter experts who relentlessly work towards enabling best-in-class solutions for our customers to transform current-state solutions. You will work with this team to set up finance solutions, explore avenues to automate, challenge the status quo, and simplify the current state through transformation.

Your day to day:
- Build scalable systems by leading discussions with the business, understanding the requirements from both customers and the business, and delivering requirements to the engineering team to guide them in building a robust, scalable solution.
- Bring hands-on technical experience to provide support across multiple platforms (GCP, Python, Hadoop, SAP, Teradata, Machine Learning).
- Establish a consistent project management framework and develop processes to deliver high-quality software in rapid iterations for business partners in multiple geographies.
- Participate in a team that designs, develops, troubleshoots, and debugs software programs for databases, applications, tools, etc.
- Balance production platform stability, feature delivery, and the reduction of technical debt across a broad landscape of technologies.

What you need to bring:
- Consistently high standards; your passion for quality is inherent in everything you do.
- Experience with GCP BigQuery, SQL, and Dataflow.
- 4+ years of relevant experience.
- Data warehouses, data marts, distributed data platforms, and data lakes.
- Data modeling and schema design.
- Reporting/visualization with Looker, Tableau, or Power BI.
- Knowledge of statistical and machine learning models.
- Excellent structured thinking skills, with the ability to break down multi-dimensional problems.
- Ability to navigate ambiguity and work in a fast-moving environment with multiple stakeholders.

We know the confidence gap and imposter syndrome can get in the way of meeting spectacular candidates. Please don't hesitate to apply.

Who We Are: To learn more about our culture and community, visit https://about.pypl.com/who-we-are/default.aspx

For general requests for consideration of your skills, please join our Talent Community.

REQ ID R0115599
Posted 3 weeks ago
10.0 - 14.0 years
1 - 10 Lacs
Bengaluru
Work from Office
Responsibilities:
* Design enterprise architectures for AI deployments, data lakes, and data warehouses.
* Lead legacy system modernization initiatives.
* Ensure compliance with NIST, ISO, and GDPR standards.
* Align AI, cloud, and security with business goals.
Posted 3 weeks ago
7.0 - 12.0 years
15 - 25 Lacs
Hyderabad, Bengaluru, Mumbai (All Areas)
Hybrid
Job Summary: We are seeking a highly motivated and experienced Senior Data Engineer to join our team. This role requires a deep curiosity about our business and a passion for technology and innovation. You will be responsible for designing and developing robust, scalable data engineering solutions that drive our business intelligence and data-driven decision-making processes. If you thrive in a dynamic environment and have a strong desire to deliver top-notch data solutions, we want to hear from you.

Key Responsibilities:
- Collaborate with agile teams to design and develop cutting-edge data engineering solutions.
- Build and maintain distributed, low-latency, and reliable data pipelines, ensuring high availability and timely delivery of data.
- Design and implement optimized data engineering solutions for Big Data workloads to handle increasing data volumes and complexities.
- Develop high-performance real-time data ingestion solutions for streaming workloads.
- Adhere to best practices and established design patterns across all data engineering initiatives.
- Ensure code quality through elegant design, efficient coding, and performance optimization.
- Focus on data quality and consistency by implementing monitoring processes and systems.
- Produce detailed design and test documentation, including Data Flow Diagrams, Technical Design Specs, and Source-to-Target Mapping documents.
- Perform data analysis to troubleshoot and resolve data-related issues.
- Automate data engineering pipelines and data validation processes to eliminate manual interventions.
- Implement data security and privacy measures, including access controls, key management, and encryption techniques.
- Stay updated on technology trends, experimenting with new tools and educating team members.
- Collaborate with analytics and business teams to improve data models and enhance data accessibility.
- Communicate effectively with both technical and non-technical stakeholders.

Qualifications:
- Education: Bachelor's degree in Computer Science, Computer Engineering, or a related field.
- Experience: Minimum of 5+ years in architecting, designing, and building data engineering solutions and data platforms.
- Proven experience building Lakehouses or Data Warehouses on platforms like Databricks or Snowflake.
- Expertise in designing and building highly optimized batch/streaming data pipelines using Databricks.
- Proficiency with data acquisition and transformation tools such as Fivetran and dbt.
- Strong experience building efficient data engineering pipelines using Python and PySpark.
- Experience with distributed data processing frameworks such as Apache Hadoop, Apache Spark, or Flink.
- Familiarity with real-time data stream processing using tools like Apache Kafka, Kinesis, or Spark Structured Streaming.
- Experience with various AWS services, including S3, EC2, EMR, Lambda, RDS, DynamoDB, Redshift, and Glue Catalog.
- Expertise in advanced SQL programming and performance tuning.

Key Skills:
- Strong problem-solving abilities and perseverance in the face of ambiguity.
- Excellent emotional intelligence and interpersonal skills.
- Ability to build and maintain productive relationships with internal and external stakeholders.
- A self-starter mentality with a focus on growth and quick learning.
- Passion for operational products and creating outstanding employee experiences.
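The real-time ingestion and CDC-style work in the qualifications above ultimately reduces to applying ordered insert/update/delete change events to a keyed target. A minimal sketch in plain Python (a real pipeline would do this with Spark Structured Streaming from Kafka, or a MERGE into Delta/Redshift); the event shape with `op`, `id`, and `data` keys is invented for illustration.

```python
def apply_cdc(target, events):
    """Apply ordered change-data-capture events to a keyed target dict.

    op "I" inserts, "U" updates (both upsert here), "D" deletes.
    Event order matters: a later update overwrites an earlier insert.
    """
    for e in events:
        if e["op"] in ("I", "U"):
            target[e["id"]] = e["data"]
        elif e["op"] == "D":
            target.pop(e["id"], None)
    return target

state = apply_cdc({}, [
    {"op": "I", "id": 1, "data": {"name": "a"}},
    {"op": "U", "id": 1, "data": {"name": "b"}},
    {"op": "I", "id": 2, "data": {"name": "c"}},
    {"op": "D", "id": 2, "data": None},
])
```

Deletes are idempotent here (`pop` with a default), which matters when a streaming source replays events after a failure.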
Posted 1 month ago
3.0 - 6.0 years
40 - 45 Lacs
Kochi, Kolkata, Bhubaneswar
Work from Office
We are seeking experienced Data Engineers with over 3 years of experience to join our team at Intuit, through Cognizant. The selected candidates will be responsible for developing and maintaining scalable data pipelines, managing data warehousing solutions, and working with advanced cloud environments. The role requires strong technical proficiency and the ability to work onsite in Bangalore.

Key Responsibilities:
- Design, build, and maintain data pipelines to ingest, process, and analyze large datasets using PySpark.
- Work on Data Warehouse and Data Lake solutions to manage structured and unstructured data.
- Develop and optimize complex SQL queries for data extraction and reporting.
- Leverage AWS cloud services such as S3, EC2, EMR, Athena, and Redshift for data storage, processing, and analytics.
- Collaborate with cross-functional teams to ensure the successful delivery of data solutions that meet business needs.
- Monitor data pipelines and troubleshoot any issues related to data integrity or system performance.

Required Skills:
- 3+ years of experience in data engineering or related fields.
- In-depth knowledge of Data Warehouses and Data Lakes.
- Proven experience building data pipelines using PySpark.
- Strong expertise in SQL for data manipulation and extraction.
- Familiarity with AWS cloud services, including S3, EC2, EMR, Athena, Redshift, and other cloud computing platforms.

Preferred Skills:
- Python programming experience is a plus.
- Experience working in Agile environments with tools like JIRA and GitHub.
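Pipelines that land data on S3 for Athena, as described above, commonly write Hive-style partition paths (`key=value` directories) so the partitions can be discovered by Glue crawlers or registered with `MSCK REPAIR TABLE`. A small sketch of that path convention; the bucket and table names are placeholders.

```python
import datetime

def partition_path(bucket, table, dt):
    """Build a Hive-style partition prefix that Athena/Glue can discover.

    Zero-padded month/day keep partition values lexicographically sortable,
    which lets range predicates on the partition columns prune scans.
    """
    return f"s3://{bucket}/{table}/year={dt:%Y}/month={dt:%m}/day={dt:%d}/"

path = partition_path("lake", "events", datetime.date(2024, 3, 7))
```

A PySpark writer achieves the same layout with `df.write.partitionBy("year", "month", "day")`; the explicit builder is useful when naming objects directly, for example from a Lambda ingester.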
Posted 1 month ago
10.0 - 14.0 years
12 - 16 Lacs
Hyderabad
Work from Office
- Proven expert at writing SQL code, with at least 10 years of experience.
- 5+ years of experience working with large data volumes, with transactions on the order of 5-10M records.
- 5+ years of experience modeling loosely coupled relational databases that can store terabytes or petabytes of data.
- 3+ years of proven expertise working with large Data Warehouses.
- Expert at ETL transformations using SSIS.
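Working with transactions on the order of 5-10M records usually means loading in bounded batches so each transaction stays small and the log does not balloon. A sketch using SQLite as a stand-in for SQL Server (the posting implies SSIS against a SQL Server target); the `txns` table and the tiny batch size are illustrative only.

```python
import sqlite3

def load_in_batches(conn, rows, batch_size=2):
    """Insert large row volumes in bounded batches: one commit per batch, not per row."""
    cur = conn.cursor()
    for i in range(0, len(rows), batch_size):
        cur.executemany(
            "INSERT INTO txns (id, amount) VALUES (?, ?)",
            rows[i:i + batch_size],
        )
        conn.commit()  # keeps each transaction small and restartable

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE txns (id INTEGER, amount REAL)")
load_in_batches(conn, [(i, i * 1.5) for i in range(5)])
count, total = conn.execute("SELECT COUNT(*), SUM(amount) FROM txns").fetchone()
```

In SSIS the equivalent knobs are the OLE DB destination's batch size and maximum insert commit size; the principle, bounded transactions over millions of rows, is the same.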
Posted 1 month ago
10.0 - 15.0 years
25 - 40 Lacs
Bengaluru
Work from Office
About Client: Hiring for one of the most prestigious multinational corporations!

Job Title: AWS Solution Architect
Qualification: Any Graduate or above
Relevant Experience: 10-15 years
Required Technical Skill Set: Data lakes, data warehouses, AWS Glue, Aurora with Postgres, MySQL, and DynamoDB
Location: Bangalore
CTC Range: 25-40 LPA
Notice Period: Any
Shift Timing: N/A
Mode of Interview: Virtual
Mode of Work: WFO (Work From Office)

Contact: Pooja Singh KS, IT Staffing Analyst, Black and White Business Solutions Pvt Ltd, Bangalore, Karnataka, India. pooja.singh@blackwhite.in | www.blackwhite.in
Posted 1 month ago
3.0 - 5.0 years
20 - 22 Lacs
Udaipur
Work from Office
- 3-5 years of experience in Data Engineering or similar roles.
- Strong foundation in cloud-native data infrastructure and scalable architecture design.
- Build and maintain reliable, scalable ETL/ELT pipelines using modern cloud-based tools.
- Design and optimize Data Lakes and Data Warehouses for real-time and batch processing.
Posted 1 month ago
7.0 - 12.0 years
50 - 95 Lacs
Bengaluru
Hybrid
Seeking a Principal Data Engineer with the following skills; an excellent opportunity for top-tier coding experts:
- Experience building and maintaining a Lakehouse architecture.
- Advanced expertise in Python and SQL.
- Expert at leveraging cloud-based data platform (Snowflake, Databricks) capabilities to create an enterprise lakehouse.
- Advanced expertise with the AWS ecosystem.
- Experience using Spark for data ingestion/processing.

Key Responsibilities:
- Design, develop, and maintain scalable data pipelines and architectures to support data ingestion, integration, and analytics.
- Be accountable for technical delivery and take ownership of solutions.
- Lead a team of senior and junior developers, providing mentorship and guidance.
- Collaborate with enterprise architects, business analysts, and stakeholders to understand data requirements, validate designs, and communicate progress.
- Drive technical innovation within the department to increase code reusability, code quality, and developer productivity.
- Challenge the status quo by bringing the very latest data engineering practices and techniques.
Posted 1 month ago
1.0 - 3.0 years
3 - 6 Lacs
Hyderabad
Hybrid
What you will do In this vital role, you will be responsible for the end-to-end development of an enterprise analytics and data mastering solution using Databricks and Power BI. This role requires expertise in both data architecture and analytics, with the ability to create scalable, reliable, and impactful enterprise solutions that research cohort-building and advanced research pipeline. The ideal candidate will have experience creating and surfacing large unified repositories of human data, based on integrations from multiple repositories and solutions, and be extraordinarily skilled with data analysis and profiling. You will collaborate closely with key customers, product team members, and related IT teams, to design and implement data models, integrate data from various sources, and ensure best practices for data governance and security. The ideal candidate will have a good background in data warehousing, ETL, Databricks, Power BI, and enterprise data mastering. Design and build scalable enterprise analytics solutions using Databricks, Power BI, and other modern data tools. Leverage data virtualization, ETL, and semantic layers to balance need for unification, performance, and data transformation with goal to reduce data proliferation Break down features into work that aligns with the architectural direction runway Participate hands-on in pilots and proofs-of-concept for new patterns Create robust documentation from data analysis and profiling, and proposed designs and data logic Develop advanced sql queries to profile, and unify data Develop data processing code in sql, along with semantic views to prepare data for reporting Develop PowerBI Models and reporting packages Design robust data models, and processing layers, that support both analytical processing and operational reporting needs. Design and develop solutions based on best practices for data governance, security, and compliance within Databricks and Power BI environments. 
Ensure the integration of data systems with other enterprise applications, creating seamless data flows across platforms. Develop and maintain Power BI solutions, ensuring data models and reports are optimized for performance and scalability. Collaborate with key customers to define data requirements, functional specifications, and project goals. Continuously evaluate and adopt new technologies and methodologies to enhance the architecture and performance of data solutions. What we expect of you We are all different, yet we all use our unique contributions to serve patients. The R&D Data Catalyst Team is responsible for building Data Searching, Cohort Building, and Knowledge Management tools that provide the Amgen scientific community with visibility to Amgens wealth of human datasets, projects and study histories, and knowledge over various scientific findings. These solutions are pivotal tools in Amgens goal to accelerate the speed of discovery, and speed to market of advanced precision medications. Basic Qualifications: Masters degree and 1 to 3 years of Data Engineering experience OR Bachelors degree and 3 to 5 years of Data Engineering experience OR Diploma and 7 to 9 years of Data Engineering experience Must-Have Skills: Minimum of 3 years of hands-on experience with BI solutions (Preferable Power BI or Business Objects) including report development, dashboard creation, and optimization. Minimum of 3 years of hands-on experience building Change-data-capture (CDC) ETL pipelines, data warehouse design and build, and enterprise-level data management. Hands-on experience with Databricks, including data engineering, optimization, and analytics workloads. Deep understanding of Power BI, including model design, DAX, and Power Query. Proven experience designing and implementing data mastering solutions and data governance frameworks. Expertise in cloud platforms (AWS), data lakes, and data warehouses. 
- Strong knowledge of ETL processes, data pipelines, and integration technologies.
- Good communication and collaboration skills to work with cross-functional teams and senior leadership.
- Ability to assess business needs and design solutions that align with organizational goals.
- Exceptional hands-on capabilities in data profiling, data transformation, and data mastering.
- Success in mentoring and training team members.

Good-to-Have Skills:
- Experience in developing differentiated and deliverable solutions.
- Experience with human data, ideally human healthcare data.
- Familiarity with laboratory testing, patient data from clinical care, HL7, FHIR, and/or clinical trial data management.

Professional Certifications:
- ITIL Foundation or other relevant certifications (preferred)
- SAFe Agile Practitioner (6.0)
- Microsoft Certified: Data Analyst Associate (Power BI) or related certification
- Databricks Certified Professional or similar certification

Soft Skills:
- Excellent analytical and troubleshooting skills
- Deep intellectual curiosity
- The highest degree of initiative and self-motivation
- Strong verbal and written communication skills, including presenting complex technical and business topics to varied audiences
- Confident technical leadership
- Ability to work effectively with global, remote teams, including the use of tools and artifacts to ensure clear and efficient collaboration across time zones
- Ability to handle multiple priorities successfully
- Team-oriented, with a focus on achieving team goals
- Strong problem-solving and analytical skills; ability to learn quickly and to retain and synthesize complex information from diverse sources
Posted 1 month ago
2.0 - 6.0 years
7 - 17 Lacs
Hyderabad
Work from Office
In this role, you will:
- Consult with business lines and enterprise functions on less complex research
- Use functional knowledge to assist in building non-model quantitative tools that support strategic decision making
- Perform analysis of findings and trends using statistical analysis and document the process
- Present recommendations to increase revenue, reduce expenses, and maximize operational efficiency, quality, and compliance
- Identify and define business requirements, translating data and business needs into research and recommendations to improve efficiency
- Participate in all group technology efforts, including design and implementation of database structures, analytics software, storage, and processing
- Develop customized reports and ad hoc analyses to make recommendations and provide guidance to less experienced staff
- Understand compliance and risk management requirements for the supported area
- Ensure adherence to data management and data governance regulations and policies
- Participate in company initiatives or processes to help meet risk and capital objectives and other strategic goals
- Collaborate and consult with more experienced consultants and with partners in technology and other business groups

Required Qualifications:
- 2+ years of Analytics experience, or equivalent demonstrated through one or a combination of the following: work experience, training, military experience, education

Desired Qualifications:
- Experience in Analytics, or equivalent demonstrated through one or a combination of the following: work experience, training, military experience, education
- Excellent verbal, written, and interpersonal communication skills
- Strong knowledge of enterprise risk programs and applicability of the risk management framework (three lines of defense)
- Experience identifying internal and external data from multiple sources across the business
- Experience with SQL, Teradata, or SAS, and database management systems such as Teradata and MS SQL Server
- Experience in risk (including compliance, financial crimes, operational, audit, legal, credit risk, and market risk)
- Experience with data visualization and business intelligence tools
- Advanced Microsoft Office (Word, Excel, Outlook, and PowerPoint) skills
- Demonstrated strong analytical skills with high attention to detail and accuracy
- Strong presentation skills and the ability to translate and present data in a manner that educates, enhances understanding, and influences decisions, with a bias for simplicity
- Strong writing skills, with a proven ability to translate data sets and conclusions drawn from analysis into business/executive format and language
- Ability to support multiple projects with tight timelines
- Metadata management, data lineage, data element mapping, and data documentation experience
- Experience researching and resolving data problems and working with technology teams on remediation of data issues
- Hands-on proficiency with Python, Power BI (Power Query, DAX, Power Apps), Tableau, or SAS
- Knowledge of defect management tools such as HP ALM
- Knowledge of data governance

Job Expectations:
- Ensure adherence to data management and data governance regulations and policies
- Extract and analyze data from multiple technology systems/platforms and related data sources to identify factors that pose a risk to the firm
- Consult with business lines and enterprise functions on less complex research
- Understand compliance and risk management requirements for sanctions compliance and data management
- Perform analysis of findings and trends using statistical analysis and document the process
- Bring a solid background in reporting, understand and utilize relational databases and data warehouses, and query and report on large and complex data sets effectively
- Excel at telling stories with data, presenting information in visually compelling ways that appeal to executive audiences, and be well versed in the development and delivery of reporting solutions
- Build easy-to-use visualizations and perform data analysis to generate meaningful business insights from complex datasets for global stakeholders
- Test key reports and produce process documentation
- Present recommendations to maximize operational efficiency, quality, and compliance
- Identify and define business requirements, translating data and business needs into research and recommendations to improve efficiency
- Develop customized reports and ad hoc analyses to make recommendations and provide guidance to less experienced staff
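The statistical trend analysis this role calls for can be sketched minimally with Python's standard library. The monthly exception counts below are invented for illustration; a real analysis would pull them from the warehouse via SQL and likely use a richer method than a one-standard-deviation threshold.

```python
import statistics

# Hypothetical monthly counts of data-quality exceptions.
monthly_exceptions = [42, 45, 39, 51, 48, 60]

mean = statistics.mean(monthly_exceptions)
stdev = statistics.stdev(monthly_exceptions)

# Simple outlier flag: months exceeding mean + 1 sample standard deviation.
threshold = mean + stdev
flagged = [i + 1 for i, v in enumerate(monthly_exceptions) if v > threshold]
print(round(mean, 2), flagged)
```

Documenting the threshold choice alongside the result is what turns a one-off query into the repeatable, auditable process the posting describes.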
Posted 1 month ago