5.0 - 9.0 years
0 Lacs
noida, uttar pradesh
On-site
As a Data Engineer at GlobalLogic, you will be responsible for architecting, building, and maintaining complex ETL/ELT pipelines for batch and real-time data processing using various tools and programming languages. Your key duties will include optimizing existing data pipelines for performance, cost-effectiveness, and reliability, as well as implementing data quality checks, monitoring, and alerting mechanisms to ensure data integrity. Additionally, you will play a crucial role in ensuring data security, privacy, and compliance with relevant regulations such as GDPR and local data laws.

To excel in this role, you should possess a Bachelor's or Master's degree in Computer Science, Engineering, or a related technical field. Excellent analytical, problem-solving, and critical thinking skills with meticulous attention to detail are essential. Strong communication (written and verbal) and interpersonal skills are also required, along with the ability to collaborate effectively with cross-functional teams. Experience with Agile/Scrum development methodologies is considered a plus.

Your responsibilities will involve providing technical leadership and architecture by designing and implementing robust, scalable, and efficient data architectures that align with organizational strategy and future growth. You will define and enforce data engineering best practices, evaluate and recommend new technologies, and oversee the end-to-end data development lifecycle. As a leader, you will mentor and guide a team of data engineers, conduct code reviews, provide feedback, and promote a culture of engineering excellence. You will collaborate closely with data scientists, data analysts, software engineers, and business stakeholders to understand data requirements and translate them into technical solutions. Your role will also involve communicating complex technical concepts and data strategies effectively to both technical and non-technical audiences.
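The data quality checks, monitoring, and alerting this role describes can be sketched in plain Python. This is a minimal illustration, not GlobalLogic's actual tooling; the record shape, field names, and `QualityReport` type are all assumptions:

```python
from dataclasses import dataclass, field

@dataclass
class QualityReport:
    row_count: int = 0
    null_violations: dict = field(default_factory=dict)
    passed: bool = True

def run_quality_checks(rows, required_fields, min_rows=1):
    """Validate a batch of records before loading it downstream."""
    report = QualityReport(row_count=len(rows))
    for f in required_fields:
        # Count records where a required field is missing or None.
        missing = sum(1 for r in rows if r.get(f) is None)
        if missing:
            report.null_violations[f] = missing
    report.passed = len(rows) >= min_rows and not report.null_violations
    return report

batch = [{"id": 1, "amount": 10.0}, {"id": 2, "amount": None}]
report = run_quality_checks(batch, required_fields=["id", "amount"])
print(report.passed)           # False: "amount" is null in one record
print(report.null_violations)  # {'amount': 1}
```

In a real pipeline a failing report would trigger the alerting mechanism (a pager, a Slack hook) rather than a print.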
At GlobalLogic, we offer a culture of caring, continuous learning and development opportunities, interesting and meaningful work, balance and flexibility, and a high-trust environment. By joining our team, you will have the chance to work on impactful projects, engage your curiosity and problem-solving skills, and contribute to shaping cutting-edge solutions that redefine industries. With a commitment to integrity and trust, GlobalLogic provides a safe, reliable, and ethical global environment where you can thrive both personally and professionally.
Posted 2 days ago
4.0 - 8.0 years
0 Lacs
pune, maharashtra
On-site
YASH Technologies is a leading technology integrator specializing in helping clients reimagine operating models, enhance competitiveness, optimize costs, foster exceptional stakeholder experiences, and drive business transformation. At YASH, you will be part of a team of innovative professionals working with cutting-edge technologies. Our purpose is anchored in bringing real positive changes in an increasingly virtual world, transcending generational gaps and future disruptions.

We are currently seeking SQL Professionals for the role of Data Engineer with 4-6 years of experience. The ideal candidate must have a strong academic background.

As a Data Engineer at BNY Mellon in Pune, you will be responsible for designing, developing, and maintaining scalable data pipelines and ETL processes using Apache Spark and SQL. You will collaborate with data scientists and analysts to understand data requirements, optimize queries over large datasets, ensure data quality and integrity, implement data governance and security best practices, participate in code reviews, and troubleshoot data-related issues promptly.

Qualifications for this role include 4-6 years of experience in data engineering, proficiency in SQL and data processing frameworks like Apache Spark, knowledge of database technologies such as SQL Server or Oracle, experience with cloud platforms like AWS, Azure, or Google Cloud, familiarity with data warehousing solutions, understanding of Python, Scala, or Java for data manipulation, excellent analytical and problem-solving skills, and good communication skills to work effectively in a team environment.

Joining YASH means being empowered to shape your career in an inclusive team environment. We offer career-oriented skilling models and promote continuous learning, unlearning, and relearning at a rapid pace.
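The extract-load-transform pattern this role centres on can be sketched with Python's built-in sqlite3 standing in for SQL Server or Oracle. The `orders` schema and data are hypothetical, chosen only to show the shape of the work:

```python
import sqlite3

# Extract: raw order events from an upstream source (here, an in-memory list).
raw_orders = [
    ("o-1", "alice", 120.0),
    ("o-2", "bob", 80.0),
    ("o-3", "alice", 200.0),
]

# Load into a staging table, then transform with SQL.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id TEXT, customer TEXT, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", raw_orders)

# Transform: aggregate spend per customer, the kind of query analysts consume.
totals = dict(conn.execute(
    "SELECT customer, SUM(amount) FROM orders GROUP BY customer"
).fetchall())
print(totals)  # {'alice': 320.0, 'bob': 80.0}
```

At Spark scale the same GROUP BY would run distributed, but the SQL itself is unchanged.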
Our workplace is based on four principles: flexible work arrangements, free spirit, and emotional positivity; agile self-determination, trust, transparency, and open collaboration; all support needed for the realization of business goals; and stable employment with a great atmosphere and ethical corporate culture.
Posted 3 days ago
5.0 - 10.0 years
0 Lacs
hyderabad, telangana
On-site
You are an experienced Azure Databricks Engineer who will be responsible for designing, developing, and maintaining scalable data pipelines and supporting data infrastructure in an Azure cloud environment. Your key responsibilities will include designing ETL pipelines using Azure Databricks, building robust data architectures on Azure, collaborating with stakeholders to define data requirements, optimizing data pipelines for performance and reliability, implementing data transformations and cleansing processes, managing Databricks clusters, and leveraging Azure services for data orchestration and storage.

You must possess 5-10 years of experience in data engineering or a related field with extensive hands-on experience in Azure Databricks and Apache Spark. Strong knowledge of Azure cloud services such as Azure Data Lake, Data Factory, Azure SQL, and Azure Synapse Analytics is required. Experience with Python, Scala, or SQL for data manipulation, ETL frameworks, Delta Lake, Parquet formats, Azure DevOps, CI/CD pipelines, big data architecture, and distributed systems is essential. Knowledge of data modeling, performance tuning, and optimization of big data solutions is expected, along with problem-solving skills and the ability to work in a collaborative environment.

Preferred qualifications include experience with real-time data streaming tools, Azure certifications, machine learning frameworks, integration with Databricks, and data visualization tools like Power BI. A bachelor's degree in Computer Science, Data Engineering, Information Technology, or a related field is required for this role.
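The Delta Lake and Parquet formats mentioned above both rely on partitioned storage so that queries can skip irrelevant files. A stdlib-only sketch of Hive-style date partitioning follows; the JSON files and directory names are illustrative stand-ins (a real lake would write Parquet via Spark's `partitionBy`):

```python
import json
import tempfile
from pathlib import Path

def write_partitioned(records, root):
    """Write records under Hive-style date partitions (date=YYYY-MM-DD/)."""
    root = Path(root)
    for rec in records:
        part_dir = root / f"date={rec['event_date']}"
        part_dir.mkdir(parents=True, exist_ok=True)
        with open(part_dir / "part-0.json", "a") as f:
            f.write(json.dumps(rec) + "\n")

def read_partition(root, event_date):
    """Partition pruning: only the matching directory is read at all."""
    part = Path(root) / f"date={event_date}" / "part-0.json"
    if not part.exists():
        return []
    return [json.loads(line) for line in part.read_text().splitlines()]

root = tempfile.mkdtemp()
write_partitioned([
    {"event_date": "2024-01-01", "value": 1},
    {"event_date": "2024-01-02", "value": 2},
], root)
print(len(read_partition(root, "2024-01-01")))  # 1
```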
Posted 4 days ago
5.0 - 9.0 years
0 Lacs
pune, maharashtra
On-site
You should have end-to-end project implementation and cross-functional stakeholder management experience, with a focus on agile project delivery. As a seasoned business analyst, you will be responsible for requirements analysis, requirements management, and documentation. It is important to have exposure to tools like JIRA and Confluence, as well as hands-on experience with SQL queries. Additionally, exposure to leading vendor products such as Actimize, Fircosoft, etc. would be beneficial. Experience in Data Science, Analytics, AI/ML, Gen AI, Data Management, Data Architectures, Data Governance, platforms, and applications is a plus.

Virtusa is a company that values teamwork, quality of life, and professional and personal development. With a global team of 27,000 people, we are dedicated to supporting your growth and providing exciting projects, opportunities, and the chance to work with cutting-edge technologies throughout your career with us. At Virtusa, we believe in the power of great minds and great potential coming together. We foster a collaborative team environment and strive to offer a dynamic space where new ideas can flourish and excellence can be achieved.
Posted 1 week ago
9.0 - 12.0 years
5 - 8 Lacs
Hyderabad, Telangana, India
On-site
Job description
- Broad expertise in FinCrime Monitoring, AML, KYC, Sanctions Screening, Payment Screening, Fraud, etc.
- Proven risk & regulatory experience in financial services gained through management consulting, banking, or other relevant industry practitioner or regulatory roles
- End-to-end project implementation and cross-functional stakeholder management experience, including agile project delivery
- Seasoned business analyst with experience in requirements analysis, requirements management, and documentation; exposure to tools like JIRA and Confluence; hands-on with SQL queries
- Bachelor's degree from a reputable institute; Master's degree preferably in a quantitative field (Business, Data Science, Statistics, Computer Science, etc.)
- Comfortable with ideation, solution design, and development of thought leadership materials and documents to support practice development efforts
- Exposure to leading vendor products like Actimize, Fircosoft, etc. is a plus
- Experience in Data Science, Analytics, AI/ML, Gen AI, Data Management, Data Architectures, Data Governance, platforms, and applications is a plus
- Exposure to consultative sales, business development, pre-sales, RFPs
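The hands-on SQL work in an AML context might look like the following sketch, using sqlite3 and an entirely hypothetical `transactions` schema. A real rule in a platform like Actimize would be far richer; this only shows the flavor of a threshold-aggregation query:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE transactions (
    txn_id TEXT, account TEXT, amount REAL, txn_date TEXT)""")
conn.executemany("INSERT INTO transactions VALUES (?, ?, ?, ?)", [
    ("t1", "acct-1", 9500.0, "2024-03-01"),
    ("t2", "acct-1", 9800.0, "2024-03-01"),
    ("t3", "acct-2", 1200.0, "2024-03-01"),
])

# Flag accounts whose same-day total crosses a reporting threshold --
# a simplified stand-in for a structuring ("smurfing") detection rule,
# where individual transactions stay just under the limit.
flagged = conn.execute("""
    SELECT account, txn_date, SUM(amount) AS total
    FROM transactions
    GROUP BY account, txn_date
    HAVING total > 10000
""").fetchall()
print(flagged)  # [('acct-1', '2024-03-01', 19300.0)]
```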
Posted 1 week ago
12.0 - 16.0 years
0 Lacs
haryana
On-site
You will be joining a renowned global digital engineering firm as a Senior Solution Architect, reporting to the Director Consulting. Your key responsibility will be to craft innovative solutions for both new and existing clients, with a primary focus on leveraging data to fuel the architecture and strategy of Digital Experience Platforms (DXP). Your expertise will guide the development of solutions heavily anchored in CMS, CDP, CRM, loyalty, and analytics-intensive platforms, integrating ML and AI capabilities. The essence of your approach will be centered around leveraging data to create composable, insightful, and effective DXP solutions.

In this role, you will be client-facing, sitting face to face with prospective customers to shape technical and commercially viable solutions. You will also lead by example as a mentor, challenge others to push their boundaries, and strive to improve your skillset in the ever-evolving landscape of Omnichannel solutions. Collaboration with cross-functional teams will be a key aspect of your daily work, as you strategize, problem-solve, and communicate effectively with internal and external team members. Your mastery of written language will allow you to deliver compelling technical proposals to both new and existing clients.

Your day-to-day responsibilities will include discussing technical solutions with clients, contributing to digital transformation strategies, collaborating with various teams to shape solutions based on client needs, constructing technical architectures, articulating transitions from current to future states, sharing knowledge and thought leadership within the organization, participating in discovery of technical project requirements, and estimating project delivery efforts based on your recommendations.
The ideal candidate for this position will possess 12+ years of experience in design, development, and support of large-scale web applications, along with specific experience in cloud-native technologies, data architectures, customer-facing applications, client-facing technology consulting roles, and commerce platforms. A Bachelor's degree in a relevant field is required.

In addition to fulfilling work, Material offers a high-impact work environment with a strong company culture and benefits. As a global company working with best-of-class brands worldwide, Material values inclusion, interconnectedness, and amplifying impact through people, perspectives, and expertise. The company focuses on learning and making an impact, creating experiences that matter, new value, and making a difference in people's lives. Material offers professional development, mentorship, a hybrid work mode, health and family insurance, leaves, wellness programs, and counseling sessions.
Posted 1 week ago
10.0 - 14.0 years
0 Lacs
ahmedabad, gujarat
On-site
As a Solution Architect at Kanerika, you will collaborate with our sales, presales, and COE teams to provide technical expertise and support throughout the new business acquisition process. Your role will involve understanding customer requirements, presenting our solutions, and demonstrating the value of our products. In this high-pressure environment, maintaining a positive outlook and making strategic choices for career growth are essential. Your excellent communication skills, both written and verbal, will enable you to convey complex technical concepts clearly and effectively. Being a team player, customer-focused, self-motivated, and responsible individual who can work under pressure with a positive attitude is crucial for success in this role.

Experience in managing and handling RFPs/RFIs, client demos and presentations, and converting opportunities into winning bids is required. Having a strong work ethic, positive attitude, and enthusiasm to embrace new challenges are key qualities. You should be able to multitask, prioritize, and demonstrate good time management skills, as well as work independently with minimal supervision. A process-oriented and methodical approach with a quality-first mindset will be beneficial. The ability to convert a client's business challenges and priorities into winning proposals through excellence in technical solutions will be the key performance indicator for this role.

Your responsibilities will include developing high-level architecture designs for scalable, secure, and robust solutions, selecting appropriate technologies, frameworks, and platforms for business needs, and designing cloud-native, hybrid, or on-premises solutions using AWS, Azure, or GCP. You will also ensure seamless integration between various enterprise applications, APIs, and third-party services, as well as design and develop scalable, secure, and performant data architectures on Microsoft Azure and/or new generation analytics platforms.
To excel in this role, you should have at least 10 years of experience working in data analytics and AI technologies from consulting, implementation, and design perspectives. Certifications in data engineering, analytics, cloud, and AI will be advantageous. A Bachelor's in engineering/technology or an MCA from a reputed college is a must, along with prior experience working as a solution architect during the presales cycle. Soft skills such as communication, presentation, flexibility, and being hard-working are essential. Additionally, having knowledge of presales processes and a basic understanding of business analytics and AI will benefit you in this role at Kanerika.

Join us at Kanerika and become part of a vibrant and diverse community where your talents are recognized, your growth is nurtured, and your contributions make a real impact. See the benefits section below for the perks you'll get while working for Kanerika.
Posted 1 week ago
4.0 - 8.0 years
0 Lacs
ahmedabad, gujarat
On-site
DXFactor is a US-based tech company working with customers globally. We are a certified Great Place to Work and currently seeking candidates for the role of Data Engineer with 4 to 6 years of experience. Our presence spans the US and India, specifically Ahmedabad. As a Data Engineer at DXFactor, you will be expected to specialize in Snowflake, AWS, and Python.

Key Responsibilities:
- Design, develop, and maintain scalable data pipelines for both batch and streaming workflows.
- Implement robust ETL/ELT processes to extract data from diverse sources and load it into data warehouses.
- Build and optimize database schemas following best practices in normalization and indexing.
- Create and update documentation for data flows, pipelines, and processes.
- Collaborate with cross-functional teams to translate business requirements into technical solutions.
- Monitor and troubleshoot data pipelines to ensure optimal performance.
- Implement data quality checks and validation processes.
- Develop and manage CI/CD workflows for data engineering projects.
- Stay updated with emerging technologies and suggest enhancements to existing systems.

Requirements:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Minimum of 4 years of experience in data engineering roles.
- Proficiency in Python programming and SQL query writing.
- Hands-on experience with relational databases (e.g., PostgreSQL, MySQL) and NoSQL databases (e.g., MongoDB, Cassandra).
- Familiarity with data warehousing technologies such as Snowflake, Redshift, and BigQuery.
- Demonstrated ability in constructing efficient and scalable data pipelines.
- Practical knowledge of batch and streaming data processing methods.
- Experience in implementing data validation, quality checks, and error handling mechanisms.
- Work experience with cloud platforms, particularly AWS (S3, EMR, Glue, Lambda, Redshift) and/or Azure (Data Factory, Databricks, HDInsight).
- Understanding of various data architectures including data lakes, data warehouses, and data mesh.
- Proven ability to debug complex data flows and optimize underperforming pipelines.
- Strong documentation skills and effective communication of technical concepts.
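The incremental batch pipelines described above typically track a high-watermark so that reruns load only new data and stay idempotent. A minimal sketch, assuming an integer `id` watermark and a JSON state file; both are illustrative choices, not DXFactor's actual design:

```python
import json
import os
import tempfile

def incremental_extract(source_rows, state_path):
    """Pull only rows newer than the stored high-watermark, then advance it."""
    watermark = 0
    if os.path.exists(state_path):
        with open(state_path) as f:
            watermark = json.load(f)["last_id"]
    new_rows = [r for r in source_rows if r["id"] > watermark]
    if new_rows:
        # Persist the new watermark so the next run skips these rows.
        with open(state_path, "w") as f:
            json.dump({"last_id": max(r["id"] for r in new_rows)}, f)
    return new_rows

state = os.path.join(tempfile.mkdtemp(), "state.json")
source = [{"id": 1}, {"id": 2}]
print(len(incremental_extract(source, state)))  # 2 -- first run loads everything
source.append({"id": 3})
print(len(incremental_extract(source, state)))  # 1 -- second run loads only id=3
```

In Snowflake the same idea appears as Streams, or as a `WHERE updated_at > :watermark` predicate on the extract query.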
Posted 2 weeks ago
10.0 - 14.0 years
0 Lacs
karnataka
On-site
Credit Saison India, established in 2019, is a rapidly growing Non-Bank Financial Company (NBFC) lender in India. The company operates in wholesale, direct lending, and tech-enabled partnerships with NBFCs and fintechs. With a tech-enabled model and underwriting capability, Credit Saison India facilitates lending at scale to meet India's credit needs, particularly in underserved populations. Committed to long-term growth as a lender in India, Credit Saison India focuses on providing financial solutions to MSMEs, households, and individuals. The company is registered with the Reserve Bank of India (RBI) and holds an AAA rating from CRISIL and CARE Ratings. With 45 physical offices, 1.2 million active loans, an AUM exceeding US$1.5B, and around 1,000 employees, Credit Saison India is dedicated to evolving its offerings to create a positive impact.

As part of Saison International, a global financial company, Credit Saison India aims to bring together people, partners, and technology to develop innovative financial solutions. Saison International operates across various countries, including Singapore, India, Indonesia, Thailand, Vietnam, Mexico, and Brazil, to enable opportunities and fulfill the dreams of individuals.

Position: Product Head for Data, Analytics, and Platform - Director/Sr. Director

Role Overview: The Product Head for Data and Analytics will lead the strategic development and execution of data platform initiatives at Credit Saison India. The role involves defining product strategy, setting OKRs, and ensuring successful program execution to drive the product roadmap. Collaboration with technical teams to integrate advanced analytics, machine learning, and AI capabilities into scalable data platforms is essential.

Key Responsibilities:
- Define and drive the strategic direction for data platforms, aligning technical initiatives with business goals.
- Set clear OKRs for data platform development and ensure alignment across all teams.
- Establish standards for planning, project management, execution, and documentation throughout the product lifecycle.
- Lead the design and implementation of scalable data pipelines, data lakes, and real-time analytics architectures.
- Collaborate with Data Science, Analytics, Platform, and AI teams to integrate machine learning models, predictive analytics, and AI technologies.
- Maintain high standards for project management, ensuring initiatives are delivered on time, within scope, and on budget.
- Implement data governance and security protocols to ensure compliance with relevant regulations.
- Develop key performance metrics to assess the success of data products and drive continuous improvement.
- Mentor and lead product managers and technical teams, fostering a culture of ownership, innovation, and excellence.

Qualifications:
- BTech and/or MBA from reputed colleges (e.g., IITs, NITs, ISBs, IIMs, or equivalent).
- 10+ years of experience in product management focusing on data platforms and advanced analytics.
- Proficiency in cloud technologies (e.g., AWS, Azure) and data frameworks (Hadoop, Spark, Kafka).
- Strong experience with data architectures, data governance, and real-time data pipelines.
- Knowledge of integrating AI/ML models into data products and driving BI initiatives.
- Program management expertise, leadership, communication, and interpersonal skills to engage stakeholders effectively.
Posted 2 weeks ago
12.0 - 18.0 years
0 Lacs
noida, uttar pradesh
On-site
We are looking for an experienced Manager - Data Engineering with a strong background in Databricks or the Apache data stack to lead the implementation of complex data platforms. In this role, you will be responsible for overseeing impactful data engineering projects for global clients, delivering scalable solutions, and steering digital transformation initiatives.

With 12-18 years of overall experience in data engineering, including 3-5 years in a leadership position, you will need hands-on expertise in either Databricks or the core Apache stack (Spark, Kafka, Hive, Airflow, NiFi, etc.). Proficiency in at least one cloud platform such as AWS, Azure, or GCP, ideally with Databricks on the cloud, is required. Strong programming skills in Python, Scala, and SQL are essential, along with experience in constructing scalable data architectures, delta lakehouses, and distributed data processing. Familiarity with modern data governance, cataloging, and data observability tools is also necessary. You should have a proven track record of managing delivery in an onshore-offshore or hybrid model, coupled with exceptional communication, stakeholder management, and team mentoring abilities.

As a Manager - Data Engineering, your key responsibilities will include leading the design, development, and deployment of modern data platforms utilizing Databricks, Apache Spark, Kafka, Delta Lake, and other big data tools. You will be tasked with designing and implementing data pipelines (both batch and real-time), data lakehouses, and large-scale ETL frameworks. Furthermore, you will take ownership of delivery accountability for data engineering programs across various industries, collaborating with global stakeholders, product owners, architects, and business teams to drive data-driven outcomes. Ensuring best practices in DevOps, CI/CD, infrastructure-as-code, data security, and governance will be crucial.
Additionally, you will be responsible for managing and mentoring a team of 10-25 engineers, conducting performance reviews, capability building, and coaching, as well as supporting presales activities including solutioning, technical proposals, and client workshops.

At GlobalLogic, we prioritize a culture of caring where people come first. We offer continuous learning and development opportunities to help you grow personally and professionally. You'll have the chance to work on interesting and meaningful projects that have a real impact. With various career areas, roles, and work arrangements, we believe in providing a balance between work and life. As a high-trust organization, integrity is at the core of everything we do.

GlobalLogic, a Hitachi Group Company, is a trusted digital engineering partner known for creating innovative digital products and experiences. Join us in collaborating with forward-thinking companies to transform businesses and redefine industries through intelligent products, platforms, and services.
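The Delta Lake pipelines this role builds lean heavily on MERGE (upsert) semantics: update rows that match on a key, insert the rest. A toy in-memory sketch of that behavior using plain dicts keyed by a hypothetical `id` column; it illustrates the semantics only, not Databricks' actual implementation:

```python
def merge_upsert(target, updates, key="id"):
    """Delta-style MERGE, simplified: update matching rows, insert the rest."""
    by_key = {row[key]: dict(row) for row in target}
    for row in updates:
        # setdefault inserts an empty row for new keys; update applies changes.
        by_key.setdefault(row[key], {}).update(row)
    return sorted(by_key.values(), key=lambda r: r[key])

target = [{"id": 1, "status": "open"}, {"id": 2, "status": "open"}]
updates = [{"id": 2, "status": "closed"}, {"id": 3, "status": "open"}]
merged = merge_upsert(target, updates)
print(merged)
# [{'id': 1, 'status': 'open'}, {'id': 2, 'status': 'closed'}, {'id': 3, 'status': 'open'}]
```

In Delta Lake the equivalent is `MERGE INTO target USING updates ON target.id = updates.id WHEN MATCHED THEN UPDATE ... WHEN NOT MATCHED THEN INSERT ...`, executed transactionally across distributed files.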
Posted 2 weeks ago
10.0 - 14.0 years
0 Lacs
karnataka
On-site
About Credit Saison India

Established in 2019, Credit Saison India (CS India) is one of the country's fastest-growing Non-Bank Financial Company (NBFC) lenders. With verticals in wholesale, direct lending, and tech-enabled partnerships with NBFCs and fintechs, CS India's tech-enabled model, coupled with underwriting capability, facilitates lending at scale, addressing India's significant credit gap, especially within underserved and underpenetrated population segments. Committed to long-term growth as a lender in India, CS India aims to evolve its offerings for MSMEs, households, individuals, and more. Registered with the Reserve Bank of India (RBI) and boasting an AAA rating from CRISIL and CARE Ratings, CS India currently operates through a branch network of 45 physical offices, servicing 1.2 million active loans, managing an AUM of over US$1.5B, and employing around 1,000 individuals.

As part of Saison International, a global financial company with a mission to foster resilient and innovative financial solutions for positive impact by bringing people, partners, and technology together, CS India aligns with Saison International's commitment to being a transformative partner in creating opportunities and enabling people's dreams. With a presence in Singapore and operations across various countries, including India, Indonesia, Thailand, Vietnam, Mexico, and Brazil, Saison International boasts a diverse workforce of over 1,000 employees.

Product Head for Data, Analytics, and Platform - Director/Sr. Director

Role Overview: As the Product Head for Data and Analytics, you will spearhead the strategic development and execution of data platform initiatives, ensuring alignment with business objectives and delivering measurable results. Your responsibilities include defining product strategy, setting OKRs, and overseeing the successful implementation of programs that drive the product roadmap.
Collaborating cross-functionally with technical teams, you will integrate advanced analytics, machine learning, and AI capabilities into scalable data platforms.

Key Responsibilities:
- Define and drive the strategic direction for data platforms, ensuring alignment with business goals.
- Set clear OKRs for data platform development and ensure alignment across all teams.
- Establish and enforce standards for planning, project management, execution, and documentation.
- Lead the design and implementation of scalable data pipelines, data lakes, and real-time analytics architectures.
- Collaborate with Data Science, Analytics, Platform, and AI teams to integrate machine learning models, predictive analytics, and AI technologies seamlessly.
- Maintain high standards for project management, ensuring timely delivery within budget, using Agile methodologies.
- Ensure comprehensive documentation of technical processes, product specifications, and architectural decisions.
- Implement data governance and security protocols to ensure compliance with data protection regulations.
- Develop key performance metrics to assess the success of data products and drive continuous improvement.
- Mentor and lead product managers and technical teams to foster a culture of ownership, innovation, and excellence.

Qualifications:
- BTech and/or MBA from reputed colleges (e.g., IITs, NITs, ISBs, IIMs, or equivalent).
- 10+ years of product management experience with a focus on building data platforms and integrating advanced analytics.
- Proven track record in setting and executing strategic roadmaps, OKRs, and ensuring business alignment.
- Deep understanding of cloud technologies (e.g., AWS, Azure) and data frameworks (Hadoop, Spark, Kafka).
- Strong experience in data architectures, data governance, and real-time data pipelines.
- Proficiency in integrating AI/ML models into data products and driving BI initiatives.
- Program management expertise with the ability to lead cross-functional teams and deliver complex projects.
- Knowledge of data governance frameworks and experience in ensuring regulatory compliance.
- Excellent leadership, communication, and interpersonal skills for engaging and aligning stakeholders.
Posted 2 weeks ago
10.0 - 14.0 years
0 Lacs
thiruvananthapuram, kerala
On-site
You are an experienced Solution Architect with a solid background in software architecture and a good understanding of AI-based products and platforms. Your main responsibility will be to design robust, scalable, and secure architectures that support AI-driven applications and enterprise systems. In this role, you will collaborate closely with cross-functional teams, including data scientists, product managers, and engineering leads, to ensure the alignment of business needs, technical feasibility, and AI capabilities.

Your key responsibilities will include architecting end-to-end solutions for enterprise and product-driven platforms, such as data pipelines, APIs, AI model integration, cloud infrastructure, and user interfaces. You will guide teams in selecting appropriate technologies, tools, and design patterns for building scalable systems. Additionally, you will work with AI/ML teams to understand model requirements and facilitate their smooth deployment and integration into production environments.

In this role, you will define system architecture diagrams, data flow, service orchestration, and infrastructure provisioning using modern tools. Furthermore, you will collaborate with stakeholders to translate business requirements into technical solutions, emphasizing scalability, performance, and security. Your leadership will be crucial in promoting best practices for software development, DevOps, and cloud-native architecture. You will also conduct architecture reviews to ensure compliance with security, performance, and regulatory standards.

To be successful in this role, you should have at least 10 years of experience in software architecture or solution design roles. You should demonstrate expertise in designing systems using microservices, RESTful APIs, event-driven architecture, and cloud-native technologies. Hands-on experience with major cloud providers like AWS, GCP, or Azure is essential.
Familiarity with AI/ML platforms and components, data architectures, containerization, DevOps principles, and the ability to lead technical discussions are also required skills.

Preferred qualifications include exposure to AI model lifecycle management, infrastructure-as-code tools like Terraform or Pulumi, knowledge of GraphQL, gRPC, or serverless architectures, and previous experience in AI-driven product companies or digital transformation programs.

In return, you will have the opportunity to play a high-impact role in designing intelligent systems that drive the future of AI adoption. You will work alongside forward-thinking engineers, researchers, and innovators, with a strong focus on career growth, learning, and technical leadership. The compensation offered is competitive and reflective of the value you bring to the role.
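The event-driven architecture expertise this role asks for comes down to decoupling producers from consumers through a broker. A minimal in-process sketch follows; the topic name and handlers are hypothetical, and a production system would use Kafka or a managed cloud event bus rather than an in-memory dict:

```python
from collections import defaultdict

class EventBus:
    """Minimal in-process event bus illustrating event-driven decoupling."""
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        self._subscribers[topic].append(handler)

    def publish(self, topic, event):
        # Producers don't know their consumers; the bus routes events.
        for handler in self._subscribers[topic]:
            handler(event)

audit_log = []
bus = EventBus()
bus.subscribe("model.deployed", lambda e: audit_log.append(e["model"]))
bus.publish("model.deployed", {"model": "churn-v2"})
print(audit_log)  # ['churn-v2']
```

The design payoff is that new consumers (monitoring, billing, an audit trail) subscribe without the publisher changing at all.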
Posted 2 weeks ago
12.0 - 16.0 years
0 Lacs
haryana
On-site
As the Senior Solution Architect reporting to the Director Consulting, you play a crucial role in creating innovative solutions for both new and existing clients. Your main focus will be on utilizing data to drive the architecture and strategy of Digital Experience Platforms (DXP). You will be pivotal in shaping the technology and software architecture, particularly emphasizing how data-driven insights influence the design and execution of client projects. Your expertise will steer the development of solutions deeply rooted in CMS, CDP, CRM, loyalty, and analytics-intensive platforms, integrating ML and AI capabilities. The core of our approach revolves around harnessing data to craft composable, insightful, and efficient DXP solutions.

You are a client-facing professional, often engaging directly with potential customers to shape technically and commercially viable solutions. As a mentor, you lead by example and encourage others to expand their horizons. A self-starter in your field, you are keen on enhancing your skill set in the ever-evolving Omnichannel solutions landscape. Collaborative by nature, you thrive on working with various cross-functional teams on a daily basis. An effective communicator, you possess the ability to capture attention, strategize, and troubleshoot on the fly with both internal and external team members. Your mastery of written language allows you to deliver compelling technical proposals to both new and existing clients.

In your role, you will be responsible for discussing technical solutions with current and potential clients, as well as internal teams, to introduce innovative ideas for creating functional and appealing digital environments. You will contribute to our clients' digital transformation strategies based on industry best practices and act as a subject matter expert in business development activities.
Furthermore, you will collaborate closely with product vendor partners, client partners, strategists, and delivery engagement leaders to tailor solutions to meet clients' explicit and implicit needs. Your tasks will involve architecting technical solutions that meet clients' requirements, selecting and evaluating technology frameworks, and addressing complex business problems with comprehensive assessments. You will also be responsible for articulating the transition from the current to future state, breaking down intricate business and technical strategies into manageable requirements for teams to execute. Your role will also involve conceptualizing and sharing knowledge and thought leadership within the organization, researching and presenting new technology trends, participating in project requirement discovery, scoping, and providing estimations for phased program delivery. The ideal candidate for the Solutions Architect position will have 12+ years of experience in designing, developing, and supporting large-scale web applications, along with expertise in cloud-native capabilities of AWS, GCP, or Azure. They should also possess hands-on experience in data architectures, storage solutions, data processing workflows, CDPs, data lakes, analytics solutions, and customer-facing applications. Additionally, experience in client-facing technology consulting, E-Commerce platforms, and knowledge of digital marketing trends and best practices are desired. A bachelor's degree in a relevant field is required. The statements within this job description outline the essential functions of this role, the necessary level of knowledge and skills, and the extent of responsibility. It should not be viewed as an exhaustive list of job requirements. Individuals may be assigned other duties as needed to cover absences, balance organizational workload, or work in different functional areas. 
Material is a global company known for partnering with top brands worldwide and launching innovative products. We value inclusion, collaboration, and expertise in our work, with a commitment to understanding human behavior and applying a scientific approach. We offer a learning-focused community dedicated to creating impactful experiences and making a difference in people's lives. Additionally, we provide professional development, a hybrid work mode, health and family insurance, ample leave allowances, and wellness programs, ensuring a supportive and enriching work environment.
Posted 2 weeks ago
8.0 - 12.0 years
0 Lacs
pune, maharashtra
On-site
As the Director/Head of Data Engineering for India, you will be responsible for developing and maintaining the data strategy for the Singapore implementation. Your primary goal will be to create a model implementation that can be replicated across the wider PBWM organisation for compliance in other jurisdictions. You will define and execute the data engineering strategy in alignment with business goals and technology roadmaps. Collaborating with the Chief Data Officer/Chief Operating Officer, you will understand the Critical Data Elements (CDE) and establish controls around them. Your role will involve designing data models, building efficient data pipelines, ensuring data quality and integrity, collaborating with data science and analytics teams, and scaling data solutions. Additionally, you will oversee data security and compliance, continuously learn and implement the latest technologies, manage and train the data engineering team, and implement cloud migration for data with appropriate hydration. Budgeting, resource allocation, implementing data products, ensuring data reconciliation, and upholding high standards and quality in data are also key aspects of this role. In this strategic and senior leadership position, you will oversee data strategy, data engineering, data infrastructure, and data management practices within Private Banking and Wealth Management. Your responsibilities will include managing and developing the data team, delivering outstanding customer-focused service, ensuring quality and quantity are equally prioritized, adhering to policies and procedures, and advocating Barclays values and principles. You will lead effective data management, compliance, and analytics to support business goals, enhance customer experiences, and improve operational efficiencies. Recruiting, training, and developing the data engineering team, fostering collaboration and innovation, providing strategic guidance, and defining KPIs aligned with PBWM goals will be part of your duties. 
Collaborating with executive leadership, you will ensure data initiatives support the bank's growth, profitability, and risk management. You will oversee budgeting for data-related initiatives, allocate resources efficiently, and track performance indicators for the data engineering team and infrastructure to drive continuous improvement. The purpose of your role is to build and maintain systems that collect, store, process, and analyze data to ensure accuracy, accessibility, and security. Your accountabilities will include building and maintaining data architectures and pipelines, designing and implementing data warehouses and data lakes, developing processing and analysis algorithms, and collaborating with data scientists to deploy machine learning models. As a Director, you are expected to manage a business function, contribute to strategic initiatives, provide expert advice, manage resourcing and budgeting, ensure compliance, and monitor external environments. Demonstrating leadership behaviours such as listening, inspiring, aligning, and developing others, along with upholding Barclays Values and Mindset, will be key to excelling in this role.
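The data reconciliation responsibility above can be sketched as a source-versus-target key comparison; the row shape and the `trade_id` key below are hypothetical, and a real control would compare warehouse extracts rather than in-memory lists:

```python
from collections import Counter

def reconcile(source_rows, target_rows, key):
    """Compare a source extract against a target load and report drift."""
    src = Counter(row[key] for row in source_rows)
    tgt = Counter(row[key] for row in target_rows)
    return {
        "missing": dict(src - tgt),      # produced by source, absent (or short) in target
        "unexpected": dict(tgt - src),   # loaded into target, never seen in source
        "matched": sum((src & tgt).values()),
    }

source = [{"trade_id": "T1"}, {"trade_id": "T2"}, {"trade_id": "T2"}]
target = [{"trade_id": "T1"}, {"trade_id": "T2"}, {"trade_id": "T9"}]
report = reconcile(source, target, "trade_id")
```

A break report of this shape is typically what triggers the corrective-action and data-quality workflows the role describes.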
Posted 3 weeks ago
7.0 - 9.0 years
14 - 18 Lacs
Pune
Hybrid
The SQL+Power BI Lead is responsible for designing, developing, and maintaining complex data solutions using SQL and Power BI. They serve as a technical lead, guiding the team in implementing best practices and efficient data architectures. The SQL+Power BI Lead plays a key role in translating business requirements into effective data and reporting solutions. Design and develop advanced SQL queries, stored procedures, and other database objects to support data extraction, transformation, and loading. Create dynamic, interactive Power BI dashboards and reports to visualize data and provide insights. Provide technical leadership and mentorship to junior team members on SQL and Power BI best practices. Collaborate with business stakeholders to understand requirements and translate them into data solutions. Optimize database performance and implement security measures to ensure data integrity. Automate data integration, extraction, and reporting processes where possible. Participate in data architecture planning and decision-making. Troubleshoot and resolve complex data-related issues. Stay up-to-date with the latest trends, technologies, and best practices in data analytics.
Posted 1 month ago
6.0 - 9.0 years
14 - 18 Lacs
Hyderabad
Work from Office
The ideal candidate will have a strong background in IT Services & Consulting, with expertise in Oracle Data Science. Roles and Responsibilities Design and implement data science solutions using Oracle technologies. Collaborate with cross-functional teams to identify business problems and develop data-driven solutions. Develop and maintain large-scale data systems and architectures. Work closely with stakeholders to understand requirements and deliver high-quality results. Stay up-to-date with industry trends and emerging technologies. Lead the development of data science projects from concept to delivery. Job Requirements Strong knowledge of Oracle Data Science and related technologies. Experience working with large datasets and developing predictive models. Excellent communication and collaboration skills. Ability to work in a fast-paced environment and meet deadlines. Strong problem-solving skills and attention to detail. Bachelor's degree in Computer Science or a related field.
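As a toy illustration of the predictive-modelling requirement, the simplest predictive model is an ordinary least-squares line fit, shown here in plain Python; Oracle's data science tooling would of course handle this at scale with far richer models:

```python
def fit_line(xs, ys):
    """Fit y = slope * x + intercept by ordinary least squares."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    return slope, mean_y - slope * mean_x

# On perfectly linear data the fit recovers the exact line y = 2x.
slope, intercept = fit_line([1, 2, 3], [2, 4, 6])
```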
Posted 1 month ago
1.0 - 6.0 years
3 - 8 Lacs
Hyderabad
Work from Office
What you will do In this vital role you will join a collaborative team implementing and supporting the next generation of safety platforms and supporting technologies. In this role, you will analyze and resolve issues with adverse event data and file transmissions across integrated systems, leveraging data analytics to identify trends, optimize workflows, and prevent future incidents. Collaborating closely with various teams, you will develop insights and implement solutions to improve system performance, ensuring reliable and efficient data flow critical to safety operations. Roles & Responsibilities: Monitor, triage, and resolve issues related to adverse event data processing across multiple systems. Conduct detailed investigations into system disruptions, data anomalies, or processing delays and implement corrective and preventive measures. Work closely with internal teams, external vendors, and business partners to address dependencies and resolve bottlenecks for critical issues. Design and maintain dashboards, reports, and analytics to monitor system performance and identify trends or areas for improvement. Present findings and recommendations to leadership, ensuring data-driven decision-making and clear transparency into system operations. Identify inefficiencies and propose data-driven solutions to optimize workflows and enhance reliability. Collaborate on the development of test plans and scenarios to ensure robust validation of system updates, patches, and new features. Perform regression testing to verify that changes do not negatively impact existing system functionality. Support the creation and implementation of automated testing frameworks to improve efficiency and consistency. Support compliance with Key Control Indicators (KCI) and contribute to overall process governance. What we expect of you We are all different, yet we all use our unique contributions to serve patients. The [vital attribute] professional we seek is a [type of person] with these qualifications. 
About the role: As a Sr Associate IS Analyst, you will join a collaborative team implementing and supporting the next generation of safety platforms and supporting technologies. 
Basic Qualifications and Experience: Master's degree and 1 to 3 years of experience in Computer Science, IT, or a related field; OR Bachelor's degree and 3 to 5 years of experience in Computer Science, IT, or a related field; OR Diploma and 7 to 9 years of experience in Computer Science, IT, or a related field. Functional Skills: Must-Have Skills: Demonstrated expertise in monitoring, troubleshooting, and resolving data and system issues. Proficiency in data analytics, with experience in dashboarding and reporting tools such as Tableau or Power BI. Familiarity with database technologies and querying tools, including SQL (Oracle SQL, PL/SQL preferred). Understanding of API integrations and middleware platforms (e.g., MuleSoft). Experience with testing methodologies, tools, and automation practices. Experienced in Agile methodology. Good-to-Have Skills: Experience with API integrations such as MuleSoft. Solid understanding of one or more general-purpose programming languages, including but not limited to Java or Python. Experience with cloud-based technologies and modern data architectures. 
Outstanding written and verbal communication skills, and the ability to explain technical concepts to non-technical clients. Sharp learning agility, problem solving, and analytical thinking. Experienced in GxP systems and implementing GxP projects. Extensive expertise in SDLC, including requirements, design, testing, data analysis, and change control. Experience with Signal platforms is a plus. Professional Certifications: SAFe for Teams certification (preferred). Soft Skills: Excellent analytical and troubleshooting skills. Excellent leadership and strategic thinking abilities. Strong verbal and written communication skills. Ability to work effectively with global, virtual teams. High degree of initiative and self-motivation. Ability to manage multiple priorities successfully. Team-oriented, with a focus on achieving team goals. Ability to deal with ambiguity and think on their feet. Shift Information: This position requires you to work a later shift and may be assigned a second or third shift schedule. Candidates must be willing and able to work during evening or night shifts, as required based on business requirements.
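The trend-monitoring duties described above often reduce to flagging days whose transmission volume deviates sharply from recent history; a minimal sketch, assuming daily file counts and an invented z-score threshold:

```python
from statistics import mean, stdev

def flag_anomalies(daily_counts, window=5, threshold=3.0):
    """Flag indices whose value deviates from the trailing window by > threshold sigmas."""
    alerts = []
    for i in range(window, len(daily_counts)):
        history = daily_counts[i - window:i]
        mu, sigma = mean(history), stdev(history)
        if sigma and abs(daily_counts[i] - mu) > threshold * sigma:
            alerts.append(i)
    return alerts

# Day 6 carries a sudden spike in adverse-event file volume.
counts = [100, 102, 98, 101, 99, 100, 340, 101]
```

In practice the flagged indices would feed the dashboards and alerting described above, not just a return value.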
Posted 1 month ago
3.0 - 5.0 years
5 - 7 Lacs
Bengaluru
Work from Office
Educational Requirements MCA, MSc, Bachelor of Engineering, BBA, BCom, BSc. Service Line Data & Analytics Unit. Responsibilities Spark Expertise: Expert proficiency in Spark. Ability to design and implement efficient data processing workflows. Experience with Spark SQL and DataFrames. Good exposure to Big Data architectures and a good understanding of the Big Data ecosystem. Experience with framework building on Hadoop. Good database knowledge with SQL tuning experience. Good to have experience with Python, APIs, and exposure to Kafka. Additional Responsibilities: Ability to work with clients to identify business challenges and contribute to client deliverables by refining, analyzing, and structuring relevant data. Awareness of the latest technologies and trends. Logical thinking and problem-solving skills, along with an ability to collaborate. Ability to assess current processes, identify improvement areas, and suggest technology solutions. Knowledge of one or two industry domains. Technical and Professional Requirements: Primary skills: Technology->Big Data - Data Processing->Spark Preferred Skills: Technology->Big Data - Data Processing->Spark
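The grouped aggregation at the heart of many Spark SQL workflows (e.g. `df.groupBy("country").agg(...)`) can be sketched in plain Python, since Spark itself is not assumed to be installed here; Spark distributes exactly this per-key combine step across executors:

```python
from collections import defaultdict

def aggregate(rows):
    """Per-key sum/count, the combine a Spark groupBy-agg performs per partition."""
    totals = defaultdict(lambda: [0.0, 0])   # country -> [revenue_sum, order_count]
    for row in rows:
        acc = totals[row["country"]]
        acc[0] += row["revenue"]
        acc[1] += 1
    return {k: {"revenue": s, "orders": n} for k, (s, n) in totals.items()}

rows = [{"country": "IN", "revenue": 10.0},
        {"country": "IN", "revenue": 5.0},
        {"country": "SG", "revenue": 7.5}]
result = aggregate(rows)
```

The column names and row shape are invented; the point is the shuffle-then-combine pattern a DataFrame aggregation expresses declaratively.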
Posted 1 month ago
5.0 - 7.0 years
0 Lacs
Bengaluru / Bangalore, Karnataka, India
On-site
Introduction A career in IBM Consulting embraces long-term relationships and close collaboration with clients across the globe. In this role, you will work for IBM BPO, part of Consulting, which accelerates digital transformation using agile methodologies, process mining, and AI-powered workflows. You'll work with visionaries across multiple industries to improve the hybrid cloud and AI journey for the most innovative and valuable companies in the world. Your ability to accelerate impact and make meaningful change for your clients is enabled by our strategic partner ecosystem and our robust technology platforms across the IBM portfolio, including IBM Software and Red Hat. Curiosity and a constant quest for knowledge serve as the foundation to success in IBM Consulting. In your role, you'll be supported by mentors and coaches who will encourage you to challenge the norm, investigate ideas outside of your role, and come up with creative solutions resulting in groundbreaking impact for a wide network of clients. Our culture of evolution and empathy centers on long-term career growth and learning opportunities in an environment that embraces your unique skills and experience. Your role and responsibilities Develop, test, and support future-ready data solutions for customers across industry verticals. Develop, test, and support end-to-end batch and near real-time data flows/pipelines. Demonstrate understanding of data architectures, modern data platforms, big data, analytics, cloud platforms, data governance, information management, and associated technologies. Communicate risks and ensure understanding of these risks. Graduate with a minimum of 5+ years of related experience required. Experience in modelling and business system designs. Good hands-on experience with DataStage and cloud-based ETL services. Strong expertise in writing T-SQL code. Well-versed with data warehouse schemas and OLAP techniques. 
Required education Bachelor's Degree Preferred education Master's Degree Required technical and professional expertise Ability to manage and make decisions about competing priorities and resources. Ability to delegate where appropriate. Must be a strong team player/leader. Ability to lead data transformation projects with multiple junior data engineers. Strong oral, written, and interpersonal skills for interacting throughout all levels of the organization. Ability to communicate complex business problems and technical solutions.
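Near real-time flows of the kind this role supports are commonly implemented as micro-batches; a deliberately time-free sketch of the batching step (real pipelines also flush on a timer, omitted here to keep the example deterministic):

```python
def micro_batches(events, batch_size=3):
    """Group a stream of events into fixed-size micro-batches, flushing any remainder."""
    batch, batches = [], []
    for event in events:
        batch.append(event)
        if len(batch) >= batch_size:
            batches.append(batch)
            batch = []
    if batch:                # a partial final batch still gets delivered
        batches.append(batch)
    return batches

batches = micro_batches(list(range(7)))
```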
Posted 2 months ago
2.0 - 4.0 years
0 Lacs
Bengaluru / Bangalore, Karnataka, India
On-site
Introduction A career in IBM Consulting embraces long-term relationships and close collaboration with clients across the globe. In this role, you will work for IBM BPO, part of Consulting, which accelerates digital transformation using agile methodologies, process mining, and AI-powered workflows. You'll work with visionaries across multiple industries to improve the hybrid cloud and AI journey for the most innovative and valuable companies in the world. Your ability to accelerate impact and make meaningful change for your clients is enabled by our strategic partner ecosystem and our robust technology platforms across the IBM portfolio, including IBM Software and Red Hat. Curiosity and a constant quest for knowledge serve as the foundation to success in IBM Consulting. In your role, you'll be supported by mentors and coaches who will encourage you to challenge the norm, investigate ideas outside of your role, and come up with creative solutions resulting in groundbreaking impact for a wide network of clients. Our culture of evolution and empathy centers on long-term career growth and learning opportunities in an environment that embraces your unique skills and experience. Your role and responsibilities Develop, test, and support future-ready data solutions for customers across industry verticals. Develop, test, and support end-to-end batch and near real-time data flows/pipelines. Demonstrate understanding of data architectures, modern data platforms, big data, analytics, cloud platforms, data governance, information management, and associated technologies. Communicate risks and ensure understanding of these risks. 
Required education Bachelor's Degree Preferred education Master's Degree Required technical and professional expertise Minimum of 2+ years of related experience required. Experience in modeling and business system designs. Good hands-on experience with DataStage and cloud-based ETL services. Strong expertise in writing T-SQL code. Well-versed with data warehouse schemas and OLAP techniques. Preferred technical and professional experience Ability to manage and make decisions about competing priorities and resources. Ability to delegate where appropriate. Must be a strong team player/leader. Ability to lead data transformation projects with multiple junior data engineers. Strong oral, written, and interpersonal skills for interacting throughout all levels of the organization. Ability to clearly communicate complex business problems and technical solutions.
Posted 2 months ago
3.0 - 8.0 years
10 - 18 Lacs
Faridabad
Work from Office
Design and implement scalable data architectures to optimize data flow and analytics capabilities. Develop ETL pipelines, data warehouses, and real-time data processing systems. Must have expertise in SQL, Python, and cloud data platforms like AWS Redshift or Google BigQuery. Work closely with data scientists to enhance machine learning models with structured and unstructured data. Prior experience in handling large-scale datasets is preferred.
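The ETL pipeline described reduces to three composable steps; a minimal sketch using only the standard library, with an invented `orders` feed standing in for a real source (a production pipeline would load into Redshift or BigQuery rather than SQLite):

```python
import csv
import io
import sqlite3

RAW = """order_id,amount
1,19.99
2,5.00
3,
"""

def extract(text):
    """Extract: parse the raw feed into dict rows."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform: cast types and drop rows with missing amounts."""
    return [(int(r["order_id"]), float(r["amount"]))
            for r in rows if r["amount"].strip()]

def load(rows, conn):
    """Load: write the cleaned rows into the target table."""
    conn.execute("CREATE TABLE orders (order_id INTEGER, amount REAL)")
    conn.executemany("INSERT INTO orders VALUES (?, ?)", rows)

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW)), conn)
total = conn.execute("SELECT SUM(amount) FROM orders").fetchone()[0]
```

Keeping the three stages as separate functions is what makes the pipeline testable and re-runnable stage by stage.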
Posted 2 months ago
3.0 - 8.0 years
10 - 18 Lacs
Vadodara
Work from Office
Design and implement scalable data architectures to optimize data flow and analytics capabilities. Develop ETL pipelines, data warehouses, and real-time data processing systems. Must have expertise in SQL, Python, and cloud data platforms like AWS Redshift or Google BigQuery. Work closely with data scientists to enhance machine learning models with structured and unstructured data. Prior experience in handling large-scale datasets is preferred.
Posted 2 months ago
10.0 - 14.0 years
10 - 16 Lacs
Pune
Work from Office
Role Overview: The Senior Tech Lead - GCP Data Engineering leads the design, development, and optimization of advanced data solutions. The jobholder has extensive experience with GCP services, data architecture, and team leadership, with a proven ability to deliver scalable and secure data systems. Responsibilities: Lead the design and implementation of GCP-based data architectures and pipelines. Architect and optimize data solutions using GCP services such as BigQuery, Dataflow, Pub/Sub, and Cloud Storage. Provide technical leadership and mentorship to a team of data engineers. Collaborate with stakeholders to define project requirements and ensure alignment with business goals. Ensure best practices in data security, governance, and compliance. Troubleshoot and resolve complex technical issues in GCP data environments. Stay updated on the latest GCP technologies and industry trends. Key Technical Skills & Responsibilities Overall 10+ years of experience with GCP and data warehousing concepts, including coding, reviewing, testing, and debugging. Experience as an architect on GCP implementation and/or migration data projects. Must have an understanding of data lakes and data lake architectures, and best practices for storing, loading, and retrieving data from data lakes. Experience developing and maintaining pipelines on the GCP platform, with an understanding of best practices for bringing on-prem data to the cloud: file loading, compression, parallelization of loads, optimization, etc. Working knowledge of and/or experience with Google Data Studio, Looker, and other visualization tools. Working knowledge of Hadoop and Python/Java would be an added advantage. Experience in designing and planning BI solutions, debugging, monitoring and troubleshooting BI solutions, creating and deploying reports, and writing relational and multidimensional database queries. Any experience in a NoSQL environment is a plus. Must be good with Python and PySpark for data pipeline building. 
Must have experience working with streaming data sources and Kafka. GCP Services: Cloud Storage, BigQuery, Bigtable, Cloud Spanner, Cloud SQL, Datastore/Firestore, Dataflow, Dataproc, Data Fusion, Dataprep, Pub/Sub, Data Studio, Looker, Data Catalog, Cloud Composer, Cloud Scheduler, Cloud Functions. Eligibility Criteria: Bachelor's degree in Computer Science, Data Engineering, or a related field. Extensive experience with GCP data services and tools. GCP certification (e.g., Professional Data Engineer, Professional Cloud Architect). Experience with machine learning and AI integration in GCP environments. Strong understanding of data modeling, ETL/ELT processes, and cloud integration. Proven leadership experience in managing technical teams. Excellent problem-solving and communication skills.
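A core idea behind the streaming side of this role is assigning events to fixed event-time windows, as a Dataflow/Beam pipeline does; the essence sketched in plain Python with invented (timestamp, value) events, where Beam would add triggers and watermark handling:

```python
def fixed_windows(events, window_secs=60):
    """Assign (timestamp, value) events to fixed event-time windows keyed by window start."""
    windows = {}
    for ts, value in events:
        start = ts - (ts % window_secs)   # floor the timestamp to its window boundary
        windows.setdefault(start, []).append(value)
    return windows

events = [(5, "a"), (30, "b"), (65, "c"), (130, "d")]
windows = fixed_windows(events)
```

Grouping by event time rather than arrival time is what lets such pipelines produce correct aggregates from out-of-order Kafka or Pub/Sub streams.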
Posted 2 months ago
3.0 - 8.0 years
10 - 18 Lacs
Ludhiana
Work from Office
Design and implement scalable data architectures to optimize data flow and analytics capabilities. Develop ETL pipelines, data warehouses, and real-time data processing systems. Must have expertise in SQL, Python, and cloud data platforms like AWS Redshift or Google BigQuery. Work closely with data scientists to enhance machine learning models with structured and unstructured data. Prior experience in handling large-scale datasets is preferred.
Posted 2 months ago
3.0 - 8.0 years
10 - 18 Lacs
Coimbatore
Work from Office
Design and implement scalable data architectures to optimize data flow and analytics capabilities. Develop ETL pipelines, data warehouses, and real-time data processing systems. Must have expertise in SQL, Python, and cloud data platforms like AWS Redshift or Google BigQuery. Work closely with data scientists to enhance machine learning models with structured and unstructured data. Prior experience in handling large-scale datasets is preferred.
Posted 2 months ago