4.0 - 8.0 years
6 - 10 Lacs
Indore, Pune
Work from Office
Overall 10-18 years of Data Engineering experience, with a minimum of 4+ years of hands-on experience in Databricks. Ready to travel onsite and work at client locations. Proven hands-on experience as a Databricks Architect or similar role, with a deep understanding of the Databricks platform and its capabilities.

Responsibilities: Analyze business requirements and translate them into technical specifications for data pipelines, data lakes, and analytical processes on the Databricks platform. Design and architect end-to-end data solutions, including data ingestion, storage, transformation, and presentation layers, to meet business needs and performance requirements. Lead the setup, configuration, and optimization of Databricks clusters, workspaces, and jobs to ensure the platform operates efficiently and meets performance benchmarks. Manage access controls and security configurations to ensure data privacy and compliance. Design and implement data integration processes, ETL workflows, and data pipelines to extract, transform, and load data from various sources into the Databricks platform. Optimize ETL processes to achieve high data quality and reduce latency. Monitor and optimize query performance and overall platform performance to ensure efficient execution of analytical queries and data processing jobs. Identify and resolve performance bottlenecks in the Databricks environment. Establish and enforce best practices, standards, and guidelines for Databricks development, ensuring data quality, consistency, and maintainability. Implement data governance and data lineage processes to ensure data accuracy and traceability. Mentor and train team members on Databricks best practices, features, and capabilities. Conduct knowledge-sharing sessions and workshops to foster a data-driven culture within the organization. Will be responsible for Databricks Practice technical and partnership initiatives. Build skills in technical areas that support the deployment and integration of Databricks-based solutions to complete customer projects.

Requirements: Bachelor's or Master's degree in Computer Science, Information Technology, or a related field. In-depth hands-on implementation knowledge of Databricks: Delta Lake and Delta tables (managing Delta tables), Databricks cluster configuration, and cluster policies. Experience handling structured and unstructured datasets. Strong proficiency in programming languages like Python, Scala, or SQL. Experience with cloud platforms like AWS, Azure, or Google Cloud, and an understanding of cloud-based data storage and computing services. Familiarity with big data technologies like Apache Spark, Hadoop, and data lake architectures. Ability to develop and maintain data pipelines, ETL workflows, and analytical processes on the Databricks platform. Should have good experience in data engineering on Databricks, covering both batch processing and streaming. Should have good experience creating workflows and scheduling pipelines. Should have good exposure to making packages or libraries available in Databricks. Familiarity with Databricks default runtimes. Databricks Certified Data Engineer Associate/Professional certification (desirable). Should have experience working in Agile methodology. Strong verbal and written communication skills. Strong analytical and problem-solving skills with high attention to detail.
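To make the ETL responsibility above concrete, here is a minimal PySpark sketch of a batch ingest-transform-load step into a Delta table. The storage paths, column names, and table name are hypothetical placeholders, not details from the posting.

```python
from pyspark.sql import SparkSession, functions as F

# On Databricks a SparkSession is provided as `spark`; building one here
# keeps the sketch self-contained for other environments with Delta installed.
spark = SparkSession.builder.appName("orders-ingest").getOrCreate()

# Ingest: read raw CSV files from cloud storage (hypothetical mount path).
raw = (spark.read
       .option("header", "true")
       .option("inferSchema", "true")
       .csv("/mnt/raw/orders/"))

# Transform: basic cleansing and standardization.
clean = (raw
         .dropDuplicates(["order_id"])
         .filter(F.col("order_id").isNotNull())
         .withColumn("order_ts", F.to_timestamp("order_ts"))
         .withColumn("order_date", F.to_date("order_ts"))
         .withColumn("amount", F.col("amount").cast("double")))

# Load: write to a Delta table, partitioned for downstream query performance.
(clean.write
      .format("delta")
      .mode("overwrite")
      .partitionBy("order_date")
      .saveAsTable("analytics.orders"))
```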
Posted 3 weeks ago
4.0 - 8.0 years
6 - 10 Lacs
Hyderabad
Work from Office
Some careers shine brighter than others. If you're looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further. HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions. We are currently seeking an experienced professional to join our team in the role of Associate Director.

In this role, you will: Design, develop, and optimize data pipelines using Azure Databricks, PySpark, and Prophecy. Implement and maintain ETL/ELT pipelines using Azure Data Factory (ADF) and Apache Airflow for orchestration. Develop and optimize complex SQL queries and Python-based data transformation logic. Work with version control systems (GitHub, Azure DevOps) to manage code and deployment processes. Automate deployment of data pipelines using CI/CD practices in Azure DevOps. Ensure data quality, security, and compliance with best practices. Monitor and troubleshoot performance issues in data pipelines. Collaborate with cross-functional teams to define data requirements and strategies.

Requirements: To be successful in this role, you should meet the following requirements: 12+ years of experience in data engineering, working with Azure Databricks, PySpark, and SQL. Hands-on experience with Prophecy for data pipeline development. Proficiency in Python for data processing and transformation. Experience with Apache Airflow for workflow orchestration. Strong expertise in Azure Data Factory (ADF) for building and managing ETL processes. Familiarity with GitHub and Azure DevOps for version control and CI/CD automation. Solid understanding of data modelling, warehousing, and performance optimization. Ability to work in an agile environment and manage multiple priorities effectively. Excellent problem-solving skills and attention to detail. Experience with Delta Lake and Lakehouse architecture. Hands-on experience with Terraform or Infrastructure as Code (IaC). Understanding of machine learning workflows in a data engineering context.
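As a sketch of the Airflow orchestration described above, the DAG below chains two Databricks jobs. It assumes Airflow 2.4+ with the apache-airflow-providers-databricks package installed; the DAG id, connection name, and job ids are hypothetical.

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.databricks.operators.databricks import DatabricksRunNowOperator

with DAG(
    dag_id="daily_transactions_pipeline",  # hypothetical pipeline name
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Trigger a pre-defined Databricks job that lands raw data.
    ingest = DatabricksRunNowOperator(
        task_id="run_ingest_job",
        databricks_conn_id="databricks_default",
        job_id=101,  # hypothetical Databricks job id
    )
    # Trigger the downstream transformation job once ingestion succeeds.
    transform = DatabricksRunNowOperator(
        task_id="run_transform_job",
        databricks_conn_id="databricks_default",
        job_id=102,  # hypothetical Databricks job id
    )
    ingest >> transform  # ingest must finish before transform starts
```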
Posted 3 weeks ago
5.0 - 8.0 years
7 - 10 Lacs
Pune
Work from Office
Design, develop, and maintain data pipelines and ETL/ELT processes using PySpark, Databricks, BigQuery, Airflow, and Composer. Optimize performance for large datasets through techniques such as partitioning, indexing, and Spark optimization. Collaborate with cross-functional teams to resolve technical issues and gather requirements.

Your Key Responsibilities: Ensure data quality and integrity through data validation and cleansing processes. Analyze existing SQL queries, functions, and stored procedures for performance improvements. Develop database routines such as procedures, functions, and views/materialized views. Participate in data migration projects and understand technologies like Delta Lake, data warehouses, and BigQuery. Debug and solve complex problems in data pipelines and processes.

Your skills and experience that will help you excel: Bachelor's degree in Computer Science, Engineering, or a related field. Strong understanding of distributed data processing platforms like Databricks and BigQuery. Proficiency in Python, PySpark, and SQL programming languages. Experience with performance optimization for large datasets. Strong debugging and problem-solving skills. Fundamental knowledge of cloud services, preferably Azure or GCP. Excellent communication and teamwork skills.

Nice to Have: Experience in data migration projects. Understanding of technologies like Delta Lake and data warehouses.
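A minimal sketch of the partitioning and Spark-optimization techniques this posting mentions; the source path, column names, and shuffle-partition count are hypothetical examples.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("perf-tuning").getOrCreate()

events = spark.read.parquet("/data/events/")  # hypothetical source

# Prune early: select only needed columns and filter before any shuffle,
# so Spark reads and moves less data.
slim = (events
        .select("event_date", "user_id", "amount")
        .filter(F.col("event_date") >= "2025-01-01"))

# Control shuffle parallelism for a large aggregation.
spark.conf.set("spark.sql.shuffle.partitions", "400")
daily = slim.groupBy("event_date").agg(F.sum("amount").alias("total"))

# Partition output on a low-cardinality column so downstream queries
# that filter on event_date can skip irrelevant files.
(daily.repartition("event_date")
      .write.mode("overwrite")
      .partitionBy("event_date")
      .parquet("/data/daily_totals/"))
```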
Posted 3 weeks ago
5.0 - 9.0 years
7 - 11 Lacs
Jaipur
Work from Office
Hydro Global Business Services (GBS) is an organizational area that operates as an internal service provider for the Hydro group. Its ultimate purpose is to deliver relevant IT, financial and HR business services to all business areas within the company.

Role purpose: As a Senior BI Engineer, you will play a key role in enabling data-driven decision-making by designing, developing, and maintaining robust BI solutions. You'll be part of a team currently working with SQL Server and MSBI tools, Snowflake Data Cloud, Azure Data Pipelines, and Power BI semantic models.

You will: Collaborate with business stakeholders from Finance, Sales, Marketing, and Operations to gather BI requirements. Develop and optimize reports, dashboards, and visualizations that deliver actionable insights. Build and maintain ETL/ELT processes, supporting the transition from legacy systems to modern cloud platforms. Work closely with data architects, engineers, and data scientists to ensure data quality, consistency, and availability. Translate complex data into simple visual stories that support strategic decision-making. Contribute to the design of a scalable and maintainable BI architecture. Support data governance and documentation practices to ensure transparency and reproducibility. Stay current with BI trends and help evaluate new tools or technologies that align with our future data platform vision.

Responsibilities: Design, develop, and maintain BI solutions using the Snowflake Data Cloud and related technologies. Gather and analyze business requirements in collaboration with departmental stakeholders to translate them into scalable BI solutions. Support solution architects by designing efficient data models and reusable data products tailored to business needs. Interpret business data to uncover trends, patterns, and actionable insights; present findings through effective storytelling and visualizations. Collaborate with Data Engineers to ensure robust data pipelines, data warehousing solutions, and efficient ETL/ELT processes. Continuously evaluate and adopt Snowflake BI features, staying updated with emerging trends and best practices. Work closely with cross-functional IT teams to ensure the reliability, performance, and security of BI infrastructure. Maintain compliance with Hydro's Quality System, HSE (Health, Safety & Environment) regulations, policies, and standard operating procedures. Fulfill additional responsibilities assigned by leadership that support the smooth operation of the BI unit and are within legal and policy frameworks. Ensure adherence to area-specific customer requirements and data governance standards.

Work experience: 7+ years of hands-on experience in Business Intelligence development, with a strong focus on data visualization and dashboarding using Power BI. Demonstrated business analyst mindset, with the ability to interpret and translate business requirements into actionable data models and products. Proven experience working with SQL-based databases for data querying, transformation, and reporting. Experience with Snowflake or other cloud-based data warehouse platforms is highly desirable. Exposure to or prior experience in the manufacturing domain is considered a strong advantage.

Education and specific skills: Bachelor's degree in Computer Science, Information Systems, Business Administration, or a related technical field. Fluency in English (spoken and written) is mandatory for collaboration across global teams.
Strong analytical and problem-solving skills with a detail-oriented mindset. Excellent communication and interpersonal skills to interact with both technical and non-technical stakeholders. Ability to work independently and as part of a cross-functional team in a dynamic, evolving environment.

Technical skills: Strong expertise in SQL, ETL processes, and data modelling. Hands-on experience with designing and developing Tabular models, ideally using Power BI semantic models. Proficiency with data warehousing and integration technologies, such as Snowflake, Azure Data Factory, or AWS. Familiarity with BI solution architectures, including best practices for performance optimization and scalability. Experience in migrating from traditional BI tools (e.g., SSAS cubes) to modern cloud-based platforms like Snowflake and Power BI.

Leadership and analytical competence: Proven ability to lead BI initiatives, set development standards, and guide teams through complex project lifecycles. Demonstrated strength in solving analytical problems by leveraging both domain knowledge and technical acumen. Ability to translate business questions into data-driven insights and actionable recommendations.

Soft skills: Excellent communication skills with the ability to present technical topics to non-technical audiences. Strong interpersonal skills for effective collaboration with cross-functional teams. Self-motivated and proactive, capable of working independently and managing multiple priorities. Adaptable, with a continuous learning mindset in evolving technology landscapes. Commitment to quality, attention to detail, and adherence to deadlines under pressure.

What we offer you: Working at the world's only fully integrated aluminum and leading renewable energy company. Diverse, global teams. Flexible work environment/home office. We provide you the freedom to be creative and to learn from experts. Possibility to grow with the company and gain new certificates. Attractive benefit package.

Please apply by uploading your CV and optionally a cover letter. Only applications received through our online system will be considered, not via e-mail. Recruiter: Lima Mathew, Sr. HR Advisor, People Resourcing.

A job where you make a difference. A key part of succeeding in this mission involves encouraging a collegial environment where our differences are acknowledged as our greatest competitive advantage. Your diverse perspective makes us stronger. Our global diversity, inclusion and belonging program enables us to cultivate a high-performing and inclusive workplace where everyone feels valued. Your career journey is unique. We strive to provide you with the support needed to achieve your full potential. With our global reach, inclusive culture, and cutting-edge technology you'll have the opportunity to build a career that aligns with your strengths and passions. Join our global community of over 30,000 people with a presence in 40 countries, united by the values of Care, Courage and Collaboration. At Hydro, you have the chance to make a difference in the industries that matter. Posted on: Jun 3, 2025
Posted 3 weeks ago
5.0 - 10.0 years
7 - 12 Lacs
Bengaluru
Work from Office
Description: What makes us Qlik? A Gartner Magic Quadrant Leader for 15 years in a row, Qlik transforms complex data landscapes into actionable insights, driving strategic business outcomes. Serving over 40,000 global customers, our portfolio leverages pervasive data quality and advanced AI/ML capabilities that lead to better decisions, faster. We excel in integration and governance solutions that work with diverse data sources, and our real-time analytics uncover hidden patterns, empowering teams to address complex challenges and seize new opportunities.

The Senior Salesforce Developer Role: Step into a Senior Salesforce Developer seat where your blend of technical depth and business curiosity drives every decision. You'll own end-to-end enhancements: scoping needs with Customer Success, Support, and Professional Services, then translating them into elegant, scalable solutions. In a culture that prizes experimentation and autonomy, you'll have the freedom (and expectation) to try new ideas, challenge assumptions, and shape how our platform evolves.

What makes this role interesting: Breadth without boredom - One week you're refactoring code to handle soaring data volumes; the next, you're launching a brand-new framework that redefines customer workflows. Creator's playground - Low bureaucracy, high trust. If you can justify it with data and creativity, you're empowered to build it. True cross-functional visibility - Partner directly with analysts, PMs, and business leaders, so your work never disappears into a black box. Mentorship that matters - Grow others while sharpening your own craft through peer reviews and community learning. Culture of experimentation - We sprint toward abstract ideas, test quickly, and celebrate insights: perfect for inquisitive minds who like to solve puzzles.

Here's how you will make an impact: Elevate user experience - Turn complex requirements into intuitive features that make internal teams wonder how they ever lived without them. Future-proof our platform - Apply best-practice patterns so today's solutions scale smoothly with tomorrow's growth. Accelerate decision-making - Surface clean, actionable data that lets leadership act in minutes, not months. Retire technical debt - Refactor legacy code into modern, maintainable modules, boosting speed and reliability across the board. Grow the talent bench - Coach fellow developers, leading code reviews that spread knowledge and raise our collective bar.

We're looking for a teammate with: B.S. in Computer Science, Computer Information Systems, Engineering, or another relevant field of study, or equivalent work experience. 7+ years of enterprise application development experience, of which 5+ years is focused on Salesforce development. Strong knowledge of the Salesforce platform's declarative capabilities across Sales, Service, Experience Cloud and Certinia/FinancialForce. Experience with Salesforce administration, including the use of custom objects, validation rules, Lightning pages and flows. Strong background in Salesforce development including Apex, Flows, Triggers, SOQL/SOSL, Platform Events, Heroku, HTML, CSS, JavaScript, Java, Web Services, Visualforce, Lightning Design Systems, Lightning Web Components, and REST/SOAP Web Services. Mastery of modern development standards including the use of DevOps, VCS, CI/CD pipelines, VS Code, and Apex PMD static code analysis. Familiarity with full-stack tech including Angular, Node.js, Postgres, MySQL, Moodle, PHP, and WordPress is desirable.
Experienced with designing interfaces utilizing Salesforce APIs to integrate with various internal/external systems. Quality engineering principles, including shift-left testing with well-structured test classes and asserts, are considered a necessity rather than merely a requirement. The implementation of record-triggered or screen flows is as interesting to you as developing complex Apex methods and frameworks. Clear communicator with excellent written, verbal, and listening skills. Experience with Agile/Scrum methodologies, actively participating in all aspects of the software development lifecycle. Excellent troubleshooting, problem solving, and root cause analysis skills. Demonstrated ability to meet deadlines, handle and prioritize simultaneous requests, and manage laterally and upwards.

Preferred Certifications: Salesforce Platform App Builder, Salesforce Platform Developer I, Salesforce Platform Developer II, Salesforce Lightning Component Framework Specialist (Trailhead Super Badge), Salesforce Lightning Experience Specialist (Trailhead Super Badge), Certinia/FinancialForce.

The location for this role is: Bangalore, India - Hybrid. Ready to bring big ideas to life and see your impact ripple across every customer touchpoint? Hit Apply and let's build the next chapter together.

More about Qlik and who we are: Find out more about life at Qlik on social: Instagram, LinkedIn, YouTube, and X/Twitter, and to see all other opportunities to join us and our values, check out our Careers Page.

What else do we offer? Genuine career progression pathways and mentoring programs. Culture of innovation, technology, collaboration, and openness. Flexible, diverse, and international work environment. Giving back is a huge part of our culture: alongside an extra "change the world" day plus another for personal development, we also highly encourage participation in our Corporate Responsibility Employee Programs.

If you need assistance applying for a role due to a disability, please submit your request via email to accessibilityta@qlik.com. Any information you provide will be treated according to Qlik's Recruitment Privacy Notice. Qlik may only respond to emails related to accommodation requests. #L1-APAC
Posted 3 weeks ago
6.0 - 10.0 years
8 - 12 Lacs
Bengaluru
Work from Office
HSBC is one of the largest banking and financial services organizations in the world, with operations in 62 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realize their ambitions. We are currently seeking an experienced professional to join our team in the role of Lead Assistant Vice President - Analytics.

Principal responsibilities: Produce analytical / data asset prototypes and work with stakeholders to define requirements, definitions, and objectives. Support the design and development of data assets, including data analysis and transformation logic. Apply advanced analytics to portfolio data, especially in data remediation and improving data quality, completeness, and consistency. Apply business and data design principles, risk management, and technology architecture when interacting with delivery partners. At times deputize for Product Owners in the oversight of the change plan for data requirements in the assigned Agile POD, including prioritization of backlogs. Act as an ambassador in ensuring credit data definitions are appropriately documented and applied. Provide guidance and advice on technology and big data. Collaborate with various stakeholder groups to ensure the availability and quality of data. Provide support for troubleshooting and root cause assessment of data issues. Proactively manage the transition of data assets to production state and ensure effective embedment.

Requirements: You should be a subject matter expert with a strong understanding of Credit and Lending processes, with between 6-10 years of experience. Hands-on experience with advanced programming languages (e.g., Python or PySpark) and data visualization tools (e.g., Tableau or Qlik Sense) is necessary. Your role will require expertise in data management and analytics, and knowledge of large company structures, values, and processes. Proven experience in leading delivery across diverse environments and managing stakeholders is essential. Key skills for this role include effective communication, virtual team management, data analysis, agile methodology, and stakeholder management. A solid understanding of the use of credit models (regulatory, decision, pricing) within the Credit & Lending journey. Experience in managing stakeholders and using Agile tools like JIRA & Confluence is beneficial. Educated to master's standard or equivalent experience (desired). Industry-recognized technical certifications (desired). Risk Management/Finance/Banking knowledge and experience (desired). Evidence of self-development across several disciplines. Proven track record of performance within HSBC.
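The data remediation and quality work described above can be sketched in PySpark. The dataset path, column names, and the balance-versus-limit rule below are hypothetical illustrations, not HSBC's actual data or checks.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("credit-data-quality").getOrCreate()

loans = spark.read.parquet("/data/credit/loans/")  # hypothetical dataset

# Completeness: null ratio per critical column.
critical = ["customer_id", "limit_amount", "outstanding_balance"]
completeness = loans.select([
    (F.count(F.when(F.col(c).isNull(), c)) / F.count(F.lit(1))).alias(c)
    for c in critical
])
completeness.show()

# Consistency: outstanding balance should not exceed the credit limit.
violations = loans.filter(F.col("outstanding_balance") > F.col("limit_amount"))
print("consistency violations:", violations.count())
```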
Posted 3 weeks ago
7.0 - 12.0 years
9 - 14 Lacs
Bengaluru
Work from Office
Description: What makes us Qlik? A Gartner Magic Quadrant Leader for 15 years in a row, Qlik transforms complex data landscapes into actionable insights, driving strategic business outcomes. Serving over 40,000 global customers, our portfolio leverages pervasive data quality and advanced AI/ML capabilities that lead to better decisions, faster. We excel in integration and governance solutions that work with diverse data sources, and our real-time analytics uncover hidden patterns, empowering teams to address complex challenges and seize new opportunities.

The Site Reliability Engineer Role: This is an exciting opportunity to step into a Site Reliability Engineer role that goes beyond traditional boundaries. You'll be at the intersection of development, infrastructure, and analytics: designing, building, and automating solutions that power business-critical decisions. From developing QlikView, Qlik Sense, and Talend applications to managing complex cloud infrastructure across Azure, AWS, and GCP, this role gives you the space to own and evolve a data platform that scales. You'll be a trusted technical partner working with stakeholders across the business, blending hands-on development with strategic impact.

What makes this role interesting? You'll find yourself immersed in a role that brings variety, autonomy, and the chance to flex both your development and infrastructure skills: A full-stack challenge - You'll move fluidly between building high-performance data applications and engineering resilient cloud environments using tools like Terraform, CloudFormation, and Python. Own cloud innovation - Be hands-on with public cloud infrastructure across Azure, AWS, and GCP. You'll not just manage but innovate in deployment, monitoring, and automation. Shape data experiences - Develop dashboards and data flows that give real-time insights to users, driving smarter decisions across the business. Empower the organization - You won't just build; you'll also lead knowledge-sharing sessions, turning users into power-users and amplifying the impact of your work. A collaborative culture - Partner with developers, IT, and business teams who value your voice and expertise in building smarter, faster, more secure solutions.

Here's how you'll be making an impact: Your contributions will help shape how data is accessed, visualized, and used to drive action across the organization. You'll play a critical role in scaling systems, improving performance, and ensuring business continuity: Deliver stability and speed - Your automation skills will reduce manual workloads and improve deployment reliability, making our systems faster and more resilient. Unlock data at scale - By creating high-performing data models and ETL processes, you'll help users harness insights quickly and confidently. Protect and optimize - You'll implement best-in-class security practices while also keeping an eye on cost-effectiveness, particularly in cloud environments. Be the go-to expert - From troubleshooting to training, your technical support will empower teams and keep mission-critical services running smoothly. Drive performance improvements - Monitor, analyze, and fine-tune the entire stack for ongoing optimization; your work will make a visible difference.

We're looking for a teammate with: Bachelor's degree in Computer Science, Information Technology, or a related field.
3+ years of total IT experience. 1+ year of expertise in QlikView and Qlik Sense development, Talend integration, and cloud automation utilizing Terraform, Python, and PowerShell. Proficiency in QlikView and Qlik Sense development and administration. Strong knowledge of Talend ETL processes and data integration. Experience with data modeling, SQL, and server management. Familiarity with security protocols and best practices. Knowledge of scripting languages (e.g., Python, R) is a plus. Cloud administration - Microsoft Azure and Amazon Web Services. Infrastructure as Code - CFT, Python, PowerShell. Excellent problem-solving and analytical skills. Strong communication and interpersonal skills. Ability to work independently and as part of a team. Detail-oriented with a focus on quality and accuracy.

The location for this role is: India - Bangalore. If you're passionate about helping businesses harness the full potential of their data and want to be part of a team that values expertise, innovation, and collaboration, this is your opportunity to make a real difference. Apply today!

More about Qlik and who we are: Find out more about life at Qlik on social: Instagram, LinkedIn, YouTube, and X/Twitter, and to see all other opportunities to join us and our values, check out our Careers Page.

What else do we offer? Genuine career progression pathways and mentoring programs. Culture of innovation, technology, collaboration, and openness. Flexible, diverse, and international work environment. Giving back is a huge part of our culture: alongside an extra "change the world" day plus another for personal development, we also highly encourage participation in our Corporate Responsibility Employee Programs.

If you need assistance applying for a role due to a disability, please submit your request via email to accessibilityta@qlik.com. Any information you provide will be treated according to Qlik's Recruitment Privacy Notice. Qlik may only respond to emails related to accommodation requests. #L1-APAC
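A minimal sketch of the kind of Python cloud automation this SRE role involves, using boto3 against AWS. The tag scheme, region, and stop-idle-dev-instances policy are hypothetical examples, not details from the posting.

```python
import boto3

# Cost-optimization automation: find and stop running EC2 instances
# tagged as dev (hypothetical tag scheme).
ec2 = boto3.client("ec2", region_name="ap-south-1")

response = ec2.describe_instances(
    Filters=[
        {"Name": "tag:env", "Values": ["dev"]},
        {"Name": "instance-state-name", "Values": ["running"]},
    ]
)

instance_ids = [
    inst["InstanceId"]
    for reservation in response["Reservations"]
    for inst in reservation["Instances"]
]

if instance_ids:
    ec2.stop_instances(InstanceIds=instance_ids)
    print(f"Stopped {len(instance_ids)} dev instances")
else:
    print("No running dev instances found")
```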
Posted 3 weeks ago
3.0 - 8.0 years
10 - 14 Lacs
Hyderabad, Chennai
Work from Office
Are you ready to make an impact at DTCC?

Pay and Benefits: Competitive compensation, including base pay and annual incentive. Comprehensive health and life insurance and well-being benefits, based on location. Pension / retirement benefits. Paid time off and personal/family care, and other leaves of absence when needed to support your physical, financial, and overall well-being. DTCC offers a flexible/hybrid model of 3 days onsite and 2 days remote (Tuesdays, Wednesdays, and a day unique to each team or employee).

The impact you will have in this role: The Data Quality and Integration role is a highly technical position considered a technical expert in system implementation, with an emphasis on providing design, ETL, data quality, and warehouse modeling expertise. This role will be accountable for knowledge of capital development efforts. Performs, at an experienced level, the technical design of application components; builds applications and interfaces between applications; and understands data security, retention, and recovery. Can research technologies independently and recommend appropriate solutions. Contributes to technology-specific best practices and standards; contributes to success criteria from design through deployment, including reliability, cost-effectiveness, performance, data integrity, maintainability, reuse, extensibility, usability, and scalability; provides expertise on significant application components, vendor products, program languages, databases, operating systems, etc.; completes the plan by building components, testing, configuring, tuning, and deploying solutions.

The Software Engineer (SE) for Data Quality and Integration applies specific technical knowledge of data quality and data integration in order to assist in the design and construction of critical systems. The SE works as part of an AD project squad and may interact with the business, Functional Architects, and domain experts on related integrating systems. The SE will contribute to the design of components or individual programs and participates fully in construction and testing. This involves working with the Senior Application Architect and other technical contributors at all levels. This position contributes expertise to project teams through all phases, including post-deployment support. This means researching specific technologies and applications, contributing to the solution design, supporting development teams, testing, troubleshooting, and providing production support. The candidate must possess experience in integrating large volumes of data, efficiently and in a timely manner. This position requires working closely with the functional and governance functions, and more senior technical resources, reviewing technical designs and specifications, and contributing to cost estimates and schedules.

What You'll Do: Technology Expertise - is a domain expert on one or more programming languages, vendor products (specifically Informatica Data Quality and Informatica Data Integration Hub), DTCC applications, data structures, and business lines.
Platforms - works with Infrastructure partners to stand up development, testing, and production environments. Elaboration - works with the Functional Architect to ensure designs satisfy functional requirements. Data Modeling - reviews and extends data models. Data Quality Concepts - experience in data profiling, scorecards, monitoring, matching, and cleansing; is aware of frameworks that promote concepts of isolation, extensibility, and extendibility. System Performance - contributes to solutions that satisfy performance requirements; constructs test cases and strategies that account for performance requirements; tunes application performance issues. Security - implements solutions and completes test plans, mentoring other team members in standard practices. Standards - is aware of technology standards and understands that technical solutions need to be consistent with them. Documentation - develops and maintains system documentation. Is familiar with different software development methodologies (Waterfall, Agile, Scrum, Kanban). Aligns risk and control processes into day-to-day responsibilities to monitor and mitigate risk; escalates appropriately. Educational background and work experience that includes mathematics and conversion of expressions into run-time executable code. Ensures own and team's practices support success across all geographic locations. Mitigates risk by following established procedures and monitoring controls, spotting key errors, and demonstrating strong ethical behavior. Helps roll out standards and policies to other team members. Financial industry experience, including trades, clearing, and settlement.

Education: Bachelor's degree or equivalent experience.

Talents Needed for Success: Minimum of 3+ years in Data Quality and Integration. Basic understanding of logical data modeling and database design is a plus. Technical experience with multiple database platforms: Sybase, Oracle, DB2, and distributed databases like Teradata/Greenplum/Redshift/Snowflake containing high volumes of data. Knowledge of data management processes and standard methodologies preferred. Proficiency with Microsoft Office tools required. Supports team in managing client expectations and resolving issues on time. Technical skills highly preferred along with strong analytical skills.
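Informatica Data Quality is a proprietary suite, but the profiling, rule, and scorecard concepts the posting lists can be sketched generically in pandas. The trade records, column names, and rules below are hypothetical illustrations.

```python
import pandas as pd

# Hypothetical trade records to profile.
trades = pd.DataFrame({
    "trade_id": ["T1", "T2", "T2", "T4"],
    "cusip": ["037833100", None, "037833100", "17275R102"],
    "amount": [1000.0, -50.0, 1000.0, 250.0],
})

# Profiling: null rates per column and duplicate keys.
null_rates = trades.isna().mean()
dupes = trades[trades.duplicated("trade_id", keep=False)]

# Simple quality rules, scorecard-style: each yields a pass rate.
rules = {
    "cusip_present": trades["cusip"].notna(),
    "amount_positive": trades["amount"] > 0,
    "trade_id_unique": ~trades.duplicated("trade_id", keep=False),
}
scorecard = {name: float(mask.mean()) for name, mask in rules.items()}

print(null_rates, dupes, scorecard, sep="\n")
```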
Posted 3 weeks ago
1.0 - 5.0 years
4 - 8 Lacs
Hyderabad
Work from Office
About the role: As a Marketing Data Specialist, you will work alongside the Marketing Data Lead in managing and optimizing lead and contact creation and management processes, with a focus on data quality, enrichment, standardization, and supporting ongoing data-related initiatives.

What you'll do: Assist with data enrichment, acquisition, standardization, and segmentation efforts. Aid in the management of lead and contact creation and routing workflows, ensuring data flows smoothly to and from CRM (Salesforce) and other systems using tools like LeanData, RingLead, Demandbase Data Integrity, SFDC Data Loader, Cloudingo, and more. Ensure lead creation and conversion processes across Sales and Marketing are optimized and adhere to internal and external data health and compliance requirements. Support the Marketing Data Lead in implementing data governance policies and practices. Oversee intake, perform related actions, and ensure positive outcomes of lead generation programs, with the goal of automating processes where appropriate. Collaborate with marketing, sales, and customer success teams to support data-driven initiatives and reporting.

What you'll bring: 5+ years of experience in data management, marketing operations, sales operations, or customer success operations. Familiarity with CRM systems (e.g., Salesforce) and data management tools like LeanData, RingLead, Demandbase Data Integrity, SFDC Data Loader, Cloudingo, and more. A high level of autonomy in day-to-day operations. Strong attention to detail with an understanding of global data governance principles and best practices. Ability to work independently and collaboratively with a global team. Analytical mindset, adept at problem finding and problem solving. Able to work flexible hours as required by business priorities.

Stay up to date on everything Blackbaud; follow us on LinkedIn, X, Instagram, Facebook and YouTube. Blackbaud is a digital-first company which embraces a flexible remote or hybrid work culture. Blackbaud supports hiring and career development for all roles from the location you are in today! Blackbaud is proud to be an equal opportunity employer and is committed to maintaining an inclusive work environment. All qualified applicants will receive consideration for employment without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, physical or mental disability, age, or veteran status, or any other basis protected by federal, state, or local law.
Posted 3 weeks ago
5.0 - 10.0 years
20 - 25 Lacs
Hyderabad
Work from Office
Role: Informatica Customer 360 Cloud MDM Expert
Client: Global Pharmaceutical Leader
Engagement: 6-month contract
Mode: Remote
Skills: 5+ years in Data Management, with 3+ years in Informatica Customer 360 Cloud, MDM, and Reference 360. Led end-to-end MDM implementations and data modeling. Strong in data governance, quality rules, and dashboards. Hands-on with API/batch integrations and performance tuning. Skilled in troubleshooting, documentation, and stakeholder communication.
Posted 3 weeks ago
3.0 - 8.0 years
6 - 10 Lacs
Bengaluru
Work from Office
Project Role: Tech Delivery Subject Matter Expert. Project Role Description: Drive innovative practices into delivery, bring depth of expertise to a delivery engagement. Sought out as experts, enhance Accenture's marketplace reputation. Bring emerging ideas to life by shaping Accenture and client strategy. Use deep technical expertise, business acumen and fluid communication skills; work directly with a client in a trusted advisor relationship to gather requirements to analyze, design and/or implement technology best practice business changes.

Must have skills: SAP HCM On Premise ABAP. Good to have skills: SAP Data Migration. Minimum 3 year(s) of experience is required. Educational Qualification: 15 years full time education.

Summary: As a Tech Delivery Subject Matter Expert, you will drive innovative practices into delivery and bring depth of expertise to a delivery engagement. Sought out as an expert, you will enhance the organization's marketplace reputation and bring emerging ideas to life by shaping organization and client strategy. Use deep technical expertise, business acumen and fluid communication skills, working directly with a client in a trusted advisor relationship to gather requirements to analyze, design and/or implement technology best practice business changes.

Roles & Responsibilities: Expected to perform independently and become an SME. Required active participation/contribution in team discussions. Contribute in providing solutions to work-related problems. Lead and mentor junior team members. Collaborate with cross-functional teams to drive project success.

Professional & Technical Skills: Must-have skills: Proficiency in SAP HCM On Premise ABAP. Strong understanding of SAP Data Migration. Experience in ABAP development for SAP HCM modules. Knowledge of SAP HR processes and configurations. Hands-on experience in SAP HCM data modeling.

Additional Information: The candidate should have a minimum of 3 years of experience in SAP HCM On Premise ABAP. This position is based at our Bengaluru office. A 15 years full time education is required.
Posted 3 weeks ago
15.0 - 20.0 years
6 - 10 Lacs
Bengaluru
Work from Office
Project Role: Tech Delivery Subject Matter Expert. Project Role Description: Drive innovative practices into delivery, bring depth of expertise to a delivery engagement. Sought out as experts, enhance Accenture's marketplace reputation. Bring emerging ideas to life by shaping Accenture and client strategy. Use deep technical expertise, business acumen and fluid communication skills; work directly with a client in a trusted advisor relationship to gather requirements to analyze, design and/or implement technology best practice business changes.

Must have skills: SAP HCM On Premise ABAP. Good to have skills: SAP Data Migration. Minimum 12 year(s) of experience is required. Educational Qualification: 15 years full time education.

Summary: As a Tech Delivery Subject Matter Expert, you will drive innovative practices into delivery and bring depth of expertise to a delivery engagement. Sought out as an expert, you will enhance the organization's marketplace reputation and bring emerging ideas to life by shaping organization and client strategy. Use deep technical expertise, business acumen and fluid communication skills, working directly with a client in a trusted advisor relationship to gather requirements to analyze, design and/or implement technology best practice business changes.

Roles & Responsibilities: Expected to be an SME. Collaborate and manage the team to perform. Responsible for team decisions. Engage with multiple teams and contribute on key decisions. Expected to provide solutions to problems that apply across multiple teams. Lead the team in implementing SAP HCM On Premise ABAP solutions. Provide technical guidance and expertise to team members. Collaborate with stakeholders to understand business requirements and translate them into technical solutions.

Professional & Technical Skills: Must-have skills: Proficiency in SAP HCM On Premise ABAP. Strong understanding of SAP Data Migration. Experience in ABAP development for SAP HCM modules. Knowledge of SAP HR processes and configurations. Ability to troubleshoot and resolve technical issues in SAP HCM modules.

Additional Information: The candidate should have a minimum of 12 years of experience in SAP HCM On Premise ABAP. This position is based at our Bengaluru office. A 15 years full time education is required.
Posted 3 weeks ago
1.0 - 9.0 years
25 - 30 Lacs
Hyderabad
Work from Office
Job summary: Amazon.com's Buyer Risk Prevention (BRP) mission is to make Amazon the safest and most trusted place worldwide to transact online. Amazon runs one of the most dynamic e-commerce marketplaces in the world, with nearly 2 million sellers worldwide selling hundreds of millions of items in ten countries. BRP safeguards every financial transaction across all Amazon sites. As such, BRP designs and builds the software systems, risk models, and operational processes that minimize risk and maximize trust in Amazon.com.

The BRP organization is looking for a data scientist for its Risk Mining Analytics (RMA) team, whose mission is to combine advanced analytics with investigator insight to detect negative customer experiences, improve system effectiveness, and prevent bad debt across Amazon. As a data scientist in risk mining, you will be responsible for modeling complex problems, discovering insights, and building risk algorithms that identify opportunities through statistical models, machine learning, and visualization techniques to improve operational efficiency and reduce bad debt. You will need to collaborate effectively with business and product leaders within BRP and cross-functional teams to build scalable solutions against high organizational standards. The candidate should be able to apply a breadth of tools, data sources, and data science techniques to answer a wide range of high-impact business questions and proactively present new insights in a concise and effective manner. The candidate should be an effective communicator capable of independently driving issues to resolution and communicating insights to non-technical audiences. This is a high-impact role with goals that directly impact the bottom line of the business.

Responsibilities: Analyze terabytes of data to define and deliver on complex analytical deep dives to unlock insights and build scalable solutions through data science to ensure the security of Amazon's platform and transactions. Build machine learning and/or statistical models that evaluate transaction legitimacy and track impact over time. Ensure data quality throughout all stages of acquisition and processing, including data sourcing/collection, ground truth generation, normalization, transformation, and cross-lingual alignment/mapping. Define and conduct experiments to validate/reject hypotheses, and communicate insights and recommendations to Product and Tech teams. Develop efficient data querying infrastructure for both offline and online use cases. Collaborate with cross-functional teams from multidisciplinary science, engineering and business backgrounds to enhance current automation processes. Learn and understand a broad range of Amazon's data resources and know when, how, and which to use and which not to use. Maintain technical documents and communicate results to diverse audiences with effective writing, visualizations, and presentations.

Requirements:
- 3+ years of experience with data querying languages (e.g., SQL), scripting languages (e.g., Python), or statistical/mathematical software (e.g., R, SAS, Matlab).
- 2+ years of data scientist experience.
- 3+ years of experience with machine learning/statistical modeling data analysis tools and techniques, and the parameters that affect their performance.
- Experience applying theoretical models in an applied environment.
- Experience in Python, Perl, or another scripting language.
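To illustrate the transaction-legitimacy modeling this posting describes, here is a minimal scikit-learn sketch. The features, labels, and synthetic data are hypothetical stand-ins; they are not Amazon's actual signals or models.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)

# Synthetic transaction features: amount, account age (days), order velocity.
n = 10_000
X = np.column_stack([
    rng.lognormal(3, 1, n),       # transaction amount
    rng.integers(0, 3650, n),     # account age in days
    rng.poisson(2, n),            # orders in the last 24h
])

# Hypothetical label: riskier when the account is new and velocity is high,
# with a little label noise to mimic imperfect ground truth.
y = ((X[:, 1] < 30) & (X[:, 2] > 3)).astype(int)
noise = rng.random(n) < 0.05
y = np.where(noise, 1 - y, y)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=0
)

# Fit a gradient-boosted classifier and score held-out transactions.
model = GradientBoostingClassifier().fit(X_train, y_train)
scores = model.predict_proba(X_test)[:, 1]
print("AUC:", roc_auc_score(y_test, scores))
```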
Posted 3 weeks ago
1.0 - 9.0 years
25 - 30 Lacs
Hyderabad
Work from Office
Amazon.com's Buyer Risk Prevention (BRP) mission is to make Amazon the safest and most trusted place worldwide to transact online. Amazon runs one of the most dynamic e-commerce marketplaces in the world, with nearly 2 million sellers worldwide selling hundreds of millions of items in ten countries. BRP safeguards every financial transaction across all Amazon sites. As such, BRP designs and builds the software systems, risk models, and operational processes that minimize risk and maximize trust in Amazon.com.

The BRP organization is looking for a data scientist for its Risk Mining Analytics (RMA) team, whose mission is to combine advanced analytics with investigator insight to detect negative customer experiences, improve system effectiveness, and prevent bad debt across Amazon. As a data scientist in risk mining, you will be responsible for modeling complex problems, discovering insights, and building risk algorithms that identify opportunities through statistical models, machine learning, and visualization techniques to improve operational efficiency and reduce bad debt. You will need to collaborate effectively with business and product leaders within BRP and cross-functional teams to build scalable solutions against high organizational standards. The candidate should be able to apply a breadth of tools, data sources, and data science techniques to answer a wide range of high-impact business questions and proactively present new insights in a concise and effective manner. The candidate should be an effective communicator capable of independently driving issues to resolution and communicating insights to non-technical audiences. This is a high-impact role with goals that directly impact the bottom line of the business.

Responsibilities: Analyze terabytes of data to define and deliver on complex analytical deep dives to unlock insights and build scalable solutions through data science to ensure the security of Amazon's platform and transactions. Build machine learning and/or statistical models that evaluate transaction legitimacy and track impact over time. Ensure data quality throughout all stages of acquisition and processing, including data sourcing/collection, ground truth generation, normalization, transformation, and cross-lingual alignment/mapping. Define and conduct experiments to validate/reject hypotheses, and communicate insights and recommendations to Product and Tech teams. Develop efficient data querying infrastructure for both offline and online use cases. Collaborate with cross-functional teams from multidisciplinary science, engineering and business backgrounds to enhance current automation processes. Learn and understand a broad range of Amazon's data resources and know when, how, and which to use and which not to use. Maintain technical documents and communicate results to diverse audiences with effective writing, visualizations, and presentations.

Requirements:
- 2+ years of data scientist experience.
- 3+ years of experience with data querying languages (e.g., SQL), scripting languages (e.g., Python), or statistical/mathematical software (e.g., R, SAS, Matlab).
- 3+ years of experience with machine learning/statistical modeling data analysis tools and techniques, and the parameters that affect their performance.
- Experience applying theoretical models in an applied environment.
- Experience in a ML or data scientist role with a large technology company.
Posted 3 weeks ago
1.0 - 5.0 years
3 - 7 Lacs
Mumbai
Work from Office
FanCode is India's premier sports destination committed to giving fans a highly personalised experience across content and merchandise for a wide variety of sports. Founded by sports industry veterans Yannick Colaco and Prasana Krishnan in March 2019, FanCode has over 100 million users. It has partnered with domestic and international sports leagues and associations across multiple sports. In content, FanCode offers interactive live streaming for sports with industry-first subscription formats like Match, Bundle and Tour Passes, along with monthly and annual subscriptions at affordable prices. Through FanCode Shop, it also offers fans a wide range of sports merchandise for sporting teams, brands and leagues across the world.

Responsibilities: Create, update, and maintain product listings on the Magento platform. Manage product listings across multiple e-commerce platforms like Amazon, Myntra, Flipkart, etc., ensuring accuracy, completeness, and consistency. Organize products into relevant categories and subcategories, ensuring a logical and user-friendly navigation structure. Implement and manage categorization and tagging systems for efficient catalog organization. Implement data quality standards to ensure consistency and accuracy across the product catalog. Work closely with cross-functional teams to address any discrepancies or missing information. Continuously optimize product listings for search engine visibility and conversion rate optimization. Work closely with the operations team to coordinate product availability and updates. Experience with e-commerce tools and software, such as Magento, Amazon Seller Central, Myntra Seller Hub, Flipkart Seller Hub, etc., is expected.

Must Haves: 2+ years of experience in e-commerce catalog management, including Magento and leading online marketplaces such as Amazon, Myntra, Flipkart, etc., and optimizing product listings across multiple e-commerce platforms. In-depth knowledge of Magento administration and configuration, including product catalog setup, attribute management, category hierarchy and product grouping. Proficiency in utilizing Amazon Seller Central, Myntra Seller Hub, Flipkart Seller Hub, or similar platforms to manage product listings, inventory, and orders. Ability to ensure accurate and comprehensive product information, including descriptions, images, prices, and specifications. Excellent communication and collaboration skills, with the ability to work effectively with cross-functional teams and external partners (vendors).

Dream Sports is India's leading sports technology company with 250 million users, housing brands such as Dream11, the world's largest fantasy sports platform; FanCode, India's digital sports destination; and DreamSetGo, a sports experiences platform. Dream Sports is based in Mumbai and has a workforce of close to 1,000 "Sportans". Founded in 2008 by Harsh Jain and Bhavit Sheth, Dream Sports' vision is to Make Sports Better for fans through the confluence of sports and technology. For more information: https://dreamsports.group/
Posted 3 weeks ago
1.0 - 6.0 years
3 - 8 Lacs
Gurugram
Work from Office
Quality Analyst (Freshers Only) at Lattice Technologies Pvt Ltd

Job Overview: We are seeking a detail-oriented and analytical Quality Analyst to ensure the accuracy, completeness, and reliability of data collected by our Research Associates through telephonic interviews. The QA Executive will be responsible for monitoring, auditing, and validating calls, identifying discrepancies, and providing feedback for continuous improvement in data collection practices.

Key Responsibilities: Conduct regular audits of calls made by Research Associates to validate the authenticity, completeness, and quality of data collected. Review questionnaires/responses to ensure adherence to project guidelines and quality standards. Identify gaps, errors, or inconsistencies in data and provide constructive feedback to the Research team. Maintain detailed and accurate records of audits, observations, and corrective actions. Share daily/weekly quality reports with key stakeholders, highlighting trends and areas for improvement. Work closely with Training and Operations teams to support coaching initiatives based on audit findings. Participate in calibration sessions to align QA standards across projects and teams. Suggest and implement process improvements to enhance overall data quality and operational efficiency. Ensure compliance with company policies, client requirements, and industry regulations.

Key Requirements: MBA in any discipline (preferred: Market Research, Business Administration, Communications). 0-1 years of experience in Quality Assurance within Market Research, BPO, or a similar environment. Strong attention to detail and excellent analytical skills. Good listening and communication skills in English (other regional languages are an advantage). Ability to work independently and manage multiple tasks simultaneously. Familiarity with CRM systems, QA tools, and MS Office (especially Excel and PowerPoint). Understanding of market research methodologies and call centre operations will be a plus.

Preferred Skills: Experience in call auditing or monitoring in a market research setting. Strong documentation and reporting skills. Ability to provide constructive feedback and drive quality improvement initiatives.
Posted 3 weeks ago
15.0 - 18.0 years
50 - 55 Lacs
Bengaluru
Work from Office
Job Description Summary: The Sr Director - Data Architecture will manage a team of Data Architects and partner with Product Managers to define the data architecture and strategy of the GE HealthCare One Data Platform, and identify product advancement and enhancement opportunities for critical enterprise use cases.

Roles and Responsibilities: The Sr Director - Data Architecture is a critical leadership role responsible for shaping and executing the organization's enterprise data architecture strategy. This individual will lead a team of data architects, driving the evolution of our enterprise data landscape to support business objectives and enable data-driven decision-making. The ideal candidate should possess a deep understanding of data architecture principles, domain-driven design, and emerging technologies, coupled with exceptional leadership and communication skills. They will be a champion for data quality, governance, and security, and a key influencer in shaping the future of the enterprise data strategy at GE HealthCare.

Strategic Data Architecture Leadership: Establish the overall vision for the team; define and champion the overall data architecture strategy, aligning it directly with business objectives and growth plans. Develop and maintain a multi-year data architecture roadmap, incorporating emerging technologies and industry best practices. Lead the evolution of the organization's data landscape, ensuring scalability, reliability, security, and performance. Drive the transition towards a data mesh architecture, defining principles, patterns, and governance frameworks to enable decentralized data ownership and domain autonomy.

Data Architecture Design & Standards: Define and evangelize standard data patterns, reference architectures, and architectural guidelines for various data domains (viz., Finance, Supply Chain, Services, Manufacturing, Enterprise Operations, etc.). Ensure architectural designs are aligned with domain-driven design (DDD) principles, promoting data ownership and accountability within business domains. Oversee the design and implementation of data models, data pipelines, and data integration solutions. Evaluate and recommend data technologies and platforms (e.g., data warehouses, data lakes, streaming platforms, graph databases, cloud services).

Data Mesh Implementation: Define and enforce data architecture standards related to data privacy, access control, data lifecycle management, and operational models for a data mesh architecture. Guide domain teams in adopting data mesh principles and building data products. Establish clear ownership and accountability for data assets within each domain. Develop and enforce data contracts and APIs to ensure interoperability between data products (see the sketch following this posting).

Team Leadership & Development: Attract, recruit, and retain top data architecture talent, building a high-performing team. Mentor and develop data architects, fostering a culture of innovation, collaboration, and continuous learning. Lead a large team of data architects, providing technical guidance and oversight. Conduct performance reviews and provide constructive feedback.

Collaboration & Communication: Partner closely with business stakeholders, data scientists, data engineers, and other technology leaders to understand data needs and translate them into architectural solutions. Communicate complex technical concepts clearly and effectively to both technical and non-technical audiences.
Influence and advocate for data architecture best practices across the organization. Collaborate with the Data Governance team to ensure data quality, compliance, and security. Stay abreast of emerging data technologies and trends (e.g., AI/ML, blockchain, data fabrics) and evaluate their potential impact on the organization. Education Qualification: Bachelors Degree in Computer Science or STEM Majors (Science, Technology, Engineering and Math) with 15+ years of relevant experience. Desired Characteristics: Passionate about driving change and influence across enterprise Demonstrated ability to lead in a highly matrixed environment Ability to work with cross-functional teams to drive operationalization of analytic products and help teams to transform processes through analytics Strong ability to communicate deep analytical results in forms that resonate with the business collaborators, highlighting actionable insights Entrepreneurial inclination to discover novel opportunities for applying analytical techniques to business/scientific problems across the company Ability to acquire any specialized domain knowledge required to be more effective in all required activities Willingness to learn and develop critical data science skills (e.g. statistical analyses, advanced data visualization, data mining, and data cleansing/transformation) Competency in major analytics frameworks and programming environments like Data Mesh and Data Fabric Technical Expertise: Understands technical and business discussions relative to future architecture direction aligning with business goals. Understands concepts of setting and driving architecture direction. Familiar with elements of gathering architecture requirements. Understands architecture standards concepts to apply to project work. Business Acumen: Understand key cross-functional concepts that impact the organization; is aware of business priorities and organizational dynamics. Personal Attributes: Applies values, policies, procedures and precedent to make timely, routine decisions of limited, clear choice. Reacts open-mindedly to new perspectives or ideas. Considers different or unusual solutions when appropriate. Resolves day-to-day issues related to strategy implementation. Escalates issues that impact the client and/or strategic initiatives. Inclusion and Diversity GE HealthCare is an Equal Opportunity Employer where inclusion Employment decisions are made without regard to race, color, religion, national or ethnic origin, sex, sexual orientation, gender identity or expression, age, disability, protected veteran status or other characteristics protected by law. We expect all employees to live and breathe our behaviors: to act with humility and build trust; lead with transparency; deliver with focus, and drive ownership - always with unyielding integrity. Our total rewards are designed to unlock your ambition by giving you the boost and flexibility you need to turn your ideas into world-changing realities. Our salary and benefits are everything you d expect from an organization with global strength and scale, and you ll be surrounded by career opportunities in a culture that fosters care, collaboration and support. Relocation Assistance Provided: Yes
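To make the "data contracts between data products" responsibility above concrete, here is a minimal, hedged sketch in Python: a record is validated against a published contract at the data-product boundary. The CustomerRecord schema and its fields are hypothetical illustrations, not GE HealthCare's actual data products, and the check covers field presence only, not types.

```python
# Minimal sketch of a data contract for a domain data product in a data
# mesh. CustomerRecord and its fields are hypothetical; the validation
# checks field presence only, not types.
from dataclasses import MISSING, dataclass, fields
from datetime import date
from typing import Optional

@dataclass(frozen=True)
class CustomerRecord:
    """Contract: consumers of the 'customer' data product can rely on
    these fields and their required/optional status."""
    customer_id: str              # required, unique within the domain
    country_code: str             # required, ISO 3166-1 alpha-2
    onboarded_on: date            # required
    email: Optional[str] = None   # optional by contract

def validate(record: dict) -> CustomerRecord:
    """Fail fast at the product boundary instead of breaking consumers."""
    required = [f.name for f in fields(CustomerRecord)
                if f.default is MISSING and f.default_factory is MISSING]
    missing = [name for name in required if name not in record]
    if missing:
        raise ValueError(f"Contract violation, missing fields: {missing}")
    return CustomerRecord(**record)

# A producing domain publishes a record; the contract gates what enters.
print(validate({"customer_id": "C-1001", "country_code": "IN",
                "onboarded_on": date(2024, 1, 15)}))
```

In a real mesh such contracts would be versioned and enforced in the publishing pipeline, so breaking changes surface before downstream consumers see them.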
Posted 3 weeks ago
1.0 - 4.0 years
6 - 9 Lacs
Bengaluru
Work from Office
Overview
Job Posting Title: India Remote/Ahmedabad/Bengaluru/New Delhi
Emmes Group: Building a better future for us all. Emmes Group is transforming the future of clinical research, bringing the promise of new medical discovery closer within reach for patients. Emmes Group was founded as Emmes more than 47 years ago, becoming one of the primary clinical research providers to the US government before expanding into public-private partnerships and commercial biopharma. Emmes has built industry-leading capabilities in cell and gene therapy, vaccines and infectious diseases, ophthalmology, rare diseases, and neuroscience. We believe the work we do will have a direct impact on patients' lives and act accordingly. We strive to build a collaborative culture at the intersection of being a performance- and people-driven company. We're looking for talented professionals eager to help advance clinical research as we work to embed innovation into the fabric of our company. If you share our motivations and passion in research, come join us!
Primary Purpose
This position will support and optimize our data lake infrastructure and play a crucial role in ensuring the reliability, efficiency, and usability of our data lake environment.
Responsibilities
Responsible for the day-to-day management and maintenance of the data lake infrastructure, including data ingestion, storage, organization, and retrieval processes.
Monitor and assess the quality of data stored in the data lake, identify inconsistencies, errors, and gaps, and collaborate with relevant stakeholders to address data quality issues.
Develop and implement data integration and transformation pipelines to ingest, cleanse, and transform data from various sources into the data lake, ensuring compatibility and consistency.
Implement and enforce data governance policies and security measures to safeguard sensitive and confidential data stored in the data lake, in compliance with regulatory requirements and best practices.
Identify opportunities to optimize the performance and efficiency of the data lake infrastructure, including storage optimization, query performance tuning, and resource utilization monitoring.
Conduct exploratory data analysis and perform ad-hoc queries to extract insights and derive actionable intelligence from the data lake, supporting decision-making processes across the organization.
Maintain comprehensive documentation of data lake processes, configurations, and workflows, and generate regular reports on data lake performance, usage metrics, and data quality metrics.
Collaborate with cross-functional teams, including data engineers, data scientists, business analysts, project managers, and internal/external IT professionals, to understand data requirements, prioritize initiatives, and deliver data solutions that meet business needs.
Qualifications
Bachelor's degree or higher in Computer Science, Information Systems, Statistics, Mathematics, or a related field.
Skilled in using Azure-specific tools for data analysis, including Microsoft Fabric, Azure Synapse Analytics for big data and complex analysis tasks, and Microsoft Power BI for data visualization and business intelligence solutions.
Proficiency in SQL and experience with database management systems (DBMS) for querying, manipulating, and analyzing large datasets.
Strong analytical and problem-solving skills, with the ability to interpret complex data sets, identify patterns, and draw meaningful insights.
Familiarity with data modeling concepts, data warehousing principles, and ETL (Extract, Transform, Load) processes.
Experience with other cloud platforms like AWS and GCP is a plus.
Excellent communication and interpersonal skills, with the ability to effectively communicate technical concepts to non-technical stakeholders.
Strong attention to detail and a commitment to data accuracy, integrity, and security.
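As a hedged illustration of the automated data-quality monitoring this role describes, the sketch below measures per-column null rates and flags breaches. sqlite3 is used only so the example is self-contained; in practice the same query would run against the lake's Synapse/Fabric SQL endpoint, and the table, columns, and 5% tolerance are assumptions.

```python
# Hedged sketch of an automated completeness check: per-column null rates
# against a tolerance. The table, columns, and threshold are illustrative.
import sqlite3

NULL_RATE_THRESHOLD = 0.05  # assumed tolerance: at most 5% nulls

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE patients (id INTEGER, site_code TEXT, enrolled_on TEXT)")
conn.executemany("INSERT INTO patients VALUES (?, ?, ?)",
                 [(1, "S01", "2024-01-02"),
                  (2, None, "2024-01-03"),
                  (3, "S02", "2024-01-04")])

for column in ("site_code", "enrolled_on"):
    total, nulls = conn.execute(
        f"SELECT COUNT(*), SUM(CASE WHEN {column} IS NULL THEN 1 ELSE 0 END) "
        f"FROM patients").fetchone()
    rate = nulls / total
    status = "FAIL" if rate > NULL_RATE_THRESHOLD else "ok"
    print(f"{column}: null rate {rate:.0%} [{status}]")
```

Checks like this are typically scheduled after each ingestion run, with failures routed to the stakeholders who own the offending source.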
Posted 3 weeks ago
0.0 - 5.0 years
10 - 20 Lacs
Mumbai
Work from Office
We are recruiting an expert application support engineer to scale up the global support capability for our data and analytics platform used by our research and trading teams. The candidate will work closely with our data engineers, data scientists, external data vendors, and various trading teams to rapidly resolve data and analytics application issues related to data quality, data integration, model pipelines, and analytics applications.
Knowledge, Skills and Abilities
- Python, SQL
- Familiarity with data engineering
- Experience with AWS data and analytics services or similar cloud vendor services
- Strong problem solving and interpersonal skills
- Ability to organise and prioritise work effectively
Key Responsibilities
- Incident and user management for the data and analytics platform
- Development and maintenance of the Data Quality framework (including anomaly detection)
- Implementation of Python & SQL hotfixes and working with data engineers on more complex issues
- Diagnostic tools implementation and automation of operational processes
Key Relationships
- Work closely with data scientists, data engineers, and platform engineers in a highly commercial environment
- Support research analysts and traders with issue resolution
Competencies
- Excellent problem solving skills
- Ability to communicate effectively with a diverse set of customers across business lines and technology
- Report to Head of DSE Engineering Mumbai, who reports to Global Head of Cloud and Data Engineering
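A minimal sketch of the anomaly-detection piece of a data-quality framework like the one named above: flag a feed whose daily row count is a z-score outlier against recent history. The feed counts and the 3-sigma threshold are illustrative assumptions, not the platform's actual rules.

```python
# Hedged sketch of row-count anomaly detection for a data feed: flag a day
# whose ingested volume deviates too far from the trailing mean.
from statistics import mean, stdev

def is_anomalous(history, today, z_threshold=3.0):
    """True if today's count is more than z_threshold sigmas from history."""
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return today != mu
    return abs(today - mu) / sigma > z_threshold

daily_counts = [10_120, 9_980, 10_340, 10_050, 9_870]  # hypothetical feed
print(is_anomalous(daily_counts, today=10_200))  # False: within normal range
print(is_anomalous(daily_counts, today=2_500))   # True: likely a broken feed
```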
Posted 3 weeks ago
2.0 - 4.0 years
4 - 7 Lacs
Pune
Work from Office
Local accounting manager
KEY EXPECTED ACHIEVEMENTS
General review - Closing review and "zero surprises" reporting prepared consistently with Group standards.
Standard General Review process (actors' involvement and coordination with the CESP Company Leader) in place to guarantee the economic consistency of financial statements and facilitate Region/Country Finance manager accounts validation.
Expected downstream data quality level and closing deadlines are met and standards are applied (by local data suppliers - especially by the SP department).
Company forecast to fulfill internal and external needs.
Group accounts certified by legal auditors.
Rules to transform Group accounts into local norms defined, validated and updated following regulatory modifications and company activity evolutions.
Financial statements in local norms validated, formatted and submitted to local authorities following country deadlines.
Financial statements in local norms certified by legal auditors.
In case of regulatory or tax controls, provide and explain accounting data.
Quality and accounting compliance of data produced for the preparation of tax returns.
Posted 3 weeks ago
1.0 - 4.0 years
2 - 6 Lacs
Mumbai
Work from Office
We are seeking a seasoned Data Test Engineer with expertise in designing and implementing robust data quality solutions for enterprise data systems. This is a development-focused role requiring a deep understanding of data engineering, data governance, and data cataloging. As a Data Test Engineer, you will be responsible for ensuring the accuracy, consistency, and reliability of data across platforms, empowering business teams with trusted insights.
Key Responsibilities:
Data Quality Frameworks and Implementation: Design, implement, and maintain data quality frameworks, standards, and best practices. Develop workflows for data profiling, validation, and cleansing to enhance data accuracy, consistency, and completeness. Leverage tools and technologies to automate data quality checks and streamline processes.
Data Engineering for Quality Assurance: Collaborate with data engineering teams to design and implement ETL pipelines that embed data quality validations. Write efficient Python and SQL scripts for data quality checks and monitoring. Utilize data warehousing expertise to ensure robust data storage and access standards.
Data Governance and Cataloging: Define and enforce data governance principles, policies, and quality rules. Implement and manage data cataloging tools to ensure data discoverability, lineage, and metadata management. Collaborate with stakeholders to establish data stewardship and ownership protocols.
Collaboration and Issue Resolution: Work with cross-functional teams, including data scientists, analysts, and business users, to align data quality requirements with business needs. Investigate and resolve data anomalies, ensuring prompt resolution of quality issues. Provide technical guidance on best practices for data quality and governance.
Monitoring and Reporting: Establish and monitor data quality metrics and KPIs to track improvements and identify potential issues. Deliver insights and recommendations through dashboards and reports, fostering a culture of data excellence.
Required Skills & Qualifications:
Technical Proficiency: Advanced expertise in Python and SQL for data manipulation, quality checks, and automation. Strong knowledge of data warehousing concepts, ETL processes, and data pipelines. Hands-on experience with data quality tools such as Informatica Data Quality (IDQ), Ataccama, Trifacta, or similar platforms. Familiarity with data governance frameworks, metadata management, and cataloging tools (e.g., Alation, Collibra, or similar).
Cloud and Big Data Knowledge: Experience with cloud platforms such as AWS, Azure, or GCP is preferred. Exposure to tools like Databricks, Snowflake, or BigQuery is an added advantage.
Analytical and Problem-Solving Skills: Ability to troubleshoot data inconsistencies and anomalies using an analytical and systematic approach.
Communication and Collaboration: Strong verbal and written communication skills to interact with technical and non-technical stakeholders. Demonstrated ability to work in cross-functional teams to achieve data quality objectives.
Educational Qualifications: Bachelor's or Master's degree in Computer Science, Data Science, Information Technology, or a related field.
Preferred Qualifications: Certifications in data quality tools, data governance, or cloud platforms. Familiarity with machine learning or AI-powered data quality solutions is a plus. Previous experience in a data-driven, fast-paced organization.
Why Join Us? Work on cutting-edge data technologies and frameworks. Collaborate with a team of experts in an innovative and supportive environment. Competitive salary, comprehensive benefits, and opportunities for career growth. Be part of an organization that values data integrity and fosters a culture of excellence.
As a Data Test Engineer, your contributions will directly impact the integrity and reliability of our data ecosystem, empowering data-driven decision-making across the organization. Join us in driving excellence in data quality and governance!
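To ground the "ETL pipelines that embed data quality validations" responsibility, here is a hedged sketch of a rule runner in which each rule is a SQL query expected to return zero rows. The tables and rules are hypothetical, and sqlite3 is chosen only so the sketch runs anywhere; dedicated tools such as IDQ or Ataccama cover the same ground in production.

```python
# Hedged sketch of data-quality validations embedded in an ETL step: each
# rule is a SQL query whose result set must be empty to pass.
import sqlite3

RULES = {
    "orders.customer_id must exist in customers":
        "SELECT o.id FROM orders o LEFT JOIN customers c "
        "ON o.customer_id = c.id WHERE c.id IS NULL",
    "orders.amount must be non-negative":
        "SELECT id FROM orders WHERE amount < 0",
}

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY);
    CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, amount REAL);
    INSERT INTO customers VALUES (1), (2);
    INSERT INTO orders VALUES (10, 1, 99.5), (11, 3, -4.0);
""")

for rule, query in RULES.items():
    violations = conn.execute(query).fetchall()
    print(f"{'FAIL' if violations else 'PASS'}: {rule} ({len(violations)} violations)")
```

Expressing rules as declarative queries keeps them reviewable by analysts and lets the same rule set run in CI and against production loads.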
Posted 3 weeks ago
5.0 - 9.0 years
20 - 25 Lacs
Hyderabad
Work from Office
TJX Companies
At TJX Companies, every day brings new opportunities for growth, exploration, and achievement. You'll be part of our vibrant team that embraces diversity, fosters collaboration, and prioritizes your development. Whether you're working in our four global Home Offices, Distribution Centers or Retail Stores - TJ Maxx, Marshalls, HomeGoods, Homesense, Sierra, Winners, and TK Maxx - you'll find abundant opportunities to learn, thrive, and make an impact. Come join our TJX family - a Fortune 100 company and the world's leading off-price retailer.
Job Description:
What you will discover
Inclusive culture and career growth opportunities. Global IT Organization which collaborates across U.S., Canada, Europe and Australia. Challenging, collaborative, and team-based environment.
What you will do
Enterprise Data & Analytics thrives on strong relationships with our business partners and working diligently to address their needs, which support TJX growth and operational stability. On this tightly knit and fast-paced solution delivery team you will be constantly challenged to stretch and think outside the box. You will have a real opportunity to be a part of TJX's transformation to a data-driven culture, with the autonomy to work with the business to unlock game-changing insights that drive business value. We have modernized our technology stack to focus on the cloud and top-tier tools. We are looking for someone who embraces the use of technology to build, manage, and govern data. As an Enterprise Data & Analytics Architect, you will work across all architecture disciplines to collaborate in platform and solution architecture work to lead the organization through pragmatic vision and strong influence. You will recommend alternative options within architecture designs and the introduction and use cases for the adoption of new technologies.
Key Responsibilities:
Design and implement a comprehensive MDM architecture centered around a central MDM hub. Analyze and define detailed MDM processes, tasks, data flows, and dependencies specific to the MDM hub model. Develop and maintain data models for master data domains (e.g., customer, product, supplier) within the MDM hub. Lead the selection, evaluation, and implementation of MDM tools and technologies that support the MDM hub architecture. Integrate disparate data sources into the MDM hub, ensuring data quality and consistency. Define and implement data quality standards and processes for data entering and residing in the MDM hub. Develop and maintain data governance policies and procedures specific to the MDM hub. Collaborate with business stakeholders to understand and translate data needs into technical requirements, considering the role of the MDM hub. Work with development teams to ensure the successful integration of MDM hub solutions with existing systems. Monitor and analyze the performance of the MDM hub and identify areas for improvement. Create a collaborative environment for work through respect, honesty, transparency and empathy.
What You Will Need (Minimum Qualifications)
7+ years of experience in data management, data architecture, or a related field. Proven experience in designing and implementing MDM solutions, particularly those utilizing an MDM hub architecture. Strong understanding of data modeling concepts and techniques, with a focus on MDM hub data models. Experience with data integration tools and technologies used for MDM hub implementations. Excellent analytical and problem-solving skills.
Strong communication and collaboration skills. Ability to work independently and as part of a team. Familiarity with architecture fundamentals, e.g., patterns, standards/best practices, blueprints, NFRs, etc. Experience working in a global agile delivery environment. Proven success partnering effectively with key stakeholders and the use of informal influencing skills to gain consensus. Excellent communication and collaboration skills, with the ability to effectively bridge the gap between technical and non-technical audiences. Strong leadership qualities, with a proven track record of mentoring and inspiring teams. Strong strategic and critical thinking skills. Comfortable dealing with ambiguity and adversity. Must be comfortable in a social environment and building personal relationships with key stakeholders inside and outside of the architecture function. Ability to learn and adapt quickly. Must be comfortable meeting and presenting to large groups of people.
Preferred Qualifications
Prior work experience delivering Master Data Management solutions for the Retail domain. Working knowledge of MDM technologies such as Syndigo/Riversand and Alation. Working knowledge of cloud-based technologies such as Azure Data Lake Storage, Databricks, Azure DevOps, GitHub, AzureML, etc. Experience with data integration and ETL/ELT processes and methodologies. Programming proficiency in languages like Python or SQL. SAFe certification.
In addition to our open door policy and supportive work environment, we also strive to provide a competitive salary and benefits package. TJX considers all applicants for employment without regard to race, color, religion, gender, sexual orientation, national origin, age, disability, gender identity and expression, marital or military status, or based on any individual's status in any group or class protected by applicable federal, state, or local law. TJX also provides reasonable accommodations to qualified individuals with disabilities in accordance with the Americans with Disabilities Act and applicable state and local law.
Address: Salarpuria Sattva Knowledge City, Inorbit Road
Location: APAC Home Office Hyderabad IN
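As a hedged illustration of one core MDM-hub task above, matching records from different source systems into a golden record, the sketch below uses Python's difflib as a stand-in for a real matching engine such as Syndigo/Riversand. The records, the 0.85 threshold, and the longest-name survivorship rule are illustrative assumptions.

```python
# Hedged sketch of MDM record matching and survivorship: decide whether two
# source-system records describe the same supplier, then merge them into a
# golden record.
from difflib import SequenceMatcher

def similarity(a, b):
    """Normalized string similarity in [0, 1]."""
    return SequenceMatcher(None, a.lower().strip(), b.lower().strip()).ratio()

erp_record = {"name": "Acme Industrial Supply Ltd", "city": "Pune"}
crm_record = {"name": "ACME Industrial Supply Limited", "city": "Pune"}

name_score = similarity(erp_record["name"], crm_record["name"])
same_city = erp_record["city"].lower() == crm_record["city"].lower()

if name_score > 0.85 and same_city:
    # Survivorship rule (assumed): prefer the longer, more complete name.
    golden = {"name": max(erp_record["name"], crm_record["name"], key=len),
              "city": erp_record["city"]}
    print(f"matched (name score {name_score:.2f}) -> {golden}")
```

Production hubs add blocking, probabilistic scoring, and steward review queues on top of this basic match-then-merge pattern.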
Posted 3 weeks ago
5.0 - 10.0 years
15 - 19 Lacs
Mumbai
Work from Office
Kent is looking for a Finance Systems Analyst to be based in Mumbai, India.
Your Responsibilities
Collaborate with the Finance team to identify and implement opportunities for automation and reporting improvements using Power Platform tools (Power BI, Power Apps, Power Automate).
Design, build, and maintain interactive Power BI dashboards to support budgeting, forecasting, actuals tracking, and financial performance reporting.
Develop and manage Power Apps to streamline finance-specific processes (e.g., accrual tracking, Capex requests, internal approvals).
Automate recurring manual tasks through Power Automate (e.g., report refresh and distribution, approval workflows, reminders).
Manage and enhance existing applications, ensuring they meet current business needs and are in line with the latest Power Platform updates.
Manage the Dataverse environment, focusing on data integrity and security.
Troubleshoot and resolve Power Platform-related issues, providing support to users and colleagues.
Maintain, enhance, and troubleshoot existing SmartView reports and HsGet queries to support monthly reporting, variance analysis, and management packs.
Ensure data quality, governance, and secure integration between Dataverse, Excel, Power BI, SmartView, and other enterprise systems.
Support and train finance users on SmartView usage and Power Platform solutions; prepare documentation including user guides and process documentation.
Work closely with FP&A, IT, and business stakeholders to ensure alignment of reporting tools and workflows with business objectives.
Stay current with Power Platform and Oracle EPM developments to continuously enhance finance technology capabilities.
Bachelor's or higher degree in Computer Science, Engineering, or a related field. However, candidates with significant practical experience and a proven track record in Power Platform development will also be considered favourably.
3 to 5+ years of relevant experience.
(Preferred) Microsoft Certified: Power Platform certification, or equivalent.
(Preferred) Full-stack development skills with proficiency in front-end and back-end technologies.
Strong verbal and written communication skills in English. Ability to articulate technical challenges and progress effectively. Proficiency in remote collaboration tools and practices.
Core Competencies
Hands-on experience with Microsoft Power Platform: Power BI (incl. DAX & Power Query), Power Apps, and Power Automate.
Experience working with finance professionals to automate report creation and database processes.
Proficiency in SmartView and HsGet functions for Oracle EPM (e.g., EPBCS/FCCS), with experience in maintaining and building Excel-based financial reports.
Knowledgeable in data modelling, database concepts, and integration techniques.
Familiarity with SQL, ERP systems, and cloud planning tools is an advantage.
Analytical mindset with excellent problem-solving and troubleshooting skills.
Strong communication skills, able to train users and articulate technical concepts to finance professionals.
Able to manage priorities independently and work across time zones in a global, multicultural environment.
Posted 3 weeks ago
4.0 - 8.0 years
20 - 25 Lacs
Hyderabad
Work from Office
TJX Companies
At TJX Companies, every day brings new opportunities for growth, exploration, and achievement. You'll be part of our vibrant team that embraces diversity, fosters collaboration, and prioritizes your development. Whether you're working in our four global Home Offices, Distribution Centers or Retail Stores - TJ Maxx, Marshalls, HomeGoods, Homesense, Sierra, Winners, and TK Maxx - you'll find abundant opportunities to learn, thrive, and make an impact. Come join our TJX family - a Fortune 100 company and the world's leading off-price retailer.
Job Description:
What you will discover
Inclusive culture and career growth opportunities. Global IT Organization which collaborates across U.S., Canada, Europe and Australia. Challenging, collaborative, and team-based environment.
What you will do
Enterprise Data & Analytics thrives on strong relationships with our business partners and working diligently to address their needs, which support TJX growth and operational stability. On this tightly knit and fast-paced solution delivery team you will be constantly challenged to stretch and think outside the box. You will have a real opportunity to be a part of TJX's transformation to a data-driven culture, with the autonomy to work with the business to unlock game-changing insights that drive business value. We have modernized our technology stack to focus on the cloud and top-tier tools. We are looking for someone who embraces the use of technology to build, manage, and govern data. As an Enterprise Data & Analytics Data Architect, you will work across all business domains to collaborate in analytics and data architecture work to lead the organization through pragmatic vision and strong influence. You will recommend alternative options within architecture designs and the introduction and use cases for the adoption of new technologies and methodologies.
Key Responsibilities
Design and develop scalable data architectures and data models to support business objectives. Collaborate with cross-functional teams to understand data requirements and translate them into effective data solutions. Lead the design and implementation of data integration processes, ensuring data quality and consistency. Assist in the development and maintenance of data governance frameworks and best practices. Optimize data storage and retrieval processes to improve performance and efficiency. Provide technical leadership and mentorship to junior data architects and data engineers. Work with the development, production, and support teams to provide design guidelines and impart knowledge on technical trends and solutions. Work with Solution Delivery Lead, Business Delivery Services and Solution/Business Architect to review business drivers, needs and strategies and understand implications; provide technical guidance and oversight to the delivery team. Provide patterns, standards, and best practices to support product teams during the analysis, development, and testing processes. Champion innovation by staying abreast of the latest advancements in Data and Analytics related technologies, and proactively finding opportunities for their integration into our architecture. Work with Enterprise Architects to align processes, documentation and capability models with enterprise architecture standards. Support platform teams during lifecycle management - roadmaps, capability assessment, enablement, and retirement. Create a collaborative environment for work through respect, honesty, transparency and empathy.
What You Will Need (Minimum Qualifications)
7+ years of experience in data architecture, data modeling, and data integration. Strong knowledge of retail industry data and analytics. Proficiency in data modeling tools (e.g., ERwin, ER/Studio) and database management systems (e.g., SQL Server, Oracle, Snowflake). Experience with big data technologies/concepts (e.g., Hadoop, Spark) and the Azure cloud platform. Excellent problem-solving skills and attention to detail. Experience working in a global agile delivery environment. Proven success partnering effectively with key stakeholders and the use of informal influencing skills to gain consensus. Excellent communication and collaboration skills, with the ability to effectively bridge the gap between technical and non-technical audiences. Strong leadership qualities, with a proven track record of mentoring and inspiring teams. Strong strategic and critical thinking skills. Comfortable dealing with ambiguity and adversity. Must be comfortable in a social environment and building personal relationships with key stakeholders inside and outside of the architecture function. Ability to learn and adapt quickly. Must be comfortable meeting and presenting to large groups of people.
Preferred Qualifications
Prior work experience in a Retail domain. Experience with machine learning and AI applications in retail. Knowledge of data privacy regulations and compliance standards. Certification in data architecture or related fields. Working knowledge of cloud-based technologies such as Azure Data Lake Storage, Databricks, Azure DevOps, GitHub, AzureML, etc. Experience with data integration and ETL/ELT processes and methodologies. Programming proficiency in languages like Python or SQL. SAFe certification.
In addition to our open door policy and supportive work environment, we also strive to provide a competitive salary and benefits package. TJX considers all applicants for employment without regard to race, color, religion, gender, sexual orientation, national origin, age, disability, gender identity and expression, marital or military status, or based on any individual's status in any group or class protected by applicable federal, state, or local law. TJX also provides reasonable accommodations to qualified individuals with disabilities in accordance with the Americans with Disabilities Act and applicable state and local law.
Address: Salarpuria Sattva Knowledge City, Inorbit Road
Location: APAC Home Office Hyderabad IN
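For a concrete, hedged example of the data modeling this role describes, below is a minimal star schema, one fact table joined to conformed dimensions, with a typical rollup query. Table and column names are illustrative, not TJX's schemas, and sqlite3 stands in for SQL Server/Snowflake so the sketch runs anywhere.

```python
# Hedged sketch of a retail dimensional model: a sales fact table keyed to
# store and product dimensions, queried with a standard rollup.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_store   (store_key INTEGER PRIMARY KEY, banner TEXT, region TEXT);
    CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, category TEXT);
    CREATE TABLE fact_sales (
        store_key   INTEGER REFERENCES dim_store(store_key),
        product_key INTEGER REFERENCES dim_product(product_key),
        sold_on     TEXT,
        units       INTEGER,
        net_amount  REAL
    );
    INSERT INTO dim_store VALUES (1, 'Banner A', 'East');
    INSERT INTO dim_product VALUES (10, 'Home');
    INSERT INTO fact_sales VALUES (1, 10, '2024-06-01', 3, 44.97);
""")

# A typical analytical rollup over the model:
row = conn.execute("""
    SELECT s.banner, p.category, SUM(f.net_amount)
    FROM fact_sales f
    JOIN dim_store s USING (store_key)
    JOIN dim_product p USING (product_key)
    GROUP BY s.banner, p.category
""").fetchone()
print(row)  # ('Banner A', 'Home', 44.97)
```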
Posted 3 weeks ago
2.0 - 5.0 years
4 - 8 Lacs
Bengaluru
Work from Office
Job Title: Senior Azure Data Engineer
Location: Bangalore
Experience Required: 10+ Years
Job Summary: We are looking for an experienced Azure Data Engineer with 10+ years in Data Engineering to join our Bangalore-based team. You'll work on large-scale data platforms with a focus on Azure and Snowflake, handling complex financial and retail datasets.
Key Responsibilities:
Design, develop, and maintain scalable and robust data pipelines using Azure services.
Build and optimize data architectures including data lakes and data warehouses.
Integrate, model, and orchestrate complex finance-related datasets across multiple systems.
Collaborate with cross-functional teams to gather business requirements and translate them into data solutions.
Ensure data quality, reliability, and governance through automated data integrity checks and monitoring systems.
Tune performance and conduct bottleneck analysis across data platforms and pipelines.
Implement CI/CD pipelines for data workflows and infrastructure-as-code practices.
Work with Kafka and similar real-time data streaming platforms.
Apply best practices in data engineering and DevOps frameworks to promote reusable and modular architecture.
Required Skills & Qualifications:
10+ years of experience in Data Engineering, with a strong focus on the Azure ecosystem.
Solid experience with Azure Data Factory, Azure Databricks, Azure Synapse, and other Azure data services.
Hands-on experience with Snowflake or similar cloud-based data warehouses.
Proficiency in SQL, Python, PowerShell, and JavaScript.
Experience handling large-volume datasets, ideally in the retail industry.
Strong background in performance tuning, optimization, and data troubleshooting in complex environments.
Familiarity with Kafka for data streaming and event-driven architecture.
Experience with CI/CD, DevOps principles, and monitoring tools.
Excellent communication skills with the ability to work in fast-paced, dynamic teams.
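One hedged way to implement the "automated data integrity checks" responsibility above is to reconcile a pipeline's source and sink by row count plus an order-independent checksum of business keys, as sketched below. The in-memory key lists stand in for the two ends of, say, an ADF copy activity; all names are assumptions.

```python
# Hedged sketch of a source-to-sink integrity check: compare row counts and
# an order-independent checksum of business keys. Note XOR cancels duplicate
# key pairs, so the row-count check is still needed alongside it.
import hashlib

def key_checksum(keys):
    """Order-independent digest: XOR of per-key SHA-256 prefixes."""
    acc = 0
    for k in keys:
        acc ^= int.from_bytes(hashlib.sha256(str(k).encode()).digest()[:8], "big")
    return f"{acc:016x}"

source_keys = ["TXN-1", "TXN-2", "TXN-3"]
target_keys = ["TXN-3", "TXN-1", "TXN-2"]  # same rows, different order

checks = {
    "row_count": len(source_keys) == len(target_keys),
    "key_checksum": key_checksum(source_keys) == key_checksum(target_keys),
}
print(checks)  # both True -> the load reconciles
```

Wiring a check like this into the CI/CD pipeline turns silent data loss into a failed deployment rather than a downstream report discrepancy.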
Posted 3 weeks ago