3.0 - 5.0 years
5 - 9 Lacs
Bengaluru
Work from Office
About the Company: ZAGENO offers the largest life sciences lab supply marketplace. Our one-stop shop helps scientists, lab managers, and procurement leaders compare products, source alternatives, track deliveries, and communicate order statuses in real time, accelerating innovation by saving valuable time. Leveraging advanced AI, ZAGENO enhances supply chain resilience and delivers a superior customer experience, seamlessly integrating with existing systems to boost productivity and make online shopping for research materials convenient, efficient, and reliable. We are committed to innovation, excellence, and fostering a supportive and dynamic work environment.

About the Role: ZAGENO is hiring a Data Engineer to establish data-driven pipelines and a seamless data platform setup. You will be dedicated full-time to improving the reliability, efficiency, and performance of all data-related operational environments across the ZAGENO platform and related systems in development, staging, and production: fixing issues, supporting release efforts, and responding to operational incidents. Key deliverables include improved ZAGENO data pipeline availability, reduced data latency (all types) across environments, and increased utilization of existing ZAGENO infrastructure to minimize added capacity as dataflows grow in volume and frequency with new customers and catalog items.
In this role, you will:
- Architect and develop data pipelines optimized for performance, quality, and scalability
- Collaborate with data, engineering, and business teams to build tools and data marts that enable analytics
- Develop testing and monitoring to improve data observability and data quality
- Partner with data science to build and deploy predictive models
- Support code versioning and code deployments for data pipelines

About you:
- 3-5 years of experience as a data engineer
- Strong experience with data lakes, data warehouses, and Delta tables
- Strong SQL skills are required (we currently use BigQuery and Databricks)
- Strong experience with Spark and big data technologies
- Strong hands-on skills in PySpark and SQL
- Experience orchestrating and scheduling data pipelines is required
- Expertise with cloud environments (AWS or GCP) is a must
- Experience with cloud data warehouses and distributed query engines is a plus
- Proactive, with keen attention to detail

Our benefits:
- Working for a mission-driven business with a meaningful challenge and a positive impact on the scientific community
- A clear growth perspective
- A learning and development budget to enable your ambition to grow professionally in your field
- A professional and dynamic team with a global vision and mindset
- An exciting, international working environment - we have more than 40 nationalities!
- We've got your health benefits covered (medical, dental, and vision)
- Hybrid work with 3 days per week in our HSR Layout, Bangalore office
- Staying healthy and fit is essential - we cover part of your gym membership!
- Holidays and flexible PTO
- Paid family leave
- A budget to improve your home office environment
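The deliverables above (pipeline availability, reduced latency, data quality) rest on transform steps that validate and deduplicate incoming records before they land in the warehouse. As a hedged sketch only - the schema, field names, and function below are invented for illustration, not taken from ZAGENO's actual stack - one such step might look like:

```python
from datetime import datetime

def clean_orders(rows):
    """Deduplicate raw order rows and drop records that fail basic validation.

    `rows` is a list of dicts with hypothetical 'order_id', 'amount', and 'ts'
    keys, chosen purely for illustration.
    """
    seen = set()
    cleaned = []
    for row in rows:
        # Skip malformed rows: missing id or non-positive amount.
        if not row.get("order_id") or row.get("amount", 0) <= 0:
            continue
        # Keep only the first occurrence of each order_id.
        if row["order_id"] in seen:
            continue
        seen.add(row["order_id"])
        # Normalise the timestamp string to a datetime for downstream use.
        row = dict(row, ts=datetime.fromisoformat(row["ts"]))
        cleaned.append(row)
    return cleaned

raw = [
    {"order_id": "A1", "amount": 20.0, "ts": "2024-05-01T10:00:00"},
    {"order_id": "A1", "amount": 20.0, "ts": "2024-05-01T10:00:00"},  # duplicate
    {"order_id": "",   "amount": 15.0, "ts": "2024-05-01T11:00:00"},  # invalid id
    {"order_id": "A2", "amount": 35.5, "ts": "2024-05-02T09:30:00"},
]
print(len(clean_orders(raw)))  # 2 valid, de-duplicated rows
```

In a real PySpark pipeline the same validation and deduplication would run as distributed DataFrame operations rather than a Python loop.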
Posted 3 weeks ago
2.0 - 6.0 years
14 - 15 Lacs
Bengaluru
Work from Office
It's fun to work in a company where people truly believe in what they're doing! At BlackLine, we're committed to bringing passion and customer focus to the business of enterprise applications. Since being founded in 2001, BlackLine has become a leading provider of cloud software that automates and controls the entire financial close process. Our vision is to modernize the finance and accounting function to enable greater operational effectiveness and agility, and we are committed to delivering innovative solutions and services to empower accounting and finance leaders around the world to achieve Modern Finance. Being a best-in-class SaaS company, we understand that bringing in new ideas and innovative technology is mission critical. At BlackLine we are always working with new, cutting-edge technology that encourages our teams to learn something new and expand their creativity and technical skill set, accelerating their careers. Work, Play and Grow at BlackLine!

Make Your Mark: We are looking for a motivated and enthusiastic DataOps Engineer to join our growing data team. In this role, you will be instrumental in bridging the gap between data engineering, operations, and development, ensuring our data pipelines and data infrastructure are robust, reliable, and scalable. If you have a passion for automating data processes, streamlining deployments, and maintaining healthy data ecosystems, we encourage you to apply.

You'll Get To:
- Develop and Maintain Data Pipelines: Assist in the design, development, and maintenance of scalable and efficient ETL (Extract, Transform, Load) processes to ingest, transform, and load data from various sources into our data warehouse.
- Orchestrate Workflows: Implement and manage data workflows using Apache Airflow, ensuring timely execution and monitoring of data jobs.
- Containerization and Orchestration: Utilize Docker and Kubernetes to containerize data applications and services, and manage their deployment and scaling in production environments.
- Cloud Infrastructure Management & Data Warehousing: Work with Google Cloud Platform (GCP) and Snowflake services to deploy, manage, and optimize data infrastructure components, including performance tuning and data governance.
- Scripting and Automation: Develop and maintain Python scripts for data processing, automation, and operational tasks.
- CI/CD Implementation: Contribute to the development and improvement of our CI/CD pipelines for data applications, ensuring efficient and reliable deployments.
- System Administration: Provide basic Unix/Linux administration support for data infrastructure, including scripting, monitoring, and troubleshooting.
- Monitoring and Alerting: Help implement and maintain monitoring solutions for data pipelines and infrastructure, responding to and resolving incidents.
- Collaboration: Work closely with Data Engineers, Data Scientists, and other stakeholders to understand data requirements and deliver reliable data solutions.
- Documentation: Contribute to the documentation of data pipelines, processes, and operational procedures.

What You'll Bring:
- 2-6 years of professional experience in a DataOps, Data Engineering, or similar role.
- Proficiency in Python and SQL for data scripting and automation.
- Familiarity with ETL concepts and tools like Airflow and dbt, and experience building data pipelines.
- Hands-on experience with Docker and Kubernetes for containerization.
- Experience with Apache Airflow for workflow orchestration.
- Working knowledge of at least one major cloud platform, preferably Google Cloud Platform (GCP).
- Basic Unix/Linux administration skills.
- Familiarity with CI/CD principles and tools.
- Strong problem-solving skills and a proactive approach to identifying and resolving issues.
- Excellent communication and collaboration skills.
- Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent practical experience).

We're Even More Excited If You Have:
- Experience with other data orchestration tools.
- Knowledge of data governance and data quality best practices.
- Contributions to open-source projects.
- Experience in an agile development environment.

Thrive at BlackLine Because You Are Joining:
- A technology-based company with a sense of adventure and a vision for the future. Every door at BlackLine is open. Just bring your brains, your problem-solving skills, and be part of a winning team at the world's most trusted name in Finance Automation!
- A culture that is kind, open, and accepting. It's a place where people can embrace what makes them unique, and the mix of cultural backgrounds and varying interests cultivates diverse thought and perspectives.
- A culture where BlackLiners' continued growth and learning is empowered. BlackLine offers a wide variety of professional development seminars and inclusive affinity groups to celebrate and support our diversity.

BlackLine is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to sex, gender identity or expression, race, ethnicity, age, religious creed, national origin, physical or mental disability, ancestry, color, marital status, sexual orientation, military or veteran status, status as a victim of domestic violence, sexual assault or stalking, medical condition, genetic information, or any other protected class or category recognized by applicable equal employment opportunity or other similar laws.

BlackLine recognizes that the ways we work and the workplace itself have shifted. We innovate in a workplace that optimizes a combination of virtual and in-person interactions to maximize collaboration and nurture our culture. Candidates who live within a reasonable commute to one of our offices will work in the office at least 2 days a week.
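The Airflow orchestration work described in this role boils down to running tasks in dependency order and monitoring the result. A minimal stdlib stand-in (this is not Airflow itself, and the task names and pipeline shape are invented for illustration) shows the core idea of a DAG run:

```python
from graphlib import TopologicalSorter

def run_pipeline(tasks, deps):
    """Run callables in dependency order.

    `tasks` maps task name -> zero-arg callable; `deps` maps task name ->
    set of upstream task names that must finish first.
    """
    # TopologicalSorter resolves an execution order that respects all deps,
    # the same scheduling problem an orchestrator like Airflow solves.
    order = list(TopologicalSorter(deps).static_order())
    results = {}
    for name in order:
        results[name] = tasks[name]()
    return order, results

# Hypothetical three-step ETL pipeline.
tasks = {
    "extract":   lambda: [1, 2, 3],
    "transform": lambda: "transformed",
    "load":      lambda: "loaded",
}
deps = {"extract": set(), "transform": {"extract"}, "load": {"transform"}}
order, _ = run_pipeline(tasks, deps)
print(order)  # ['extract', 'transform', 'load']
```

Airflow adds scheduling, retries, and monitoring on top of this ordering, which is why a hand-rolled loop like the above is only a teaching sketch.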
Posted 3 weeks ago
4.0 - 8.0 years
11 - 15 Lacs
Chennai
Work from Office
FE fundinfo is a global leader in investment fund data and technology. We are proud of our vast, diverse, and highly skilled team, who help to make our industry Better Connected and Better Informed. We are currently recruiting a Delivery Data Manager to join our team in Chennai.

The Delivery Data Manager is responsible for overseeing the comprehensive management of client data to ensure seamless project execution and exemplary client service. This role involves coordinating the collection, validation, and analysis of client-specific data, ensuring its accuracy and integrity. The Delivery Data Manager will work collaboratively with cross-functional teams to facilitate timely data delivery, adhering to project timelines and maintaining the highest standards of data quality. By aligning data processes with client expectations, this position plays a critical role in enhancing client satisfaction and fostering long-term relationships.

Your key responsibilities as a Delivery Data Manager will be to:
- Collaborate with colleagues and clients to oversee the ingestion, validation, reconciliation, and testing of fund data pertinent to your assigned projects.
- Communicate with colleagues and clients via email, MS Teams video conferences, and chat, conveying technical concepts clearly and effectively to non-technical stakeholders across diverse global audiences.
- Evaluate complex data challenges, formulate innovative and effective solutions, and address queries based on their priority and urgency.
- Collaborate with clients to assess their data requirements, challenges, and business objectives.
- Identify and incorporate relevant data sources into the database while ensuring accuracy, completeness, and regulatory compliance.

To join us as a Delivery Data Manager you will need the following experience and skills:
- Exceptional communication skills: fluent in both written and verbal English.
- Expertise in handling large volumes of data, including entry, validation, cleansing, and reconciliation to maintain data integrity and accuracy.
- A basic understanding of mutual funds and the broader financial services industry.
- Skill at assessing and resolving complex data challenges by developing innovative and effective solutions.
- Extensive experience with MS Excel and MS Outlook, skilled in both manual and automated data entry processes.
- Knowledge of data compliance standards and regulations in the financial sector, ensuring all processes adhere to legal requirements.

By joining the team as a Delivery Data Manager, you will be offered the following:
- Be part of the Global Data Operations team, which is responsible for core data collection and processing.
- Become a domain expert by deepening your knowledge of mutual fund operations.
- 24 days holiday
- Paid study leave
- Enhanced paternity & maternity leave
- Statutory benefits like PF, Gratuity, etc.
- Support to set up a home office
- Health cover with the option to add family members
- Annual health check-up
- Meal cards
- Full LinkedIn Learning access

Apply today for immediate consideration and we will endeavour to get back to you within 5 working days. Visit our Glassdoor profile or fefundinfo.com to find out more about life @ FE fundinfo!
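The reconciliation responsibility above usually means comparing a client feed against the internal database and flagging records present on one side only. A hedged sketch, with an invented `fund_id` key and toy data standing in for real fund records:

```python
def reconcile(source, target, key="fund_id"):
    """Compare two record sets and report keys missing on either side.

    The field name is hypothetical; real fund data carries many more columns,
    and reconciliation would also compare values field by field.
    """
    src_keys = {r[key] for r in source}
    tgt_keys = {r[key] for r in target}
    return {
        "missing_in_target": sorted(src_keys - tgt_keys),
        "missing_in_source": sorted(tgt_keys - src_keys),
    }

client_feed = [{"fund_id": "F001"}, {"fund_id": "F002"}, {"fund_id": "F003"}]
database    = [{"fund_id": "F001"}, {"fund_id": "F003"}, {"fund_id": "F004"}]
report = reconcile(client_feed, database)
print(report)  # {'missing_in_target': ['F002'], 'missing_in_source': ['F004']}
```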
Posted 3 weeks ago
4.0 - 6.0 years
18 - 20 Lacs
Pune
Work from Office
Local Accounting Manager

Key expected achievements:
- General review - closing review and "zero surprise" reporting prepared consistently with Group standards.
- Standard general review process (actors' involvement and coordination with the CESP Company Leader) in place to guarantee the economic consistency of financial statements and facilitate account validation by the Region/Country F manager.
- Expected downstream data quality levels and closing deadlines are met, and standards are applied (by local data suppliers, especially by the SP department).
- Company forecasts fulfill internal and external needs.
- Group accounts certified by legal auditors.
- Rules to transform Group accounts into local norms defined, validated, and updated following regulatory modifications and evolutions in company activities.
- Financial statements in local norms validated, formatted, and submitted to local authorities within country deadlines.
- Financial statements in local norms certified by legal auditors.
- In case of regulatory or tax controls, provide and explain accounting data.
- Quality and accounting compliance of data produced for the preparation of tax returns.
Posted 3 weeks ago
1.0 - 3.0 years
3 - 6 Lacs
Bengaluru
Work from Office
Job Description: Our Enterprise Data team provides Talkdesk with accurate & accessible data, decision support models & analyses, enablement training & services, as well as facilitation for all analytics endeavors across the company. The Data Analyst role will work alongside the business team and be responsible for collecting, cleaning, analyzing, and interpreting data to identify trends and patterns. In this role, you will need to use your knowledge of data pipeline orchestration, standard methodologies for data warehouse performance, data migration strategies, and analytics architecture to function collaboratively within a team environment. We are looking for someone who will apply software engineering standard methodologies to analytics code via the utilization of version control, testing and validation processes, and continuous integration.

Responsibilities:
- Data Analytics: Create Tableau dashboards to support analytical business needs.
- Facilitate Data Operations: Monitor data quality dashboards for data discrepancies and work with the respective teams to resolve data issues.
- Data Modeling: Write SQL and create views and data models in Snowflake to support data analytics.
- Ad Hoc Reporting: Respond to ad hoc reporting requests from cross-functional teams for custom insights and analysis in a timely manner.
- Data Management: Participate in the Data Governance committee (led by IT) to ensure the integrity and accuracy of data used in reporting, as well as consistency across departments and systems.

Qualifications:
- Education: Bachelor's degree in Computer Science or a related field required.
- Experience: Minimum of 1 year of progressive experience in data analysis and reporting.
- Technical Expertise: Proficient in SQL, Snowflake, and Tableau; strong knowledge of Excel and data visualization tools.
- Data analysis: Strong understanding of data analysis principles and techniques.
- Data visualization: Ability to create compelling and informative visualizations.
- SQL: Expert-level knowledge of SQL for data extraction and manipulation.
- Problem-solving: Ability to troubleshoot technical issues and find solutions.
- Communication: Good verbal and written communication.

Work Environment and Physical Requirements: Primarily office-environment work, extended periods of sitting or standing, computer-based work. Limited lifting, and equipment usage limited to computer-related equipment (keyboards, mouse, etc.)
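The "write SQL and create views" responsibility above follows a common pattern: model raw data behind a view that dashboards query directly. Snowflake isn't available in a short sketch, so the stdlib `sqlite3` module stands in here; the table, view, and data are invented for illustration only:

```python
import sqlite3

# In-memory database as a stand-in for a warehouse like Snowflake.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE tickets (id INTEGER, team TEXT, resolved INTEGER);
    INSERT INTO tickets VALUES (1, 'support', 1), (2, 'support', 0),
                               (3, 'sales', 1);
    -- The view encapsulates the metric logic so every dashboard
    -- computes "resolution rate" the same way.
    CREATE VIEW resolution_rate AS
        SELECT team,
               ROUND(AVG(resolved) * 100, 1) AS pct_resolved
        FROM tickets
        GROUP BY team;
""")
rows = conn.execute("SELECT * FROM resolution_rate ORDER BY team").fetchall()
print(rows)  # [('sales', 100.0), ('support', 50.0)]
```

The same `CREATE VIEW` pattern carries over to Snowflake with warehouse-specific syntax for materialization and access control.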
Posted 3 weeks ago
2.0 - 5.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Job Description: As an Integration Developer, you will have an opportunity to support and maintain the integration infrastructure that surfaces integrations across the company. The Integration team sits at the intersection of business processes and applications. You will be responsible for documenting and maintaining the integration infrastructure and systems that support Talkdesk's SaaS platforms. The role may include documenting, designing, and implementing integration solutions, developing and deploying data pipelines, and creating and maintaining integration platform tools.

Responsibilities:
- Work closely with process owners and document the integration processes.
- Develop and maintain technical documentation, including design documents, database schema designs, and runbooks.
- Develop and maintain end-to-end integrations between our homegrown applications and various cloud platforms like Salesforce, NetSuite, Zuora, Coupa, Workday, etc.
- Help lead the implementation and maintenance of data platform solutions, ensuring data integrity, performance, and security.
- Collaborate with multi-functional teams including data scientists, analysts, and software engineers to understand data requirements and deliver high-quality solutions.
- Evaluate and implement standard practices for data ingestion, ETL processes, and data quality frameworks.
- Optimize and tune data processing workflows and SQL queries for improved performance and efficiency.
- Contribute to ad hoc projects executed by the business analytics team by structuring integrations and data, conducting analyses, and synthesizing presentations or other report-outs.
- Stay up to date with industry trends and advancements in integration, continuously improving the team's technical knowledge and skills.
- Collaborate with infrastructure and operations teams to ensure reliable and scalable data storage, processing, and monitoring solutions.
- Develop an in-depth understanding of our industry and of integration platforms like Boomi and Starburst.

What You Bring To The Team:
- Bachelor's degree in Computer Science, Engineering, or a related field; advanced degree preferred.
- Proven experience (1+ years) with integration engineering, designing and implementing data pipelines, and building ETL infrastructure.
- Experience designing and implementing integrations using an iPaaS solution.
- Ability to troubleshoot and resolve issues related to integrations, working with stakeholders and the iPaaS vendor as necessary.
- Good knowledge of and exposure to Boomi, Snowflake, and dbt, including data modeling, ETL, and data quality assurance.
- Proficiency in SQL and experience with optimizing and tuning queries for performance.
- Good understanding of data warehousing concepts, dimensional modeling, and data integration techniques.
- Experience with cloud platforms (e.g., AWS, Azure, GCP) and cloud-based data technologies is a plus.
- Good programming skills in Python or other scripting languages for data manipulation and automation.
- Excellent problem-solving and troubleshooting abilities with keen attention to detail.
- Strong communication skills, with the ability to collaborate effectively with multi-functional teams and partners and to communicate clearly about complex technical topics.
- Comfortable working in a fast-paced and highly collaborative environment.

Work Environment and Physical Requirements: Primarily office-environment work, extended periods of sitting or standing, computer-based work. Limited lifting, and equipment usage limited to computer-related equipment (keyboards, mouse, etc.)
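A "data quality framework" of the kind mentioned above is, at its simplest, a set of named predicate checks run against every record, with failures reported per check. This sketch is generic and illustrative - the checks and record shape are invented, not Talkdesk's actual framework:

```python
def run_quality_checks(rows, checks):
    """Apply named predicate checks to every row.

    Returns a dict mapping check name -> list of row indices that failed,
    so a monitoring job can alert on non-empty failure lists.
    """
    failures = {name: [] for name in checks}
    for i, row in enumerate(rows):
        for name, predicate in checks.items():
            if not predicate(row):
                failures[name].append(i)
    return failures

rows = [
    {"account_id": "ACC1", "amount": 120.0},
    {"account_id": None,   "amount": 80.0},   # fails the null check
    {"account_id": "ACC3", "amount": -5.0},   # fails the positivity check
]
checks = {
    "id_not_null":     lambda r: r["account_id"] is not None,
    "amount_positive": lambda r: r["amount"] > 0,
}
print(run_quality_checks(rows, checks))
# {'id_not_null': [1], 'amount_positive': [2]}
```

Tools like dbt express the same idea declaratively (e.g. `not_null` and custom tests in a model's schema file) and run the checks as SQL.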
Posted 3 weeks ago
2.0 - 5.0 years
10 - 14 Lacs
Bengaluru
Work from Office
Technical Leadership:
- Architect and design scalable, efficient, and reliable data pipelines and platforms, and lead the adoption of best practices for data engineering tools.
- Ensure high standards for data quality, reliability, governance, and security.
- Drive the team's technical roadmap, identifying areas for innovation and optimization.

People & Project Leadership:
- Lead and mentor a team of data engineers, fostering growth, ownership, and excellence.
- Manage sprint planning, code reviews, and technical delivery in an agile environment.
- Collaborate with Product, Analytics, and Data Science teams to align on data needs and priorities.
- Own end-to-end delivery of projects, ensuring scalability, performance, and sustainability.

Requirements:
- 8-10 years of experience in data engineering or backend engineering with significant data pipeline responsibilities.
- Expertise in Python for data engineering tasks, including data transformation and orchestration.
- Hands-on experience with Kafka for real-time streaming and event-driven architectures.
- Proven experience with Airflow or equivalent scheduling/orchestration tools.
- Deep knowledge of Databricks and Apache Spark for distributed data processing.
- Proficiency in SQL, with a strong understanding of relational and distributed databases (e.g., Snowflake, Redshift, BigQuery).
- Experience designing data architectures on cloud platforms (AWS, Azure, or GCP).
- Solid understanding of data modeling, data warehousing, and data governance.
- Experience managing cost optimization and performance in cloud-based data platforms.
- Familiarity with MLOps or data products serving ML pipelines.
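The Kafka streaming requirement above typically involves aggregating an event stream into fixed time windows. As a toy stand-in for what a Kafka consumer feeding a Spark Structured Streaming job would compute (event shapes, keys, and window size are all invented for this sketch):

```python
from collections import defaultdict

def window_counts(events, window_seconds=60):
    """Bucket a stream of (timestamp, key) events into tumbling windows.

    Each event lands in the window starting at the largest multiple of
    `window_seconds` not exceeding its timestamp.
    """
    counts = defaultdict(int)
    for ts, key in events:
        window_start = ts - (ts % window_seconds)
        counts[(window_start, key)] += 1
    return dict(counts)

# Timestamps in seconds; two events fall in window [0, 60), two in [60, 120).
events = [(0, "click"), (30, "click"), (61, "click"), (65, "view")]
print(window_counts(events))
# {(0, 'click'): 2, (60, 'click'): 1, (60, 'view'): 1}
```

Real streaming systems add what this sketch omits: late-event handling via watermarks, state checkpointing, and exactly-once delivery semantics.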
Posted 3 weeks ago
4.0 - 7.0 years
15 - 17 Lacs
Bengaluru
Work from Office
Position: Sales Ops Analyst
Organization: WW Sales Operations
Location: Noida, India
Direct Manager: Senior Manager, Sales Ops

Description: The Sales Ops Analyst functions as an integral part of the sales operations team. The candidate should know how to manage compensation processes for WW Sales Organization reps, manage compensation rules in the compensation tool, report on attainment, set quotas for sales reps, and ensure that compensation and revenue actuals are followed up on and completed within the required timelines. The candidate develops, implements, and utilizes processes and tools to enable the sales lifecycle, and should know how to analyze and report on order pipeline, bookings, forecasting, sales productivity, and goal attainment. We seek energetic, dynamic, engaging individuals who are passionate about working with data, complex rules, sales performance metrics, etc. This position will report to the Manager, Sales Operations and will be responsible for interacting regularly with WW Sales & Sales Operations teams.

Primary Responsibilities:
- Lead multiple concurrent projects, and initiate and drive projects to completion with minimal guidance.
- Understand process bottlenecks and inconsistencies to improve the sales team's performance.
- Engage and work with aligned operations teams and lines of business to more effectively meet data needs and deliver analysis results.
- Develop and maintain sales analytics reports and dashboards to provide actionable insights that support data-driven decision-making for the sales and executive leadership teams.
- Strengthen sales and operational efficiency by applying innovative methods, streamlining processes and systems, and exchanging standard practices.
- Augment data quality assurance processes by putting in place the QA activities required to check the sanity, correctness, and quality of data, ensuring trust among end consumers/stakeholders and accurate payouts to reps.
- Apply data cleansing techniques to improve the quality and accuracy of the contacts and accounts databases, and develop processes and methods for acquiring net new names for our database.

Skills:
- 4-7 years of work experience
- Bachelor's degree (MBA preferred)
- Project management experience handling complex projects with multiple stakeholders
- Experience working in sales operations
- Expert in Microsoft Excel (creation of multi-variable models; fuzzy logic matching; use of v-lookups, h-lookups, sum-if, pivots, etc.) and PowerPoint (linking PPT to Excel, embedded charts, etc.)
- Expert in creating Excel VBA macros and automating Excel-based reports
- Experience using tools & platforms such as SFDC, Power BI, and Tableau
- Working knowledge of MS Access & SQL
- Highly organized, with attention to detail
- Ability to work under minimal supervision; a strong team player
- Strong analytical skills
- Strong project management skills
- Ability to work under tight schedules, with the flexibility to work across different time zones at times
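The "fuzzy logic matching" skill listed above - matching messy account names in a feed against a clean master list - has a direct stdlib analogue in Python's `difflib`. The account names below are invented for illustration:

```python
from difflib import get_close_matches

# Hypothetical master list of clean account names.
accounts = ["Acme Corporation", "Globex Inc", "Initech LLC"]

def match_account(name, cutoff=0.6):
    """Return the closest known account name, or None if nothing clears the cutoff.

    get_close_matches ranks candidates by SequenceMatcher similarity ratio.
    """
    hits = get_close_matches(name, accounts, n=1, cutoff=cutoff)
    return hits[0] if hits else None

print(match_account("Acme Corp"))     # 'Acme Corporation'
print(match_account("Unknown Name"))  # likely None at this cutoff
```

The 0.6 cutoff is a tunable trade-off: lower values catch more typos but raise the risk of false matches, which matters when the result drives rep payouts.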
Posted 3 weeks ago
5.0 - 7.0 years
20 - 25 Lacs
Bengaluru
Work from Office
AWS Utility Computing (UC) provides product innovations ranging from foundational services such as Amazon's Simple Storage Service (S3) and Amazon Elastic Compute Cloud (EC2) to consistently released new product innovations that continue to set AWS's services and features apart in the industry. As a member of the UC organization, you'll support the development and management of Compute, Database, Storage, Internet of Things (IoT), Platform, and Productivity Apps services in AWS. Within AWS UC, Amazon Dedicated Cloud (ADC) roles engage with AWS customers who require specialized security solutions for their cloud services.

Are you interested in machine learning and AI? Are you passionate about developing solutions to improve the data quality for ML models? Are you passionate about writing high-quality code? Then this is the right opportunity for you!

The AWS NGDE team is looking for talented System Development Engineers to develop optimized solutions that create high-quality information for ML services and improve the developer experience. Our team's mission is to make machine learning easy, stronger, and universal in the world of natural language and speech. We have developed thriving ML services such as Comprehend, Kendra, Lex, and Transcribe, and we continuously add new services to our portfolio that address real-world problems through research and innovation. We build state-of-the-art services using the latest deep learning techniques and highly scalable distributed systems engineering.

System Development Engineers on our team will dive deep into new data requirements for ML models and deliver optimized solutions at massive scale. They will refactor and optimize our existing solutions to enhance the ML models, work with external teams while setting high coding standards to ensure quality of delivery, and innovate to continuously improve the coverage and quality of the information.
They will play a major role in the next phase of AWS innovation in ML and AI.
- Develop and deliver high-quality, optimized solutions that create high-quality information for ML services and enhance the ML models.
- Partner effectively with product and science teams to understand their needs and deliver with high quality.
- Innovate on new ideas and implement them to continuously improve the coverage and quality of the information.
- Own the infrastructure and mitigate risks on time.
- Mentor new engineers.

About the team
Diverse Experiences: AWS values diverse experiences. Even if you do not meet all of the preferred qualifications and skills listed in the job description, we encourage candidates to apply. If your career is just starting, hasn't followed a traditional path, or includes alternative experiences, don't let it stop you from applying.

Why AWS? Amazon Web Services (AWS) is the world's most comprehensive and broadly adopted cloud platform. We pioneered cloud computing and never stopped innovating - that's why customers from the most successful startups to Global 500 companies trust our robust suite of products and services to power their businesses.

Inclusive Team Culture: Here at AWS, it's in our nature to learn and be curious. Our employee-led affinity groups foster a culture of inclusion that empowers us to be proud of our differences. Ongoing events and learning experiences, including our Conversations on Race and Ethnicity (CORE) and AmazeCon (gender diversity) conferences, inspire us to never stop embracing our uniqueness.

Mentorship & Career Growth: We're continuously raising our performance bar as we strive to become Earth's Best Employer. That's why you'll find endless knowledge-sharing, mentorship, and other career-advancing resources here to help you develop into a better-rounded professional.

Work/Life Balance: We value work-life harmony.
Achieving success at work should never come at the expense of sacrifices at home, which is why we strive for flexibility as part of our working culture. When we feel supported in the workplace and at home, there's nothing we can't achieve in the cloud.

- B.E. in Computer Science or equivalent qualification.
- 2+ years of experience in system development and automation (using C++/Java/Python).
- Demonstrated skill and passion for problem solving and operational excellence.
- Experience in writing test assertions.
- Experience communicating with users, other technical teams, and management to collect requirements, describe software product features, and produce technical designs.
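The "writing test assertions" requirement above simply means encoding a function's contract as executable checks. A minimal sketch (the `normalize_label` helper is invented purely to give the assertions a target; it is not an AWS API):

```python
def normalize_label(text):
    """Collapse runs of whitespace and lower-case a raw annotation label."""
    return " ".join(text.split()).lower()

# Assertions double as executable documentation of the contract:
# each one states an input/output pair the function must honor.
assert normalize_label("  Speech   To Text ") == "speech to text"
assert normalize_label("") == ""
assert normalize_label("LEX") == "lex"
print("all assertions passed")
```

In practice these checks would live in a test framework such as `unittest` or `pytest`, where each assertion becomes a named test case with its own failure report.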
Posted 3 weeks ago
5.0 - 7.0 years
20 - 25 Lacs
Chennai
Work from Office
AWS Utility Computing (UC) provides product innovations ranging from foundational services such as Amazon's Simple Storage Service (S3) and Amazon Elastic Compute Cloud (EC2) to consistently released new product innovations that continue to set AWS's services and features apart in the industry. As a member of the UC organization, you'll support the development and management of Compute, Database, Storage, Internet of Things (IoT), Platform, and Productivity Apps services in AWS, including support for customers who require specialized security solutions for their cloud services.

Are you interested in working on one of the most innovative products in the generative AI space? Are you passionate about designing and developing tools at scale for generating high-quality data? Are you passionate about designing and developing solutions to enhance data quality for optimizing large language models? Then this is the right opportunity for you!

The AWS NGDE team is looking for talented System Development Engineers to design and develop optimal solutions that generate data at high quality and continuously enhance that quality to optimize large language models and improve the developer experience. Our organization's mission is to use GenAI to help builders create faster, cheaper, more secure, and more reliable applications. We launched services like Amazon CodeWhisperer and are continuously adding new services to our portfolio that solve common business problems and improve developer productivity through research and innovation. We build state-of-the-art services using the latest deep learning techniques and highly scalable distributed systems engineering. System Development Engineers on our team will dive deep into model quality requirements and design and deliver optimal solutions at massive scale to generate high-quality data for model optimization.
As an engineer, you will continuously innovate and enhance the solutions that generate data to optimize ML models and expand their coverage. You will design UX tools for generating manual data and build guardrails to ensure high quality, and you will be responsible for driving the technology roadmap for the team and skilling up team members. Key responsibilities of the System Development Engineer II include (but are not limited to): Design, develop, and deliver optimal solutions to generate high-quality data and continuously enhance that quality for optimizing large language models. Partner effectively with the Product, Science, and Engineering teams to understand product requirements and build extendable solutions at scale. Innovate continuously to enhance the coverage and quality of data. Mentor new engineers and skill up the team in terms of technology and process. A day in the life Our team is dedicated to supporting new members. We have a broad mix of experience levels and tenures, and we're building an environment that celebrates knowledge sharing and mentorship. Our senior members enjoy one-on-one mentoring and thorough, but kind, code reviews. We care about your career growth and strive to assign projects based on what will help each team member develop into a better-rounded engineer and enable them to take on more complex tasks in the future. About the team Diverse Experiences AWS values diverse experiences. Even if you do not meet all of the preferred qualifications and skills listed in the job description, we encourage candidates to apply. If your career is just starting, hasn't followed a traditional path, or includes alternative experiences, don't let it stop you from applying. Why AWS? Amazon Web Services (AWS) is the world's most comprehensive and broadly adopted cloud platform. 
We pioneered cloud computing and never stopped innovating; that's why customers from the most successful startups to Global 500 companies trust our robust suite of products and services to power their businesses. Inclusive Team Culture Here at AWS, it's in our nature to learn and be curious. Our employee-led affinity groups foster a culture of inclusion that empowers us to be proud of our differences. Ongoing events and learning experiences, including our Conversations on Race and Ethnicity (CORE) and AmazeCon (gender diversity) conferences, inspire us to never stop embracing our uniqueness. Mentorship & Career Growth We're continuously raising our performance bar as we strive to become Earth's Best Employer. That's why you'll find endless knowledge sharing, mentorship, and other career-advancing resources here to help you develop into a better-rounded professional. Work/Life Balance We value work-life harmony. Achieving success at work should never come at the expense of sacrifices at home, which is why we strive for flexibility as part of our working culture. When we feel supported in the workplace and at home, there's nothing we can't achieve in the cloud. - B.E. in Computer Science or an equivalent qualification. - 5+ years of experience in system development and automation (using C++/Java/Python). - Good understanding of data structures and annotations. - Experience working in a large-scale software development environment. - Experience taking a project from scoping requirements through launch. - Experience writing test assertions.
Posted 3 weeks ago
3.0 - 8.0 years
11 - 12 Lacs
Bengaluru
Work from Office
Amazon's Global Tax Services team seeks an exceptionally capable and motivated individual to drive tax data management and month-end compliance operations for global tax teams. The right individual should have an in-depth ability to work with (or an appetite to work with) large data sets, along with solid business judgment and the capability to deliver the right system configurations in a tax and accounting context. This role performs a wide variety of responsibilities for the Global Tax organization, including: Manage month-end data operations for tax compliance reporting. Perform hands-on, detailed data research and analysis of a large financial data set; investigate, troubleshoot, and resolve data quality issues. Define business requirements for technical development based on analysis of data sets. Work with customer teams to identify improvements in efficiency and controllership for their current data processes; teach them how to utilize the designed reporting and functionality in their processes. Build and maintain relationships with our key technology providers, as well as other technical teams across Amazon. Explain financial/technical concepts and analysis implications clearly to a wide audience, and translate business objectives into actionable analyses. About the team We are a fast-growing team supporting the Corporate Tax function. We seek candidates who are eager and able to learn new content quickly, who are willing to go into unfamiliar territory, and who possess ironclad judgment and integrity around confidential information. 
- Bachelor's degree in finance, accounting, business, economics, engineering, analytics, mathematics, statistics, or a related technical or quantitative field - 5+ years of experience writing SQL queries and creating business intelligence reports using Tableau or Power BI - 5+ years of Excel or Tableau experience (data manipulation, macros, charts, and pivot tables) - 2+ years of experience in tax, finance, or a related analytical field - MBA - 2+ years of product or program management, product marketing, business development, or technology experience
Posted 3 weeks ago
2.0 - 3.0 years
5 - 8 Lacs
Bengaluru
Work from Office
Pure Storage is seeking an early-career problem solver who will help design Pure's supply chain network of the future. In this high-impact role, you'll use advanced software tools to model and optimize our global supply chain network. You'll support the development of proposals, build models, run what-if scenarios, and help identify opportunities to enhance our supply chain structure. This is a great opportunity to grow your analytical and technical skills while influencing key decisions across the company. Responsibilities include: Act as an internal consultant in network design by partnering with cross-functional teams to analyze expansion opportunities. Develop supply chain models using industry-leading tools (e.g., Coupa Supply Chain Guru, AIMMS, or equivalent) for network, transportation, and routing optimization. Evaluate network scenarios to reduce after-tax Total Landed Costs, minimize touchpoints, and streamline operations. Cleanse and prepare data for network modeling efforts, ensuring high data quality and model accuracy. Assist in identifying supply chain risks and support development of mitigation strategies. Maintain the supply chain network design framework and support data governance activities. Support small to medium-sized projects from planning to execution, driving clarity around objectives, assumptions, and deliverables. Create and maintain project documentation, including intake forms, scenario definitions, and results presentations. Contribute to the creation of business cases for network design recommendations. What you'll bring to this role: Bachelor's degree in Supply Chain, Engineering, Analytics, Business, or a related field. 2-3 years of relevant experience in supply chain analytics, modeling, or optimization. Exposure to network design and modeling tools such as Coupa Supply Chain Guru, AIMMS, AnyLogistix, or similar. Familiarity with linear/integer optimization concepts and supply chain modeling fundamentals. Proficiency in Excel is a must-have; 
working knowledge of SQL or Python is a plus. Strong analytical and problem-solving skills with keen attention to detail. Comfortable working with large datasets and able to communicate findings clearly. Eagerness to learn and grow in a highly collaborative, fast-paced environment. Familiarity with ERP systems (SAP, NetSuite) and visualization tools like Tableau or Power BI is a plus. Occasional domestic and international travel may be required (approximately 5%). WHAT YOU CAN EXPECT FROM US: Pure Innovation: We celebrate those who think critically, like a challenge, and aspire to be trailblazers. Pure Growth: We give you the space and support to grow along with us and to contribute to something meaningful. We have been named one of Fortune's Best Large Workplaces in the Bay Area and Fortune's Best Workplaces for Millennials, and certified as a Great Place to Work! Pure Team: We build each other up and set aside ego for the greater good.
Posted 3 weeks ago
1.0 - 9.0 years
4 - 8 Lacs
Hyderabad
Work from Office
Career Category Information Systems Job Description ABOUT AMGEN Amgen harnesses the best of biology and technology to fight the world's toughest diseases and make people's lives easier, fuller, and longer. We discover, develop, manufacture, and deliver innovative medicines to help millions of patients. Amgen helped establish the biotechnology industry more than 40 years ago and remains on the cutting edge of innovation, using technology and human genetic data to push beyond what's known today. ABOUT THE ROLE Role Description: As part of the cybersecurity organization, in this vital role you will be responsible for designing, building, and maintaining data infrastructure to support data-driven decision-making. This role involves working with large datasets, developing reports, executing data governance initiatives, and ensuring data is accessible, reliable, and efficiently managed. The role sits at the intersection of data infrastructure and business insight delivery, requiring the Data Engineer to design and build robust data pipelines while also translating data into meaningful visualizations for stakeholders across the organization. The ideal candidate has strong technical skills, experience with big data technologies, and a deep understanding of data architecture, ETL processes, and cybersecurity data frameworks. Roles & Responsibilities: Design, develop, and maintain data solutions for data generation, collection, and processing. Be a key team member that assists in the design and development of the data pipeline. Build data pipelines and ensure data quality by implementing ETL processes to migrate and deploy data across systems. Develop and maintain interactive dashboards and reports using tools like Tableau, ensuring data accuracy and usability. Schedule and manage workflows to ensure pipelines run on schedule and are monitored for failures. Collaborate with multi-functional teams to understand data requirements and design solutions that meet business needs. 
Develop and maintain data models, data dictionaries, and other documentation to ensure data accuracy and consistency. Implement data security and privacy measures to protect sensitive data. Leverage cloud platforms (AWS preferred) to build scalable and efficient data solutions. Collaborate and communicate effectively with product teams. Collaborate with data scientists to develop pipelines that meet dynamic business needs. Share and discuss findings with team members practicing the SAFe Agile delivery model. What we expect of you We are all different, yet we all use our unique contributions to serve patients. The Data Engineer professional we seek is one with these qualifications. Basic Qualifications: Master's degree and 1 to 3 years of Computer Science, IT, or related field experience OR Bachelor's degree and 3 to 5 years of Computer Science, IT, or related field experience OR Diploma and 7 to 9 years of Computer Science, IT, or related field experience Preferred Qualifications: Hands-on experience with data practices, technologies, and platforms, such as Databricks, Python, GitLab, LucidChart, etc. Hands-on experience with data visualization and dashboarding tools (Tableau, Power BI, or similar) is a plus. Proficiency in data analysis tools (e.g., SQL) and experience with data sourcing tools. Excellent problem-solving skills and the ability to work with large, complex datasets. Understanding of data governance frameworks, tools, and best practices. Knowledge of and experience with data standards (FAIR) and protection regulations and compliance requirements (e.g., GDPR, CCPA). Good-to-Have Skills: Experience with ETL tools and various Python packages related to data processing and machine learning model development. Strong understanding of data modeling, data warehousing, and data integration concepts. Knowledge of Python/R, Databricks, and cloud data platforms. Experience working in a product team environment. Experience working in an Agile environment. Professional Certifications: AWS Certified Data Engineer preferred. Databricks certification preferred. Soft Skills: Initiative to explore alternate technologies and approaches to solving problems. Skilled in breaking down problems, documenting problem statements, and estimating efforts. Excellent analytical and troubleshooting skills. Strong verbal and written communication skills. Ability to work effectively with global, virtual teams. High degree of initiative and self-motivation. Ability to handle multiple priorities successfully. Team-oriented, with a focus on achieving team goals. EQUAL OPPORTUNITY STATEMENT Amgen is an Equal Opportunity employer and will consider you without regard to your race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or disability status. We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
Posted 3 weeks ago
7.0 - 14.0 years
10 - 15 Lacs
Bengaluru
Work from Office
Strategy Analyse business problems and help to arrive at technically advanced solutions. Proven ability to think out of the box, fostering innovation and automation. Proven ability to establish a strong team-player approach to problem solving. Strong foundational knowledge of algorithms, data structures, OOP concepts, and frameworks. Curious learner, willing to learn and adapt to new technologies and frameworks. Empowered mindset with the ability to ask questions and seek clarifications. Excellent communication skills that enable seamless interactions with colleagues globally. Strong technical skills, with exposure to coding in any next-gen tech. Awareness of Agile methodologies. Good technical skills, with exposure to: object-oriented programming, preferably Java; modern technologies like microservices and UI frameworks (Angular, React); applied maths and algorithms; AI/NLP/machine learning algorithms. Business Trade | Risk | Money Laundering People & Talent Lead a team of Developers/Senior Developers and guide them in activities such as development, testing, testing support, implementation, and post-implementation support. Risk Management Proactively manage risk and keep stakeholders informed. Key Responsibilities Regulatory & Business Conduct Display exemplary conduct and live by the Group's Values and Code of Conduct. Take personal responsibility for embedding the highest standards of ethics, including regulatory and business conduct, across Standard Chartered Bank. This includes understanding and ensuring compliance with, in letter and spirit, all applicable laws, regulations, guidelines, and the Group Code of Conduct. Effectively and collaboratively identify, escalate, mitigate, and resolve risk, conduct, and compliance matters. 
Key stakeholders Trade AML POC Business Trade Technology Other TTO stakeholders Other Responsibilities Adherence to Risk Data Quality Management requirements. Risk and Audit Continuous management of Trade Application System risk. Proactively identify issues and actions. Monitor and remediate issues and actions from audits. Awareness of regulatory requirements and ensuring these are catered for in the platform design. As part of the build-and-maintenance model, you will have to support Production as and when required. Embed Here for good and the Group's brand and values in the team; perform other responsibilities assigned under Group, Country, Business or Functional policies and procedures; multiple functions (double hats). Skills and Experience Microservices (OCP, Kubernetes) Hadoop Spark Scala Elastic Trade Risk AML Azure DevOps Traditional ETL pipelines and/or analytics pipelines Qualifications TRAINING Machine Learning/AI experience (optional) CERTIFICATIONS Quantexa LANGUAGES AWS EKS, Azure AKS Angular, Microservices (OCP, Kubernetes) Hadoop Spark Scala Elastic About Standard Chartered We're an international bank, nimble enough to act, big enough for impact. For more than 170 years, we've worked to make a positive difference for our clients, communities, and each other. We question the status quo, love a challenge, and enjoy finding new opportunities to grow and do better than before. If you're looking for a career with purpose and you want to work for a bank making a difference, we want to hear from you. You can count on us to celebrate your unique talents, and we can't wait to see the talents you can bring us. Our purpose, to drive commerce and prosperity through our unique diversity, together with our brand promise, to be here for good, are achieved by how we each live our valued behaviours. When you work with us, you'll see how we value difference and advocate inclusion. 
Together we: Do the right thing and are assertive, challenge one another, and live with integrity, while putting the client at the heart of what we do. Never settle, continuously striving to improve and innovate, keeping things simple and learning from doing well, and not so well. Are better together; we can be ourselves, be inclusive, see more good in others, and work collectively to build for the long term. What we offer In line with our Fair Pay Charter, we offer a competitive salary and benefits to support your mental, physical, financial, and social wellbeing. Core bank funding for retirement savings, medical and life insurance, with flexible and voluntary benefits available in some locations. Time off including annual leave, parental/maternity leave (20 weeks), sabbatical (12 months maximum), and volunteering leave (3 days), along with minimum global standards for annual and public holidays, which combine to a minimum of 30 days. Flexible working options based around home and office locations, with flexible working patterns. Proactive wellbeing support through Unmind, a market-leading digital wellbeing platform, development courses for resilience and other human skills, a global Employee Assistance Programme, sick leave, mental health first-aiders, and all sorts of self-help toolkits. A continuous learning culture to support your growth, with opportunities to reskill and upskill and access to physical, virtual, and digital learning. Being part of an inclusive and values-driven organisation, one that embraces and celebrates our unique diversity across our teams, business functions, and geographies, where everyone feels respected and can realise their full potential. www.sc.com/careers 30152
Posted 3 weeks ago
0.0 - 4.0 years
2 - 6 Lacs
Mumbai
Work from Office
FanCode is India's premier sports destination committed to giving fans a highly personalised experience across content and merchandise for a wide variety of sports. Founded by sports industry veterans Yannick Colaco and Prasana Krishnan in March 2019, FanCode has over 100 million users. It has partnered with domestic and international sports leagues and associations across multiple sports. In content, FanCode offers interactive live streaming for sports with industry-first subscription formats like Match, Bundle and Tour Passes, along with monthly and annual subscriptions at affordable prices. Through FanCode Shop, it also offers fans a wide range of sports merchandise for sporting teams, brands, and leagues across the world. Responsibilities Create, update, and maintain product listings on the Magento platform. Manage product listings across multiple e-commerce platforms like Amazon, Myntra, Flipkart, etc., ensuring accuracy, completeness, and consistency. Organize products into relevant categories and subcategories, ensuring a logical and user-friendly navigation structure. Implement and manage categorization and tagging systems for efficient catalog organization. Implement data quality standards to ensure consistency and accuracy across the product catalog. Work closely with cross-functional teams to address any discrepancies or missing information. Continuously optimize product listings for search engine visibility and conversion rate optimization. Work closely with the operations team to coordinate product availability and updates. Experience with e-commerce tools and software, such as Magento, Amazon Seller Central, Myntra Seller Hub, Flipkart Seller Hub, etc. Must-Haves 2+ years of experience in e-commerce catalog management, including Magento and leading online marketplaces such as Amazon, Myntra, Flipkart, etc. Experience optimizing product listings across multiple e-commerce platforms. 
In-depth knowledge of Magento administration and configuration, including product catalog setup, attribute management, category hierarchy, and product grouping. Proficiency in utilizing Amazon Seller Central, Myntra Seller Hub, Flipkart Seller Hub, or similar platforms to manage product listings, inventory, and orders. Ability to ensure accurate and comprehensive product information, including descriptions, images, prices, and specifications. Excellent communication and collaboration skills, with the ability to work effectively with cross-functional teams and external partners (vendors). Dream Sports is India's leading sports technology company with 250 million users, housing brands such as Dream11, the world's largest fantasy sports platform, FanCode, India's digital sports destination, and DreamSetGo, a sports experiences platform. Dream Sports is based in Mumbai and has a workforce of close to 1,000 Sportans. Founded in 2008 by Harsh Jain and Bhavit Sheth, Dream Sports' vision is to Make Sports Better for fans through the confluence of sports and technology. For more information: https://dreamsports.group/
Posted 3 weeks ago
5.0 - 9.0 years
7 - 11 Lacs
Noida
Work from Office
5-9 years in Data Engineering and software development, such as ELT/ETL, data extraction, and manipulation in a Data Lake/Data Warehouse environment. Expert-level hands-on experience with the following: Python, SQL PySpark DBT and Apache Airflow DevOps, Jenkins, CI/CD Data Governance and Data Quality frameworks Data Lakes, Data Warehouses AWS services including S3, SNS, SQS, Lambda, EMR, Glue, Athena, EC2, VPC, etc. Source code control - GitHub, VSTS, etc. Mandatory Competencies Python - Python Database - SQL Data on Cloud - AWS S3 DevOps - CI/CD DevOps - GitHub ETL - AWS Glue Beh - Communication At Iris Software, we offer world-class benefits designed to support the financial, health, and well-being needs of our associates to help achieve harmony between their professional and personal growth. From comprehensive health insurance and competitive salaries to flexible work arrangements and ongoing learning opportunities, we're committed to providing a supportive and rewarding work environment. Join us and experience the difference of working at a company that values its employees' success and happiness.
Posted 3 weeks ago
9.0 - 12.0 years
16 - 18 Lacs
Mumbai
Work from Office
Job Description: Essential Job Functions: Participate in data engineering tasks, including data processing and transformation. Assist in the development and maintenance of data pipelines and infrastructure. Collaborate with team members to support data collection and integration. Contribute to data quality and security efforts. Analyze data using data engineering tools and techniques. Collaborate with data engineers and analysts on data-related projects. Pursue opportunities to enhance data engineering skills and knowledge. Stay updated on data engineering trends and best practices. Basic Qualifications: Bachelor's degree in a relevant field or an equivalent combination of education and experience Typically, 4+ years of relevant work experience in industry, with a minimum of 1+ years in a similar role Proven experience in data engineering Proficiency in data engineering tools and technologies A continuous learner who stays abreast of industry knowledge and technology Other Qualifications (a plus): Advanced degree in a relevant field Relevant certifications, such as Certified Data Analyst or SAS Certified Big Data Professional Recruitment fraud is a scheme in which fictitious job opportunities are offered to job seekers, typically through online services such as false websites, or through unsolicited emails claiming to be from the company. These emails may request recipients to provide personal information or to make payments as part of the illegitimate recruiting process. DXC does not make offers of employment via social media networks, and DXC never asks for any money or payments from applicants at any point in the recruitment process, nor does it ask a job seeker to purchase IT or other equipment on its behalf. More information on employment scams is available here.
Posted 3 weeks ago
7.0 - 8.0 years
15 - 16 Lacs
Hyderabad
Work from Office
Some careers shine brighter than others. If you're looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support, and rewards that will take you further. HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions. We are currently seeking an experienced professional to join our team in the role of Consultant Specialist. In this role, you will: Design, develop, and optimize data pipelines using Azure Databricks, PySpark, and Prophecy. Implement and maintain ETL/ELT pipelines using Azure Data Factory (ADF) and Apache Airflow for orchestration. Develop and optimize complex SQL queries and Python-based data transformation logic. Work with version control systems (GitHub, Azure DevOps) to manage code and deployment processes. Automate deployment of data pipelines using CI/CD practices in Azure DevOps. Ensure data quality, security, and compliance with best practices. Monitor and troubleshoot performance issues in data pipelines. Collaborate with cross-functional teams to define data requirements and strategies. Requirements To be successful in this role, you should meet the following requirements: 5+ years of experience in data engineering, working with Azure Databricks, PySpark, and SQL. Hands-on experience with Prophecy for data pipeline development. Proficiency in Python for data processing and transformation. Experience with Apache Airflow for workflow orchestration. Strong expertise in Azure Data Factory (ADF) for building and managing ETL processes. Familiarity with GitHub and Azure DevOps for version control and CI/CD automation. 
Solid understanding of data modelling, warehousing, and performance optimization. Ability to work in an agile environment and manage multiple priorities effectively. Excellent problem-solving skills and attention to detail. Experience with Delta Lake and Lakehouse architecture. Hands-on experience with Terraform or Infrastructure as Code (IaC). Understanding of machine learning workflows in a data engineering context.
Posted 3 weeks ago
0.0 - 8.0 years
12 - 13 Lacs
Pune, Chennai
Work from Office
Join us as a Data Governance Analyst at Barclays, where you will be responsible for supporting the successful delivery of location strategy projects to plan, budget, and agreed quality and governance standards. You'll spearhead the evolution of our digital landscape, driving innovation and excellence. You will harness cutting-edge technology to revolutionise our digital offerings, ensuring unparalleled customer experiences. To be successful as a Data Governance Analyst you should have experience with: Data and records governance, data controls, data lineage, and associated methodologies. Experience in data products, cloud, and data warehouses. Business domain (Retail or Banking) and regulatory reporting experience. Working in a regulated environment and a solid understanding of data and control risk management. Some other highly valued skills may include: Understanding of different technologies around the execution of data controls. Ability to proactively drive change. Exceptional stakeholder management skills to maintain collaborative working relationships with key senior stakeholders. Experience of working in multiple large teams delivering complex services involving the highest standards of resilience, risk, and governance controls. Proficiency in data analytics and insight generation to derive actionable insights from data. You may be assessed on key critical skills relevant for success in the role, such as risk and controls, change and transformation, business acumen, strategic thinking, and digital and technology, as well as job-specific technical skills. This role is based out of Chennai. Purpose of the role To develop, implement, and maintain effective governance frameworks for all data and records across the bank's global operations. Accountabilities Development and maintenance of a comprehensive data and records governance framework aligned with regulatory requirements and industry standards. 
Monitoring data quality and records metrics and compliance with standards across the organization. Identification and addressing of data and records management risks and gaps. Development and implementation of a records management programme that ensures the proper identification, classification, storage, retention, retrieval, and disposal of records. Development and implementation of a data governance strategy that aligns with the bank's overall data management strategy and business objectives. Provision of Group-wide guidance and training on Data and Records Management standard requirements. Assistant Vice President Expectations To advise and influence decision making, contribute to policy development, and take responsibility for operational effectiveness. Collaborate closely with other functions/business divisions. Lead a team performing complex tasks, using well-developed professional knowledge and skills to deliver work that impacts the whole business function. Set objectives and coach employees in pursuit of those objectives, appraise performance relative to objectives, and determine reward outcomes. If the position has leadership responsibilities, People Leaders are expected to demonstrate a clear set of leadership behaviours to create an environment for colleagues to thrive and deliver to a consistently excellent standard. The four LEAD behaviours are: L - Listen and be authentic, E - Energise and inspire, A - Align across the enterprise, D - Develop others. Alternatively, as an individual contributor, they will lead collaborative assignments and guide team members through structured assignments, identifying the need for the inclusion of other areas of specialisation to complete assignments. They will identify new directions for assignments and/or projects, identifying a combination of cross-functional methodologies or practices to meet required outcomes. 
Consult on complex issues, providing advice to People Leaders to support the resolution of escalated issues. Identify ways to mitigate risk and develop new policies/procedures in support of the control and governance agenda. Take ownership for managing risk and strengthening controls in relation to the work done. Perform work that is closely related to that of other areas, which requires understanding of how areas coordinate and contribute to the achievement of the objectives of the organisation sub-function. Collaborate with other areas of work, and with business-aligned support areas, to keep up to speed with business activity and the business strategy. Engage in complex analysis of data from multiple internal and external sources of information, such as procedures and practices (in other areas, teams, companies, etc.), to solve problems creatively and effectively. Communicate complex information. Complex information could include sensitive information or information that is difficult to communicate because of its content or its audience. Influence or convince stakeholders to achieve outcomes.
Posted 3 weeks ago
0.0 - 8.0 years
12 - 13 Lacs
Noida
Work from Office
Embark on a transformative journey as a Data Strategy Analyst at Barclays, where you'll spearhead the evolution of our digital landscape, driving innovation and excellence. You'll harness cutting-edge technology to revolutionise our digital offerings, ensuring unparalleled customer experiences.

The Data Strategy Team within Credit and Data Analytics (CDA) is running a long-term programme to migrate data and analysis from on-premise tools and platforms (SAS, Oracle, etc.) to AWS, with world-class analysis using more modern tools, platforms and data (AWS, Databricks, Git, Python, etc.). As part of this journey, Data Strategy works closely with both business units and Tech resources to craft the narrative of what the migration will look like, and then validates that it was done successfully.

To be successful in this role as a Data Strategy Analyst, you should possess the following skill sets:

1. Technical skills consistent with performing the following functions: use Python and various packages to explore data within the AWS/Athena environment; read and potentially convert SAS scripts to Python - recognise data usage in SAS and migrate the data steps into Python/PySpark for analysis; manage code and processes with version-control platforms like Git, Bitbucket and potentially GitHub.

2. Communication skills as both the receiver of requests and the provider of results. You must be able to take direction on a request and translate it into an analytical approach that drives the right results, and compile results for business teams in a meaningful manner that delivers value and insight into whether the data migration was successful.

3. Data quality concepts: understand what makes data valid and how to assess it; provide insight to Tech for standardised validation of data transformation; collaborate with business teams to understand what good data means to them and translate this into requirements.
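The migration-validation work described above can be sketched in plain Python. This is a minimal, hedged illustration of the kind of reconciliation check used to confirm that a migrated table matches its source - it assumes nothing about the actual AWS/Athena setup; tables are represented as lists of dicts, and all names and data are illustrative:

```python
from collections import Counter

def reconcile(source_rows, target_rows, key):
    """Compare a source extract against its migrated copy.

    Returns simple reconciliation metrics: row counts, keys present
    on only one side, and duplicated keys in the target.
    """
    src_keys = Counter(r[key] for r in source_rows)
    tgt_keys = Counter(r[key] for r in target_rows)
    return {
        "source_count": len(source_rows),
        "target_count": len(target_rows),
        "missing_in_target": sorted(set(src_keys) - set(tgt_keys)),
        "unexpected_in_target": sorted(set(tgt_keys) - set(src_keys)),
        "duplicate_keys": sorted(k for k, n in tgt_keys.items() if n > 1),
    }

# Illustrative data: three source rows, one dropped during migration.
source = [{"id": 1, "bal": 100}, {"id": 2, "bal": 250}, {"id": 3, "bal": 75}]
target = [{"id": 1, "bal": 100}, {"id": 2, "bal": 250}]

report = reconcile(source, target, key="id")
```

In practice the same shape of check would run as SQL or PySpark against both environments; the point is that migration sign-off rests on explicit, repeatable metrics rather than spot checks.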
Some other highly valued skills include team collaboration - the Data Strategy team is highly collaborative, and each member provides input and insight for weekly meetings, monthly business reviews and other product/process-sharing endeavours.

You will also be assessed on the key critical skills relevant for success in the role, such as risk and controls, change and transformation, business acumen, strategic thinking, and digital and technology, as well as job-specific technical skills. This role is based in our Noida office.

Purpose of the role: to use innovative data analytics and machine learning techniques to extract valuable insights from the bank's data reserves, leveraging these insights to inform strategic decision-making, improve operational efficiency, and drive innovation across the organisation.

Accountabilities: identification, collection and extraction of data from various internal and external sources; performing data cleaning, wrangling and transformation to ensure its quality and suitability for analysis; development and maintenance of efficient data pipelines for automated data acquisition and processing; design and conduct of statistical and machine learning models to analyse patterns, trends and relationships in the data; development and implementation of predictive models to forecast future outcomes and identify potential risks and opportunities; collaboration with business stakeholders to seek out opportunities to add value from data through Data Science.

Analyst Expectations: to perform prescribed activities in a timely manner and to a high standard, consistently driving continuous improvement. The role requires in-depth technical knowledge and experience in the assigned area of expertise, and a thorough understanding of the underlying principles and concepts within that area. Analysts may lead and supervise a team, guiding and supporting professional development, allocating work requirements and coordinating team resources.
If the position has leadership responsibilities, People Leaders are expected to demonstrate a clear set of leadership behaviours to create an environment for colleagues to thrive and deliver to a consistently excellent standard. The four LEAD behaviours are: L - Listen and be authentic, E - Energise and inspire, A - Align across the enterprise, D - Develop others. For an individual contributor, they develop technical expertise in the work area, acting as an advisor where appropriate, and will have an impact on the work of related teams within the area.

Partner with other functions and business areas. Take responsibility for the end results of a team's operational processing and activities. Escalate breaches of policies/procedures appropriately. Take responsibility for embedding new policies/procedures adopted due to risk mitigation. Advise and influence decision-making within your own area of expertise. Take ownership for managing risk and strengthening controls in relation to the work you own or contribute to. Deliver your work and areas of responsibility in line with relevant rules, regulations and codes of conduct.

Maintain and continually build an understanding of how your own sub-function integrates with the function, alongside knowledge of the organisation's products, services and processes within the function. Demonstrate understanding of how areas coordinate and contribute to the achievement of the objectives of the organisation sub-function. Make evaluative judgements based on the analysis of factual information, paying attention to detail. Resolve problems by identifying and selecting solutions through the application of acquired technical experience, guided by precedents. Guide and persuade team members and communicate complex/sensitive information. Act as a contact point for stakeholders outside of the immediate function, while building a network of contacts outside the team and external to the organisation.
Posted 3 weeks ago
5.0 - 7.0 years
6 - 10 Lacs
Hyderabad, Pune, Bengaluru
Work from Office
5+ years of experience in data modeling - designing, implementing, and maintaining data models to support data quality, performance and scalability. Proven experience as a Data Modeler, working with data analysts, data architects and business stakeholders to ensure data models are aligned with business requirements. Expertise in data modeling tools (ER/Studio, Erwin), IDM/ARDM models, and CPG/Manufacturing/Sales/Finance/Supplier/Customer domains. Experience with at least one MPP database technology such as Databricks Lakehouse, Redshift, Synapse, Teradata, or Snowflake. Experience with version-control systems like GitHub and deployment and CI tools. Experience with metadata management, data lineage, and data glossaries is a plus. Working knowledge of agile development, including DevOps and DataOps concepts. Working knowledge of SAP data models, particularly in the context of HANA and S/4HANA, and retail data like IRI and Nielsen. Must have: data modeling, data modeling tool experience, SQL. Nice to have: SAP HANA, Data Warehouse, Databricks, CPG.
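The dimensional modeling skills this posting asks for can be made concrete with a small example. Below is a hedged sketch of a star schema - one fact table surrounded by a dimension - built with Python's stdlib `sqlite3` so it runs anywhere; the table and column names are illustrative, not tied to any particular employer's model:

```python
import sqlite3

# A minimal star schema: one dimension table, one fact table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_product (
        product_key INTEGER PRIMARY KEY,
        sku         TEXT NOT NULL,
        category    TEXT NOT NULL
    );
    CREATE TABLE fact_sales (
        date_key    INTEGER NOT NULL,
        product_key INTEGER NOT NULL REFERENCES dim_product(product_key),
        units       INTEGER NOT NULL,
        revenue     REAL    NOT NULL
    );
""")
conn.execute("INSERT INTO dim_product VALUES (1, 'SKU-001', 'Beverages')")
conn.executemany(
    "INSERT INTO fact_sales VALUES (?, ?, ?, ?)",
    [(20240101, 1, 10, 50.0), (20240102, 1, 4, 20.0)],
)

# The typical dimensional query: aggregate facts by a dimension attribute.
row = conn.execute("""
    SELECT d.category, SUM(f.units), SUM(f.revenue)
    FROM fact_sales f JOIN dim_product d USING (product_key)
    GROUP BY d.category
""").fetchone()
```

The same shape scales up directly to the MPP platforms listed (Databricks, Redshift, Synapse, Teradata, Snowflake): narrow integer surrogate keys in the fact table, descriptive attributes in the dimensions, and analytics expressed as join-then-aggregate queries.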
Posted 3 weeks ago
1.0 - 4.0 years
10 - 14 Lacs
Pune
Work from Office
Overview: Design, develop, and maintain data pipelines and ETL/ELT processes using PySpark, Databricks, BigQuery, and Airflow/Composer. Optimize performance for large datasets through techniques such as partitioning, indexing, and Spark optimization. Collaborate with cross-functional teams to resolve technical issues and gather requirements.

Responsibilities: Ensure data quality and integrity through data validation and cleansing processes. Analyze existing SQL queries, functions, and stored procedures for performance improvements. Develop database routines such as procedures, functions, and views/materialized views. Participate in data migration projects and understand technologies like Delta Lake, data warehouses, and BigQuery. Debug and solve complex problems in data pipelines and processes.

Qualifications: Bachelor's degree in computer science, engineering, or a related field. Strong understanding of distributed data processing platforms like Databricks and BigQuery. Proficiency in Python, PySpark, and SQL. Experience with performance optimization for large datasets. Strong debugging and problem-solving skills. Fundamental knowledge of cloud services, preferably Azure or GCP. Excellent communication and teamwork skills. Nice to have: experience in data migration projects and an understanding of technologies like Delta Lake and data warehouses.

What we offer you: Transparent compensation schemes and comprehensive employee benefits, tailored to your location, ensuring your financial security, health, and overall wellbeing. Flexible working arrangements, advanced technology, and collaborative workspaces. A culture of high performance and innovation where we experiment with new ideas and take responsibility for achieving results. A global network of talented colleagues who inspire, support, and share their expertise to innovate and deliver for our clients.
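The partitioning technique named in this posting can be illustrated without a Spark cluster. The sketch below is a pure-Python, hedged illustration of hash partitioning - the idea underlying Spark's key-based repartitioning - with all names and data invented for the example:

```python
def hash_partition(rows, key, num_partitions):
    """Assign each row to a partition by hashing its key column.

    Rows sharing a key always land in the same partition, so per-key
    work (joins, aggregations) needs no further cross-partition
    shuffling - the property partitioned pipelines rely on.
    """
    partitions = [[] for _ in range(num_partitions)]
    for row in rows:
        partitions[hash(row[key]) % num_partitions].append(row)
    return partitions

# Illustrative data: payment rows keyed by user.
rows = [{"user": u, "amount": a}
        for u, a in [("alice", 10), ("bob", 5), ("alice", 7), ("carol", 3)]]
parts = hash_partition(rows, key="user", num_partitions=4)
```

In a real engine the same principle is applied at scale, with the added concerns this role covers: choosing a partition key with enough cardinality to avoid skew, and tuning the partition count against cluster parallelism.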
Global Orientation program to kickstart your journey, followed by access to our Learning@MSCI platform, LinkedIn Learning Pro and tailored learning opportunities for ongoing skills development. Multi-directional career paths that offer professional growth and development through new challenges, internal mobility and expanded roles. We actively nurture an environment that builds a sense of inclusion, belonging and connection, including eight Employee Resource Groups: All Abilities, Asian Support Network, Black Leadership Network, Climate Action Network, Hola! MSCI, Pride & Allies, Women in Tech, and Women's Leadership Forum.

At MSCI we are passionate about what we do, and we are inspired by our purpose - to power better investment decisions. You'll be part of an industry-leading network of creative, curious, and entrepreneurial pioneers. This is a space where you can challenge yourself, set new standards and perform beyond expectations for yourself, our clients, and our industry. MSCI is a leading provider of critical decision support tools and services for the global investment community. With over 50 years of expertise in research, data, and technology, we power better investment decisions by enabling clients to understand and analyze key drivers of risk and return and confidently build more effective portfolios. We create industry-leading research-enhanced solutions that clients use to gain insight into and improve transparency across the investment process.

MSCI Inc. is an equal opportunity employer. It is the policy of the firm to ensure equal employment opportunity without discrimination or harassment on the basis of race, color, religion, creed, age, sex, gender, gender identity, sexual orientation, national origin, citizenship, disability, marital and civil partnership/union status, pregnancy (including unlawful discrimination on the basis of a legally protected parental leave), veteran status, or any other characteristic protected by law.
MSCI is also committed to working with and providing reasonable accommodations to individuals with disabilities. If you are an individual with a disability and would like to request a reasonable accommodation for any part of the application process, please email Disability.Assistance@msci.com and indicate the specifics of the assistance needed. Please note, this e-mail is intended only for individuals who are requesting a reasonable workplace accommodation; it is not intended for other inquiries.

To all recruitment agencies: MSCI does not accept unsolicited CVs/resumes. Please do not forward CVs/resumes to any MSCI employee, location, or website. MSCI is not responsible for any fees related to unsolicited CVs/resumes.

Note on recruitment scams: We are aware of recruitment scams where fraudsters impersonating MSCI personnel may try to elicit personal information from job seekers. Read our full note on careers.msci.com
Posted 3 weeks ago
8.0 - 13.0 years
30 - 45 Lacs
Bengaluru
Hybrid
Job Title: Enterprise Data Architect | Immediate Joiner. Experience: 8-15 years. Location: Bengaluru (Onsite/Hybrid). Joining Time: Immediate joiners only (0-15 days).

Job Description: We are looking for an experienced Enterprise Data Architect to join our dynamic team in Bengaluru. This is an exciting opportunity to shape modern data architecture across the finance and colleague (HR) domains using the latest technologies and design patterns.

Key Responsibilities: Design and implement conceptual and logical data models for the finance and colleague domains. Define complex as-is and to-be data architectures, including transition states. Develop and maintain data standards, principles, and architecture artifacts. Build scalable solutions using data lakes, data warehouses, and data governance platforms. Ensure data lineage, quality, and consistency across platforms. Translate business requirements into technical solutions for data acquisition, storage, transformation, and governance. Collaborate with cross-functional teams on data solution design and delivery.

Required Skills: Strong communication and stakeholder engagement. Hands-on experience with Kimball dimensional modeling and/or snowflake modeling. Expertise in modern cloud data platforms and architecture (AWS, Azure, or GCP). Proficiency in building solutions for web, mobile, and tablet platforms. A background in Finance and/or Colleague Technology (HR systems) is a strong plus.

Preferred Qualifications: Bachelor's/Master's degree in Computer Science, Engineering, or a related field. 8-15 years of experience in data architecture and solution design.
Important Notes: Immediate joiners only (notice period of at most 15 days). Do not apply if you've recently applied or are currently in the Xebia interview process. Location: Bengaluru - candidates must be based there or open to relocating immediately.

To apply, send your updated resume to vijay.s@xebia.com with the following details: Full Name; Total Experience; Current CTC; Expected CTC; Current Location; Preferred Location; Notice Period / Last Working Day (if serving notice); Primary Skill Set; LinkedIn URL.

Apply now and be part of our exciting transformation journey at Xebia!
Posted 3 weeks ago
3.0 - 8.0 years
20 - 35 Lacs
Pune
Work from Office
Who we are: Founded in 2015, Fresh Gravity helps businesses make data-driven decisions. We are driven by data and its potential as an asset to drive business growth and efficiency. Our consultants are passionate innovators who solve clients' business problems by applying best-in-class data and analytics solutions. We provide a range of consulting and systems integration services and solutions to our clients in the areas of Data Management, Analytics and Machine Learning, and Artificial Intelligence. In the last 10 years, we have put together an exceptional team and have delivered 200+ projects for over 80 clients ranging from startups to several Fortune 500 companies. We are on a mission to solve some of the most complex business problems for our clients using some of the most exciting new technologies, providing the best of learning opportunities for our team.

We are focused and intentional about building a strong corporate culture in which individuals feel valued, supported, and cared for. We foster an environment where creativity thrives, paving the way for groundbreaking solutions and personal growth. Our open, collaborative, and empowering work culture is the main reason for our growth and success. To know more about our culture and employee benefits, visit our website https://www.freshgravity.com/employee-benefits/. We promise rich opportunities for you to succeed, to shine, to exceed even your own expectations. We are data driven. We are passionate. We are innovators. We are Fresh Gravity.

What You'll Do: Configure the data model, hierarchies, and other relationships. Troubleshoot technical issues, resolve conflicts, and demonstrate problem-solving skills. Monitor, analyze, and participate in business requirement reviews, data loading, data modeling, etc. Bring a strong background in API development, preferably within the Informatica ecosystem. Utilize CDI (Cloud Data Integration) tools for effective data integration within the cloud environment.
Implement CAI (Cloud Application Integration) for seamless connectivity across applications. Deploy DQ (Data Quality) tools and methodologies to ensure high-quality data across systems. Design and implement efficient APIs for diverse integration requirements. Participate in requirement gathering, analyze Business Requirement Documents, and work with system analysts to create source-to-target documents. Analyse data in source databases before loading it into the data warehouse and create technical documents according to the BRD.

What We're Looking For: Hands-on experience in client projects. Hands-on experience in Informatica MDM, which includes, but is not limited to, advanced match/merge configuration, AVOS, SIF, E360, IDD, etc.; OR hands-on experience as an Informatica IDMC SaaS developer (Informatica Intelligent Cloud Services (IICS) and Intelligent Data Management Cloud (IDMC)). Experience with enterprise Java. Nice to have: experience in AWS services like S3, SQS, etc.

Why us: In addition to a competitive package, we promise rich opportunities for you to succeed, to shine, to exceed even your own expectations. In keeping with Fresh Gravity's challenger ethos, we have developed the 5 Dimensions (5D) benefits program. This program recognizes the multiple dimensions within each of us and seeks to provide opportunities for deep development across these dimensions: Enrich Myself; Enhance My Client; Build My Company; Nurture My Family; and Better Humanity.
Posted 3 weeks ago
12.0 - 17.0 years
25 - 30 Lacs
Gurugram
Work from Office
We are currently seeking an experienced professional to join our team in the role of Finance Management Assistant Vice President. Business: Finance.

Principal responsibilities: This role is part of the Stress Testing Data Operations vertical. The key responsibilities include: Perform data plausibility review/analysis of all the metrics/dimensions in each tab of the PRA reports to raise review-and-challenge queries, work with CPOs to get explanations, get corrections processed, and enhance explanations in the Basis of Preparation. Work with stakeholders to log new data quality issues on Aurora and track resolution of new and existing data quality issues. Drive/lead enhancement of existing and new data quality checks via Python/QlikSense. Perform change assessments as and when the Regulator communicates changes to existing requirements or releases new ones, working closely with all stakeholders to agree ownership, data sourcing, design and system enhancements, and ensuring the requirements are set up ahead of the production cycle. Create/update instructions, governance documents and the operating model, and provide status updates in various Working Group forums on quarterly/annual submissions. Demonstrate the completeness of the data plausibility review to the Risk Owner to obtain their sign-off. Provide instructions and best-practice guidance to regional and global business peers. Support the Data Lead in the execution and delivery of various ad hoc requirements from the Regulator. Support Quarterly Actuals, Tzero, Projections and GIST deliverables on an ongoing basis.
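The Python-based data quality checks mentioned above can be sketched simply. This is a hedged illustration of a plausibility check that flags metrics whose quarter-on-quarter movement exceeds a tolerance; the metric names, values, and threshold are all invented for the example, not regulatory figures:

```python
def plausibility_flags(current, prior, tolerance=0.10):
    """Flag metrics whose quarter-on-quarter move exceeds a tolerance.

    `current` and `prior` map metric name -> value; `tolerance` is a
    relative-change threshold (10% here, purely illustrative). Flagged
    metrics would then go to content owners for review and challenge.
    """
    flags = []
    for metric, value in current.items():
        base = prior.get(metric)
        if base in (None, 0):
            flags.append((metric, "no prior baseline"))
            continue
        change = abs(value - base) / abs(base)
        if change > tolerance:
            flags.append((metric, f"moved {change:.0%} vs prior quarter"))
    return flags

# Illustrative submission data: one metric moves 25%, the other barely.
current = {"rwa": 125.0, "cet1_ratio": 13.1}
prior = {"rwa": 100.0, "cet1_ratio": 13.0}
flags = plausibility_flags(current, prior)
```

The value of automating checks like this is consistency: every tab of a report gets the same scrutiny each cycle, and analyst time is spent explaining flagged movements rather than finding them.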
Requirements: Strong financial accounting experience, in particular financial consolidation. Knowledge of regulatory reporting requirements pertaining to stress testing; knowledge of and experience in Basel III stress testing reporting preferred. Knowledge of financial reporting would be an advantage, in particular the differences between a financial and a regulatory basis of consolidation. Previous experience in a reporting role is essential. Preferably a qualified accountant. Self-motivated and capable of working as part of a team. Good mathematical and analytical skills. Ability to work under pressure, report to tight deadlines and deal effectively with issues as they arise. Proven ability to develop and communicate effective arguments confidently. Strong communication and interpersonal skills.
Posted 3 weeks ago