1.0 years
0 Lacs
Gurgaon, Haryana, India
On-site
Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.

Primary Responsibilities
- Apply software engineering (the systematic design, development, implementation, testing and maintenance of software) across all primary development activity and technology functions, so that we deliver high-quality code for our applications, products and services, understand customer needs, and develop product roadmaps
- Perform analysis, design, coding, engineering, testing, debugging, standards and methods work, tools analysis, documentation, research and development, maintenance, new development, operations and delivery; every position carries a requirement to build quality into every output
- Evaluate new tools, techniques and strategies; automate common tasks and build common utilities to drive organizational efficiency, with a passion for technology and solutions and thought leadership on future capabilities and opportunities to apply technology in new and innovative ways
- Follow a basic, structured, standard approach to work
- Build and manage data engineering pipelines in Azure and Airflow
- Develop and manage business logic in SQL and Python
- Monitor ETL processes and recommend the best possible solution approaches to fix glitches and data issues
- Work as a self-driven individual contributor
- Work closely with Business/SA/BA/Data teams to ensure requirements are well documented and clear
- Support the team on day-to-day tasks to achieve business goals, and be accountable for timely, quality delivery of assigned teams/modules
- Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies regarding flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so

Required Qualifications
- Graduate degree or equivalent experience
- 1+ years of experience in SQL, Python, an ETL tool (Airflow), databases, Snowflake and Azure
- Hands-on experience building data pipelines in Airflow and Python
- Hands-on experience writing SQL, with a good understanding of SQL concepts: joins, group functions, analytic functions, etc.
- Working experience on projects related to databases, ETL, data analysis and reporting applications
- Solid fundamental knowledge of key database concepts, data warehousing, data models, etc.
- Proven excellent verbal and written communication skills

At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone, of every race, gender, sexuality, age, location and income, deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health, which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes, an enterprise priority reflected in our mission.
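The SQL concepts the qualifications call out (joins, group functions, analytic functions) can be illustrated with a minimal, self-contained sketch using Python's built-in sqlite3 module; the tables and column names here are hypothetical, not part of any posting:

```python
import sqlite3

# In-memory database with two hypothetical tables to join.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, amount REAL);
    INSERT INTO customers VALUES (1, 'Asha'), (2, 'Ravi');
    INSERT INTO orders VALUES (1, 1, 100.0), (2, 1, 50.0), (3, 2, 75.0);
""")

# Join + group function: total order amount per customer.
totals = conn.execute("""
    SELECT c.name, SUM(o.amount) AS total
    FROM customers c
    JOIN orders o ON o.customer_id = c.id
    GROUP BY c.name
    ORDER BY c.name
""").fetchall()

# Analytic (window) function: rank each order within its customer by amount.
# Window functions need SQLite 3.25+, bundled with all recent Pythons.
ranked = conn.execute("""
    SELECT customer_id, amount,
           RANK() OVER (PARTITION BY customer_id ORDER BY amount DESC) AS rnk
    FROM orders
""").fetchall()
```

Here `totals` comes back as `[('Asha', 150.0), ('Ravi', 75.0)]`, and `ranked` assigns rank 1 to the largest order per customer; the same three constructs (join, GROUP BY aggregate, OVER/PARTITION BY window) carry over directly to Snowflake and Azure SQL.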
Posted 1 day ago
3.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Welcome to Veradigm! Our Mission is to be the most trusted provider of innovative solutions that empower all stakeholders across the healthcare continuum to deliver world-class outcomes. Our Vision is a Connected Community of Health that spans continents and borders. With the largest community of clients in healthcare, Veradigm is able to deliver an integrated platform of clinical, financial, connectivity and information solutions to facilitate enhanced collaboration and exchange of critical patient information.

We are an Equal Opportunity Employer. No job applicant or employee shall receive less favorable treatment or be disadvantaged because of their gender, marital or family status, color, race, ethnic origin, religion, disability or age; nor be subject to less favorable treatment or be disadvantaged on any other basis prohibited by applicable law. For more information, please explore Veradigm.com.

What Will Your Job Look Like
This position reports to the Claims Manager and is responsible for the daily preparation of electronic claims processing, manual claim form processing, electronic transmission error corrections, patient statement processing, new client electronic claims enrollment authorization, and client software training.

Main Duties
- Daily transmission of electronic claims, either direct to the payer or via the clearinghouse
- Processing of HCFA 1500 claim forms
- Setup of payor EDI numbers in PCN
- Enrollment of new clients to allow electronic data interchange/claims submission. The following payers currently require the client to submit an application and receive authorization prior to sending electronic claims: Medicare, Medi-Cal, Blue Cross, Blue Shield, Champus, Medicare RR and DMERC
- Transmission of patient statement files twice monthly on alternating Tuesdays
- Returned mail correction: the patient receives one call requesting an updated mailing address; if the patient does not respond, the balance is either adjusted off or transferred to a collection agency. Client-specific small-balance minimum policies reside within the policies and procedures folder on the shared drive
- Achieve goals set forth by the supervisor regarding error-free work, transactions, processes and compliance requirements
- Other duties as assigned

Essential functions may include:
- Performs initial review of the database after wizard HL7 automation is complete
- Collaborates with the RCMS team to troubleshoot system configuration issues
- Handles system dictionary table modifications and documentation
- Monitors BAM daily exceptions
- Addresses updates to registration based on USPS exceptions that do not update electronically
- Processes paper claims, supporting the onboarding monitoring and tracking of progress
- Acts as a liaison between teams to ensure timely implementation on the activation date
- Ensures SharePoint PF Onboarding tasks are updated daily
- Identifies and reports risks to the project
- Must be highly organized and self-motivated, with strong critical thinking skills and the ability to apply logic to technical claim problems
- Proficient in Excel and Microsoft products
- Prior experience with Allscripts PM is a plus
- Strong communication skills; clearinghouse experience preferred

Academic Qualifications
- High School Diploma or GED (required)
- 3+ years of experience in a relevant field

An Ideal Candidate Will Have
- Technical: extensive knowledge of email, search engines and the Internet; ability to effectively use payer websites and Laserfiche; knowledge and use of Microsoft products: Outlook, Word, Excel. Preferred experience with various billing systems, such as NextGen, Pro and Allscripts
- Personal: strong written, oral and interpersonal communication skills; ability to present ideas in business-friendly and user-friendly language; highly self-motivated, self-directed and attentive to detail; team-oriented and collaborative; ability to effectively prioritize and execute tasks in a high-pressure environment
- Communication: ability to read, analyze and interpret complex documents; respond effectively to sensitive inquiries or complaints from employees and clients; speak clearly and make effective and persuasive arguments and presentations
- Math & Reasoning: ability to add, subtract, multiply and divide in all units of measure, using whole numbers, common fractions and decimals; ability to use critical thinking skills to apply principles of logic and analytical thinking to practical problems

Work Arrangements: Work from the Pune office all 5 days. Shift timing: 7:30 PM IST to 4:30 AM IST (US shift)

Benefits
Veradigm believes in empowering our associates with the tools and flexibility to bring the best version of themselves to work. Through our generous benefits package with an emphasis on work/life balance, we give our employees the opportunity to allow their careers to flourish.
- Quarterly company-wide Recharge Days
- Peer-based incentive "Cheer" awards
- "All in to Win" bonus program
- Tuition Reimbursement Program

To know more about the benefits and culture at Veradigm, please visit the links below:
https://veradigm.com/about-veradigm/careers/benefits/
https://veradigm.com/about-veradigm/careers/culture/

Veradigm is proud to be an equal opportunity workplace dedicated to pursuing and hiring a diverse and inclusive workforce. Thank you for reviewing this opportunity! Does this look like a great match for your skill set? If so, please scroll down and tell us more about yourself!
Posted 1 day ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Business or Technical Solution Analyst

Primary Skills
- Advanced SQL: ability to write and read long, complex queries
- Experience as a Business or Technical Analyst: driving conclusions from analysis, data diving, and technical or business documentation
- Experience collecting, writing, and/or providing requirements
- Experience executing and/or designing user acceptance tests
- Exposure to regulatory processes
- Finance and/or banking background
- Good understanding of data flow, data models and database applications

Secondary Skills
- Conceptual knowledge of ETL and data warehousing; working knowledge is an added advantage
- Basic knowledge of Java is an added advantage
Posted 1 day ago
3.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
P1 C2 STS

Bachelor's degree in Computer Science, Computer Engineering, MIS or a technically related field, or equivalent

Must have:
- At least 3 years of development experience, of which at least 1 year should be in the capacity of a Senior Engineer
- Knowledge of one or more relevant programming languages, typically C, C++, Perl, Java, SQL, XML
- Telecom experience
- Broad understanding of the UNIX operating environment
- Experience in Singleview
- Broad understanding of end-to-end real-time integration

Also valued:
- Customer Management domain knowledge, e.g., PeopleSoft
- Knowledge of multi-tiered, relational database architectures with UNIX, Oracle and Windows as the primary operating environments
- Knowledge of middleware technologies, e.g., Tuxedo
- Awareness of industry standards such as TMForum
- Awareness of Agile software development methodologies such as SAFe or Kanban
- Self-motivated individual who works well in a team environment
Posted 1 day ago
0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Calling all innovators – find your future at Fiserv. We're Fiserv, a global leader in fintech and payments, and we move money and information in a way that moves the world. We connect financial institutions, corporations, merchants, and consumers to one another millions of times a day – quickly, reliably, and securely. Any time you swipe your credit card, pay through a mobile app, or withdraw money from the bank, we're involved. If you want to make an impact on a global scale, come make a difference at Fiserv.

Job Title: Specialist, Client Tech Support Engineering

Position Summary
We are seeking skilled C# .NET engineers to join our dynamic application support team. This is a non-development role; the ideal candidate should have solid experience analyzing and troubleshooting robust, scalable Windows applications built on the .NET framework, along with strong analytical and problem-solving skills.

Key Responsibilities
- Maintenance and Support: Maintain existing software systems by identifying and resolving software defects and by analyzing and troubleshooting client tickets for Windows applications built with the .NET framework (C#, .NET, Angular 2+, REST, Windows Forms). Must have a good understanding of application logging, event logs, traces, etc.
- Database Management: Develop and write basic SQL queries, and review stored procedures and database structures. Experience with Oracle databases is essential.
- Integration: Understand integrated software solutions by analyzing and designing system requirements, standards, and specifications.
- Collaboration: Work closely with project managers, business analysts, and other developers to deliver high-quality software solutions.

Technical Skills
- Proficiency in C#, .NET, Angular 2+, REST, Windows Forms, WPF and WCF
- Experience with front-end technologies such as JavaScript, HTML5, CSS3, and Angular/React
- Strong knowledge of Oracle databases and database concepts
- Proficiency with RESTful and SOAP APIs
- Experience with version control systems such as TFS and Git
- Understanding of Agile methodologies and the software development lifecycle
- Good understanding of popular application logging (SVC logs, log4net), event logs, traces, etc.
- Exposure to hardware integrations

Other Requirements
- Problem-Solving: Strong analytical and problem-solving abilities
- Communication: Excellent verbal and written communication skills
- Team Player: Ability to work effectively both independently and as part of a team
- Prior experience within the banking or financial services domain is preferred

Shift: 3:00 PM to 12:00 AM

Thank you for considering employment with Fiserv. Please:
- Apply using your legal name
- Complete the step-by-step profile and attach your resume (either is acceptable, both are preferable)

Our Commitment To Diversity And Inclusion
Fiserv is proud to be an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, national origin, gender, gender identity, sexual orientation, age, disability, protected veteran status, or any other category protected by law.

Note To Agencies
Fiserv does not accept resume submissions from agencies outside of existing agreements. Please do not send resumes to Fiserv associates. Fiserv is not responsible for any fees associated with unsolicited resume submissions.

Warning About Fake Job Posts
Please be aware of fraudulent job postings that are not affiliated with Fiserv. Fraudulent job postings may be used by cyber criminals to target your personally identifiable information and/or to steal money or financial information. Any communications from a Fiserv representative will come from a legitimate Fiserv email address.
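The role above leans heavily on reading application logs and traces to troubleshoot tickets. The pattern (a named logger, severity levels, a formatter, and a handler that captures the trace) is the same whether the stack is log4net in C# or any other framework; here is a minimal, hypothetical sketch using Python's stdlib logging purely to illustrate the idea:

```python
import logging
from io import StringIO

# Capture log output in memory so the trace can be inspected afterwards.
stream = StringIO()
handler = logging.StreamHandler(stream)
handler.setFormatter(logging.Formatter("%(levelname)s %(name)s: %(message)s"))

log = logging.getLogger("claims.support")  # hypothetical logger name
log.setLevel(logging.INFO)
log.addHandler(handler)

def process_ticket(ticket_id: str, payload: dict) -> bool:
    """Troubleshoot a client ticket, logging each step for later tracing."""
    log.info("processing ticket %s", ticket_id)
    if "account" not in payload:
        # Failures are logged at ERROR so they stand out in the trace.
        log.error("ticket %s missing account field", ticket_id)
        return False
    log.info("ticket %s resolved", ticket_id)
    return True

ok = process_ticket("T-100", {"account": "A1"})
bad = process_ticket("T-101", {})
trace = stream.getvalue()
```

After both calls, `trace` contains an INFO line per processing step and one ERROR line for the malformed ticket, which is exactly the kind of record a support engineer greps through when reconstructing what an application did.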
Posted 1 day ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
We are seeking a dedicated individual to join our team. The candidate will be responsible for performing a variety of tasks to support our operations and ensure the smooth running of our project.

Primary Skills: React, Redux, Jest, NodeJS, TypeScript, jQuery, SQL, NoSQL databases
Secondary Skills: Experience in microservice design and deployment, Express JS

Note: Need to work from the office a minimum of 3 days a week.
Posted 1 day ago
0 years
0 Lacs
Delhi, India
On-site
Sales Function

Main Goal
To develop and handle the Bargarh/Sambalpur territory of Odisha, exploring opportunities on the crops we are focusing on and achieving the goal of Ambition 2030.

Job Description
- CUSTOMER KNOWLEDGE: Expand and maintain the company channel database; identify and collect customer needs; prospect and analyse distributor sales potential.
- MARKET & COMPETITION INTELLIGENCE: Maintain and update the market intelligence database for the defined area (including market review, pricing survey, seed technology watch, etc.) and convey market trends to the Sales Manager.
- PRODUCT DEVELOPMENT: Support the Product Development team to set up trials and, eventually, evaluate new varieties and provide opinions on variety advancement.
- PROMOTIONAL ACTIVITIES PLAN & IMPLEMENTATION: Propose, plan and implement promotion actions for new varieties in the defined area, jointly with the local sales and development team when necessary. Lead trial visits, fairs and other promotion events with customers. Bring adapted technical advice and support to customers (direct and indirect).
- SALES CAMPAIGN & ACTIONS: Propose and negotiate customer-specific sales actions, validated by management, in order to reach the sales revenue target.
- ORDER TAKING, SALES EXECUTION & SALES FOLLOW-UP: (a) Verify that direct and indirect customers have confirmed their orders with Customer Service (eventually taking the order directly from the customer); (b) follow up on sales target completion and propose corrective actions when applicable.
- REPORTING: Write and submit sales reports to the Sales Manager.
- CREDIT MANAGEMENT: Propose account payment condition reviews; for past-due accounts, carry out money collection actions jointly with the Credit Management team.
- CLAIMS & COMPLAINTS: Manage the settlement of claims and complaints jointly with the support of the local Product Development Representative, with validation by the manager.

What We Expect Of You
- Responsible for the turnover of the territory
- Responsible for receivables and collections recovery
- Responsible for market and competitor intelligence reporting in the territory
- Responsible for sales team development in the territory

Skills
Sales Goals, Sales Skills, Territory Management, Sales Team Support, Decision Making/Judgment, Integrity/Ethics, Initiative, Results Focus, Vision and Values

Your Benefits And Working Environment
Career Growth, Diversity and Inclusion, Gender Equality, Employee Wellness, Training and Development, Safer Workplace, Rewards and Recognition

Recruitment Process
1. Profile screening
2. Phone screening
3. Interviews - 2 rounds
4. Offer discussion
5. Offer letter issue
6. Background verification
7. Pre-employment medical checkup
8. Onboarding

Job Id: T5TGIzh3XbSMtSzcx1xbyax7dMmJqpDP+xpsWiT7gpWdTkrjTayTW54LIv2AvW0lFxlOJdS4HXnh7ASToTVOF9s+0CjrHn13ivSdvhY/CjEaR4zqXIKXWqIkzu2SozTYXOYd0FEl1nGasFPvvm4w6EVFLnzsD6emmgIdT/IGaYs9ZAtQIQpMCISBlw2baqv5p7j6fhYPRK4ThgfkTXUm09oCKRBkpp8rvZ/th3Inuw==
Posted 1 day ago
5.0 years
0 Lacs
Delhi, India
On-site
Role Overview
We are seeking an experienced Calypso Specialist with strong hands-on expertise in Calypso configuration, development, and support, particularly in the areas of settlements and cash flow management within the Capital Markets domain. This role involves managing post-trade workflows, trade lifecycle processing, and integration with downstream systems while working closely with stakeholders such as Operations users.

Key Responsibilities
- Calypso Configuration & Customization: Configure and manage Calypso modules for settlements and cash flow management. Work on customizations including Workflows, BOMessages, BOTransfers, Scheduled Tasks, Reports, and Engines.
- Trade Lifecycle Management: Handle end-to-end trade lifecycle processing with a focus on Straight-Through Processing (STP). Resolve trade breaks and discrepancies across asset classes such as Equities, Fixed Income, and Derivatives.
- System Integration: Work with SWIFT messaging, SSI management, and custodian integration.
- Collaboration: Interact with Operations users to address trade settlement, reconciliation, and reporting requirements.

Key Skills & Expertise
- Calypso expertise: minimum 5 years of hands-on experience in Calypso; proficiency in Calypso version 16 or higher, including work with Calypso APIs; deep understanding of settlements workflows and cash flow management; solid knowledge of Equities, FX, Fixed Income, and Derivatives instruments
- Java development: minimum 4 years of hands-on experience in Core Java (JDK 1.8 or above)
- Database knowledge: proficiency in writing SQL queries for databases such as Oracle or Sybase
- Unix skills: basic understanding of Unix commands
- Version control: familiarity with tools like SVN or Git
- Trading processes: strong understanding of trade settlement, reconciliation, and reporting processes
- Soft skills: excellent analytical and problem-solving skills with a learn-to-do attitude; strong communication and interpersonal skills to work effectively with stakeholders; ability to work independently from day one
- Agile methodologies: experience working in Agile environments for project delivery

Preferred Candidate Profile
- Proven hands-on experience in Calypso module configuration, development, and support
- Exposure to test automation and scripting (advantageous)
- Strong ability to work independently and manage tasks efficiently

About Virtusa
Teamwork, quality of life, professional and personal development: values that Virtusa is proud to embody. When you join us, you join a team of 27,000 people globally that cares about your growth, one that seeks to provide you with exciting projects, opportunities and work with state-of-the-art technologies throughout your career with us. Great minds, great potential: it all comes together at Virtusa. We value collaboration and the team environment of our company, and seek to provide great minds with a dynamic place to nurture new ideas and foster excellence.

Virtusa was founded on principles of equal opportunity for all, and so does not discriminate on the basis of race, religion, color, sex, gender identity, sexual orientation, age, non-disqualifying physical or mental disability, national origin, veteran status or any other basis covered by appropriate law. All employment is decided on the basis of qualifications, merit, and business need.

Job Id: X57qT777H6ne9FqwMGAAFBbrwIhXoRSJAiveCVbYbcoQmjmtScS3xbdtdbWJykNuyBHsAILzHcCsk3OAnmyjkh+VNef0cG8ETIum7GVpgX2UtaymnFdzcPFe//LQVViuYTSXGYQhN+IjFubvX+Cf/FEPbeo6zzoDR5ZQdzKY8OasTvBWZcQ/j1angHyZnBT7ZwRi0b3o0b0kGdvXKBhpKPhg8oVCt7NS8Q==
Posted 1 day ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Responsible for developing, optimizing, and maintaining business intelligence and data warehouse systems, ensuring secure, efficient data storage and retrieval, enabling self-service data exploration, and supporting stakeholders with insightful reporting and analysis.

Grade: T3

Please note that the job will close at 12am on the posting close date, so please submit your application prior to the close date.

What Your Main Responsibilities Are
- Support the development and maintenance of business intelligence and analytics systems to support data-driven decision-making
- Implement business intelligence and analytics systems, ensuring alignment with business requirements
- Design and optimize data warehouse architecture to support efficient storage and retrieval of large datasets
- Enable self-service data exploration capabilities for users to analyze and visualize data independently
- Develop reporting and analysis applications to generate insights from data for business stakeholders
- Design and implement data models to organize and structure data for analytical purposes
- Implement data security and federation strategies to ensure the confidentiality and integrity of sensitive information
- Optimize business intelligence production processes and adopt best practices to enhance efficiency and reliability
- Assist in training and supporting users on business intelligence tools and applications
- Collaborate and maintain relationships with vendors and oversee project management activities to ensure timely and successful implementation of business intelligence solutions

What We Are Looking For
Education: Bachelor's degree or equivalent in Computer Science, MIS, Mathematics, Statistics, or a similar discipline. Master's degree or PhD preferred.

Relevant work experience in data engineering, based on the following number of years:
- Standard I: two (2) years
- Standard II: three (3) years
- Senior I: four (4) years
- Senior II: five (5) years

Knowledge, Skills And Abilities
- Fluency in English
- Analytical skills
- Accuracy and attention to detail
- Numerical skills
- Planning and organizing skills
- Presentation skills
- Data modeling and database design
- ETL (Extract, Transform, Load) skills
- Programming skills

FedEx was built on a philosophy that puts people first, one we take seriously. We are an equal opportunity/affirmative action employer and we are committed to a diverse, equitable, and inclusive workforce in which we enforce fair treatment, and provide growth opportunities for everyone. All qualified applicants will receive consideration for employment regardless of age, race, color, national origin, genetics, religion, gender, marital status, pregnancy (including childbirth or a related medical condition), physical or mental disability, or any other characteristic protected by applicable laws, regulations, and ordinances.

Our Company
FedEx is one of the world's largest express transportation companies and has consistently been selected as one of the top 10 World's Most Admired Companies by "Fortune" magazine. Every day FedEx delivers for its customers with transportation and business solutions, serving more than 220 countries and territories around the globe. We can serve this global network due to our outstanding team of FedEx team members, who are tasked with making every FedEx experience outstanding.

Our Philosophy
The People-Service-Profit philosophy (P-S-P) describes the principles that govern every FedEx decision, policy, or activity. FedEx takes care of our people; they, in turn, deliver the impeccable service demanded by our customers, who reward us with the profitability necessary to secure our future. The essential element in making the People-Service-Profit philosophy such a positive force for the company is where we close the circle, returning these profits back into the business and investing back in our people. Our success in the industry is attributed to our people. Through our P-S-P philosophy, we have a work environment that encourages team members to be innovative in delivering the highest possible quality of service to our customers. We care for their well-being and value their contributions to the company.

Our Culture
Our culture is important for many reasons, and we intentionally bring it to life through our behaviors, actions, and activities in every part of the world. The FedEx culture and values have been a cornerstone of our success and growth since we began in the early 1970s. While other companies can copy our systems, infrastructure, and processes, our culture makes us unique and is often a differentiating factor as we compete and grow in today's global marketplace.
Posted 1 day ago
10.0 years
0 Lacs
Kolkata, West Bengal, India
On-site
At EY, we’re all in to shape your future with confidence. We’ll help you succeed in a globally connected powerhouse of diverse teams and take your career wherever you want it to go. Join EY and help to build a better working world. Data Engineering Manager Objectives and Purpose The Data Engineering Manager leads large scale solution architecture design and optimisation to provide streamlined insights to partners throughout the business. This individual leads the team of Mid and Senior data engineers to partner with visualization on data quality and troubleshooting needs. The Data Engineering manager will: Implement data processes for the data warehouse and internal systems Lead a team of Junior and Senior Data Engineers in executing data processes and providing quality, timely data management Managing data architecture, designing ETL process Clean, aggregate and organize data from disparate sources and transfer it to data warehouses. Lead development testing and maintenance of data pipelines and platforms, to enable data quality to be utilized within business dashboards and tools. Support team members and direct reports in refining and validating data sets. Create, maintain, and support the data platform and infrastructure that enables the analytics front-end; this includes the testing, maintenance, construction, and development of architectures such as high-volume, large-scale data processing and databases with proper verification and validation processes. Your Key Responsibilities Lead the design, development, optimization, and maintenance of data architecture and pipelines adhering to ETL principles and business goals. Develop and maintain scalable data pipelines, build out new integrations using AWS native technologies and data bricks to support increases in data source, volume, and complexity. Define data requirements, gather and mine large scale of structured and unstructured data, and validate data by running various data tools in the Big Data Environment. 
Lead the ad hoc data analysis, support standardization, customization and develop the mechanisms to ingest, analyze, validate, normalize, and clean data. Write unit/integration/performance test scripts and perform data analysis required to troubleshoot data related issues. Implement processes and systems to drive data reconciliation and monitor data quality, ensuring production data is always accurate and available for key stakeholders, downstream systems, and business processes. Lead the evaluation, implementation and deployment of emerging tools and processes for analytic data engineering to improve productivity. Develop and deliver communication and education plans on analytic data engineering capabilities, standards, and processes. Solve complex data problems to deliver insights that help achieve business objectives. Partner with Business Analysts and Enterprise Architects to develop technical architectures for strategic enterprise projects and initiatives. Coordinate with Data Scientists, visualization developers and other data consumers to understand data requirements, and design solutions that enable advanced analytics, machine learning, and predictive modelling. Collaborate with AI/ML engineers to create data products for analytics and data scientist team members to improve productivity. Advise, consult, mentor and coach other data and analytic professionals on data standards and practices, promoting the values of learning and growth. Foster a culture of sharing, re-use, design for scale stability, and operational efficiency of data and analytical solutions. To qualify for the role, you must have the following: Preferred Skillsets Bachelor's degree in Engineering, Computer Science, Data Science, or related field 10+ years of experience in software development, data engineering, ETL, and analytics reporting development. Expert in building and maintaining data and system integrations using dimensional data modelling and optimized ETL pipelines. 
Experience in designing and developing ETL pipelines using ETL tools like IICS, DataStage, Ab Initio, Talend, etc. Advanced experience utilizing modern data architecture and frameworks like data mesh, data fabric, data product design Experience with designing data integration frameworks capable of supporting multiple data sources, consisting of both structured and unstructured data Proven track record of designing and implementing complex data solutions Demonstrated understanding and experience using: Data Engineering Programming Languages (e.g., Python, SQL) Distributed Data Framework (e.g., Spark) Cloud platform services (AWS preferred) Relational Databases Working knowledge of DevOps (GitHub/GitLab, etc.) with continuous integration AWS knowledge of services like Lambda, DMS, Step Functions, S3, EventBridge, CloudWatch, Aurora RDS, or related AWS ETL services Knowledge of data lakes, data warehouses, and Databricks/Delta Lakehouse architecture Deep understanding of database architecture, data modelling concepts, and administration. Hands-on experience with Spark Structured Streaming for building real-time ETL pipelines. Proficient in programming languages (e.g., SQL, Python, PySpark) to design, develop, maintain, and optimize data architecture/pipelines that fit business goals. Extracts, transforms, and loads data from multiple external/internal sources using Databricks Lakehouse/Data Lake concepts into a single, consistent source to serve business users and data visualization needs. Leverages continuous integration and delivery principles to automate code deployment to elevated environments using GitHub Actions. Excellent written and verbal communication skills, including storytelling and interacting effectively with multifunctional teams and other strategic partners. Strong organizational skills with the ability to manage multiple projects simultaneously, operating as a leading member across globally distributed teams. Strong problem-solving and troubleshooting skills. 
Lead and oversee the code review process within the data engineering team to ensure high-quality, efficient, and maintainable code, while optimizing for performance and scalability. Ability to work in a fast-paced environment and adapt to changing business priorities. Identifying and implementing strategies to optimize AWS/Databricks cloud costs, ensuring efficient and cost-effective use of cloud resources. Understanding of Databricks Unity Catalog for effective data governance and implementing robust access control mechanisms is highly advantageous. Good To Have Skillsets Master's degree in Engineering, specializing in Computer Science, Data Science, or related field Demonstrated understanding and experience using: Knowledge of CDK Experience with the IICS data integration tool Job orchestration tools like Tidal, Airflow, or similar Knowledge of NoSQL databases Experience in a global working environment Databricks Certified Data Engineer Professional AWS Certified Data Engineer - Associate EY | Building a better working world EY is building a better working world by creating new value for clients, people, society and the planet, while building trust in capital markets. Enabled by data, AI and advanced technology, EY teams help clients shape the future with confidence and develop answers for the most pressing issues of today and tomorrow. EY teams work across a full spectrum of services in assurance, consulting, tax, strategy and transactions. Fueled by sector insights, a globally connected, multi-disciplinary network and diverse ecosystem partners, EY teams can provide services in more than 150 countries and territories.
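The dimensional data modelling and ETL loading this role calls for can be sketched minimally in plain Python, using SQLite as a stand-in for a warehouse. This is a hedged illustration only, not EY's stack; all table and column names (dim_product, fact_sales) are hypothetical.

```python
import sqlite3

# Minimal star-schema sketch: one dimension table plus one fact table,
# loaded from a denormalized source extract (all names are hypothetical).
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

cur.execute("CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, name TEXT UNIQUE)")
cur.execute("""CREATE TABLE fact_sales (
    sale_id INTEGER PRIMARY KEY,
    product_id INTEGER REFERENCES dim_product(product_id),
    amount REAL)""")

# Denormalized source rows: (product_name, amount)
source = [("widget", 10.0), ("gadget", 4.5), ("widget", 7.5)]

for name, amount in source:
    # Upsert the dimension row, then reference its surrogate key in the fact row.
    cur.execute("INSERT OR IGNORE INTO dim_product (name) VALUES (?)", (name,))
    cur.execute("SELECT product_id FROM dim_product WHERE name = ?", (name,))
    pid = cur.fetchone()[0]
    cur.execute("INSERT INTO fact_sales (product_id, amount) VALUES (?, ?)", (pid, amount))

# Aggregate fact rows by a dimension attribute.
cur.execute("""SELECT d.name, SUM(f.amount)
               FROM fact_sales f JOIN dim_product d USING (product_id)
               GROUP BY d.name ORDER BY d.name""")
totals = dict(cur.fetchall())
print(totals)  # {'gadget': 4.5, 'widget': 17.5}
```

The surrogate key in the dimension table is what lets facts stay slim while dimension attributes change in one place, which is the core idea behind the dimensional modelling named in the posting.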
Posted 1 day ago
3.0 - 4.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Job Description Alimentation Couche-Tard Inc. (ACT) is a global Fortune 200 company. A leader in the convenience store and fuel space, it has a footprint across 31 countries and territories. Circle K India Data & Analytics team is an integral part of ACT’s Global Data & Analytics Team, and the Data Scientist/Senior Data Scientist will be a key player on this team that will help grow analytics globally at ACT. The hired candidate will partner with multiple departments, including Global Marketing, Merchandising, Global Technology, and Business Units. _____________________________________________________________________________ Department: Data & Analytics Location: Cyber Hub, Gurugram, Haryana (5 days in office) Job Type: Permanent, Full-Time (40 Hours) Reports To: Senior Manager Data Science & Analytics _____________________________________________________________________________ About The Role The incumbent will be responsible for delivering advanced analytics projects that drive business results including interpreting business needs, selecting the appropriate methodology, data cleaning, exploratory data analysis, model building, and creation of polished deliverables. 
Roles & Responsibilities Analytics & Strategy Analyse large-scale structured and unstructured data; develop deep-dive analyses and machine learning models in retail, marketing, merchandising, and other areas of the business Utilize data mining, statistical and machine learning techniques to derive business value from store, product, operations, financial, and customer transactional data Apply multiple algorithms or architectures and recommend the best model with in-depth description to evangelize data-driven business decisions Utilize cloud setup to extract processed data for statistical modelling and big data analysis, and visualization tools to represent large sets of time series/cross-sectional data Operational Excellence Follow industry standards in coding solutions and follow programming life cycle to ensure standard practices across the project Structure hypotheses, build thoughtful analyses, develop underlying data models and bring clarity to previously undefined problems Partner with Data Engineering to build, design and maintain core data infrastructure, pipelines and data workflows to automate dashboards and analyses. Stakeholder Engagement Working collaboratively across multiple sets of stakeholders – Business functions, Data Engineers, Data Visualization experts to deliver on project deliverables Articulate complex data science models to business teams and present the insights in easily understandable and innovative formats Job Requirements Education Bachelor’s degree required, preferably with a quantitative focus (Statistics, Business Analytics, Data Science, Math, Economics, etc.) Master’s degree preferred (MBA/MS Computer Science/M.Tech Computer Science, etc.) 
Relevant Experience 3–4 years for Data Scientist, with relevant working experience in a data science/advanced analytics role Behavioural Skills Delivery Excellence Business disposition Social intelligence Innovation and agility Knowledge Functional Analytics (Supply chain analytics, Marketing Analytics, Customer Analytics, etc.) Statistical modelling using analytical tools (R, Python, KNIME, etc.) Knowledge of statistics and experimental design (A/B testing, hypothesis testing, causal inference) Practical experience building scalable ML models, feature engineering, model evaluation metrics, and statistical inference. Practical experience deploying models using MLOps tools and practices (e.g., MLflow, DVC, Docker, etc.) Strong coding proficiency in Python (Pandas, Scikit-learn, PyTorch/TensorFlow, etc.) Big data technologies & frameworks (AWS, Azure, GCP, Hadoop, Spark, etc.) Enterprise reporting systems, relational (MySQL, Microsoft SQL Server, etc.), non-relational (MongoDB, DynamoDB) database management systems and Data Engineering tools Business intelligence & reporting (Power BI, Tableau, Alteryx, etc.) Microsoft Office applications (MS Excel, etc.)
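The A/B testing and hypothesis-testing knowledge listed above can be illustrated with a minimal two-proportion z-test in plain Python. This is a hedged sketch, not ACT's methodology; the conversion counts are hypothetical.

```python
from math import sqrt, erf

def two_proportion_ztest(conv_a, n_a, conv_b, n_b):
    """Two-sided two-proportion z-test using the pooled standard error."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF (via erf).
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical promo experiment: control vs. variant conversion counts.
z, p = two_proportion_ztest(conv_a=200, n_a=4000, conv_b=260, n_b=4000)
print(round(z, 2), round(p, 4))
```

In practice one would fix the significance level and minimum detectable effect before the experiment and size the samples accordingly, rather than testing ad hoc.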
Posted 1 day ago
5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Title: GCC Speaker Program Ops, Sr. Associate Job Summary As a central support unit within Global Customer Capabilities (GCC), this role is instrumental in supporting the planning, execution, and compliance of Speaker Programs within Amgen. The ideal candidate will possess strong administrative skills, attention to detail, and a passion for ensuring excellence. Responsibilities of this position will include: Ensure all Speaker Program activities comply with relevant regulatory requirements, including FDA guidelines, PhRMA Code, and Sunshine Act reporting Oversee the planning, coordination, and execution of Amgen Speaker Programs, ensuring alignment with company objectives and compliance with industry regulations. Maintain SharePoint escalations between Business Unit, Vendors, and Compliance Support the implementation of compliance procedures and protocols to mitigate risks associated with Speaker Program Operations Manage the BU contracting process, campaigns, nominations, and documentation requests, and serve as point of contact for speakers for renewals and new speakers Run and/or distribute reports from the Speaker Program Platform (Salesforce/Centris) Communicate program updates, guidelines, and requirements to relevant stakeholders in a clear, timely manner Monitor Speaker Program Operations email inboxes to answer questions, provide information, and/or resolve issues. Adhere to processes and standards and suggest improvements. 
Support and drive special projects as needed Basic Qualifications Bachelor's degree & 5 years of directly related experience in Learning and Development, Event Management, Business, Pharmaceutical/Biotechnology Industry Preferred Qualifications Previous job experience in the Pharmaceutical/Biotechnology Industry Experience with managing relationships with competing demands and priorities Strong time management, organization, and prioritization skills Strong problem-solving skills Strong written and verbal communication skills Exceptional attention to detail with the ability to multi-task Ability to work well in teams and interact effectively with various levels of management. Willingness to be flexible to meet team goals and priorities. Strong computer and database skills with Microsoft Word, PowerPoint, Excel, Outlook, and Teams
Posted 1 day ago
5.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Introduction We are seeking a skilled Data Analyst with 4–5 years of experience to design, develop, and maintain robust Power BI dashboards, build and optimize data queries from our Enterprise Data Warehouse (EDW), normalize and transform complex data, and provide critical support for the Finance function. The ideal candidate will bring proven experience working with the Microsoft Azure tech stack, Power BI, and strong hands-on skills with Alteryx and Informatica to deliver trusted, actionable insights for the business. Your Role And Responsibilities Power BI Development & Visualization: Design, develop, and maintain interactive Power BI dashboards and reports to support business users with timely, actionable insights. Translate complex data requirements into visually compelling and easy-to-understand dashboards. Optimize report performance and implement best practices for data visualization and user experience. Data Extraction & Transformation Build, optimize, and maintain data queries from the Enterprise Data Warehouse (EDW) and other source systems. Develop, test, and deploy data pipelines using MS Azure services, Alteryx, and Informatica to ensure accurate, clean, and normalized datasets. Collaborate with Data Engineering teams to ensure efficient data flows, proper data modeling, and adherence to data governance standards. Finance Function Process Support Provide data preparation, validation, and reporting support for the process. Partner with Finance, Costing, and Operations teams to ensure timely and accurate report runs. Analyze process gaps and recommend improvements to streamline data handling and reporting. Collaboration & Continuous Improvement Work closely with business stakeholders to gather and clarify requirements, ensuring deliverables align with business objectives. Troubleshoot data issues and recommend solutions for data quality improvements. 
Stay current with advancements in the Microsoft Azure ecosystem, Power BI, and other BI tools to continuously improve solutions. Preferred Education Master's Degree Required Technical And Professional Expertise Bachelor’s degree in Computer Science, Information Systems, Data Analytics, or a related field. 4–5 years of hands-on experience in data analytics, business intelligence, or a related role. Advanced experience building Power BI dashboards, including DAX, Power Query (M), and data modeling. Strong proficiency with the Microsoft Azure data stack (e.g., Azure Data Factory, Azure SQL Database, Azure Synapse Analytics). Hands-on experience with Alteryx for data blending, preparation, and automation. Working knowledge of Informatica for ETL processes. Solid SQL skills for writing and optimizing queries for large, complex datasets. Strong understanding of data normalization, data governance, and best practices for data management. Excellent problem-solving skills with the ability to manage multiple tasks and priorities. Strong communication skills and ability to collaborate effectively with business and technical stakeholders. Preferred Technical And Professional Experience Ability to manage and make decisions about competing priorities and resources. Ability to delegate where appropriate. Must be a strong team player/leader. Ability to lead data transformation projects with multiple junior data engineers. Strong oral, written, and interpersonal skills for interacting throughout all levels of the organization. Ability to communicate complex business problems and technical solutions.
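The "normalize and transform complex data" responsibility above can be sketched in plain Python: factoring repeated attribute groups out of a denormalized extract into a lookup table. This is a hedged example, not the employer's pipeline; the extract and field names (emp, dept_code, dept_name) are hypothetical.

```python
# Hypothetical denormalized EDW extract: every row repeats the department attributes.
rows = [
    {"emp": "Asha",  "dept_code": "FIN", "dept_name": "Finance"},
    {"emp": "Ravi",  "dept_code": "FIN", "dept_name": "Finance"},
    {"emp": "Meena", "dept_code": "OPS", "dept_name": "Operations"},
]

# Normalize: move the repeating department attributes into one lookup table
# keyed by dept_code, leaving a slim employee table that references it.
departments = {}
employees = []
for row in rows:
    departments.setdefault(row["dept_code"], {"dept_name": row["dept_name"]})
    employees.append({"emp": row["emp"], "dept_code": row["dept_code"]})

print(departments)  # one entry per distinct department
print(employees)    # each employee row now carries only the foreign key
```

The same split is what a Power Query or Alteryx workflow would express visually: a distinct-rows step producing the lookup, and a select step trimming the main table down to the key.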
Posted 1 day ago
5.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Introduction A career in IBM Consulting embraces long-term relationships and close collaboration with clients across the globe. In this role, you will work for IBM BPO, part of Consulting, which accelerates digital transformation using agile methodologies, process mining, and AI-powered workflows. You'll work with visionaries across multiple industries to improve the hybrid cloud and AI journey for the most innovative and valuable companies in the world. Your ability to accelerate impact and make meaningful change for your clients is enabled by our strategic partner ecosystem and our robust technology platforms across the IBM portfolio, including IBM Software and Red Hat. Curiosity and a constant quest for knowledge serve as the foundation to success in IBM Consulting. In your role, you'll be supported by mentors and coaches who will encourage you to challenge the norm, investigate ideas outside of your role, and come up with creative solutions resulting in groundbreaking impact for a wide network of clients. Our culture of evolution and empathy centers on long-term career growth and learning opportunities in an environment that embraces your unique skills and experience. Your Role And Responsibilities Power BI Development & Visualization: Design, develop, and maintain interactive Power BI dashboards and reports to support business users with timely, actionable insights. Translate complex data requirements into visually compelling and easy-to-understand dashboards. Optimize report performance and implement best practices for data visualization and user experience. Data Extraction & Transformation Build, optimize, and maintain data queries from the Enterprise Data Warehouse (EDW) and other source systems. Develop, test, and deploy data pipelines using MS Azure services, Alteryx, and Informatica to ensure accurate, clean, and normalized datasets. 
Collaborate with Data Engineering teams to ensure efficient data flows, proper data modeling, and adherence to data governance standards. Finance Function Process Support Provide data preparation, validation, and reporting support for the process. Partner with Finance, Costing, and Operations teams to ensure timely and accurate report runs. Analyze process gaps and recommend improvements to streamline data handling and reporting. Collaboration & Continuous Improvement Work closely with business stakeholders to gather and clarify requirements, ensuring deliverables align with business objectives. Troubleshoot data issues and recommend solutions for data quality improvements. Stay current with advancements in the Microsoft Azure ecosystem, Power BI, and other BI tools to continuously improve solutions. Preferred Education Master's Degree Required Technical And Professional Expertise Bachelor’s degree in computer science, Information Systems, Data Analytics, or a related field. 4–5 years of hands-on experience in data analytics, business intelligence, or a related role. Advanced experience building Power BI dashboards, including DAX, Power Query (M), and data modeling. Strong proficiency with the Microsoft Azure data stack (e.g., Azure Data Factory, Azure SQL Database, Azure Synapse Analytics). Hands-on experience with Alteryx for data blending, preparation, and automation. Working knowledge of Informatica for ETL processes. Solid SQL skills for writing and optimizing queries for large, complex datasets. Strong understanding of data normalization, data governance, and best practices for data management. Excellent problem-solving skills with the ability to manage multiple tasks and priorities. Strong communication skills and ability to collaborate effectively with business and technical stakeholders. Preferred Technical And Professional Experience Ability to manage and make decisions about competing priorities and resources. Ability to delegate where appropriate. 
Must be a strong team player/leader. Ability to lead data transformation projects with multiple junior data engineers. Strong oral, written, and interpersonal skills for interacting throughout all levels of the organization. Ability to communicate complex business problems and technical solutions.
Posted 1 day ago
0 years
0 Lacs
Goregaon, Maharashtra, India
On-site
Line of Service Advisory Industry/Sector Not Applicable Specialism Data, Analytics & AI Management Level Manager Job Description & Summary At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions. Why PwC At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us. At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations. 
Job Description & Summary: A career within Data and Analytics services will provide you with the opportunity to help organizations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organizational technology capabilities, including business intelligence, data management, and data assurance that help our clients drive innovation, growth, and change within their organizations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge. Responsibilities: Key Responsibilities · Python, TensorFlow or PyTorch, Scikit-learn, and XGBoost, computer vision and NLP, MLOps. · Recommendation algorithms (collaborative filtering, content-based filtering). · Experienced with MLOps tools and cloud platforms - any of GCP, Azure, Databricks, or AWS. · Vertex AI experience, including model deployment and model training pipelines · Experience with real-world ML applications in retail such as recommendation systems, demand forecasting, inventory optimization, or customer segmentation. · Experience in Retail Mandatory skill sets: · Python, TensorFlow or PyTorch, Scikit-learn, and XGBoost, computer vision and NLP, MLOps. · Experienced with MLOps tools and cloud platforms - any of GCP, Azure, Databricks, or AWS. 
Preferred skill sets: · Experienced with MLOps tools and cloud platforms - any of GCP, Azure, Databricks, or AWS. Years of experience required: 7-11 Education qualification: B.Tech / M.Tech / MBA / MCA Education (if blank, degree and/or field of study not specified) Degrees/Field of Study required: Bachelor of Engineering, Master of Business Administration, Master of Engineering Degrees/Field of Study preferred: Certifications (if blank, certifications not specified) Required Skills Artificial Intelligence Markup Language Optional Skills Accepting Feedback, Active Listening, Agile Scalability, Amazon Web Services (AWS), Analytical Thinking, Apache Airflow, Apache Hadoop, Azure Data Factory, Coaching and Feedback, Communication, Creativity, Data Anonymization, Data Architecture, Database Administration, Database Management System (DBMS), Database Optimization, Database Security Best Practices, Databricks Unified Data Analytics Platform, Data Engineering, Data Engineering Platforms, Data Infrastructure, Data Integration, Data Lake, Data Modeling {+ 32 more} Desired Languages (If blank, desired languages not specified) Travel Requirements Not Specified Available for Work Visa Sponsorship? No Government Clearance Required? No Job Posting End Date
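The collaborative filtering named in the responsibilities above can be sketched as a minimal user-based recommender in plain Python. This is a hedged illustration, not PwC's method; the toy ratings data and function names are hypothetical.

```python
from math import sqrt

# Toy user-item ratings matrix (hypothetical retail data).
ratings = {
    "alice": {"shampoo": 5, "soda": 1, "chips": 2},
    "bob":   {"shampoo": 4, "soda": 1, "chips": 3},
    "cara":  {"shampoo": 1, "soda": 5, "candy": 4},
}

def cosine(u, v):
    """Cosine similarity over the items two users have both rated."""
    shared = set(u) & set(v)
    if not shared:
        return 0.0
    dot = sum(u[i] * v[i] for i in shared)
    return dot / (sqrt(sum(u[i] ** 2 for i in shared)) * sqrt(sum(v[i] ** 2 for i in shared)))

def recommend(user, k=1):
    """Score items the user has not rated by similarity-weighted ratings of other users."""
    scores = {}
    for other, their in ratings.items():
        if other == user:
            continue
        sim = cosine(ratings[user], their)
        for item, r in their.items():
            if item not in ratings[user]:
                scores[item] = scores.get(item, 0.0) + sim * r
    return sorted(scores, key=scores.get, reverse=True)[:k]

print(recommend("alice"))  # ['candy']
```

Production systems replace the nested loops with factorized matrices or approximate nearest-neighbour indexes, but the scoring idea (weight neighbours' ratings by similarity) is the same.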
Posted 1 day ago
2.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Description We have an exciting and rewarding opportunity for you to take your software engineering career to the next level. As a Software Engineer at JPMorgan Chase within the Consumer and Community Banking - Data Management, you serve as a seasoned member of an agile team to design and deliver trusted market-leading technology products in a secure, stable, and scalable way. You are responsible for carrying out critical technology solutions across multiple technical areas within various business functions in support of the firm’s business objectives. Job responsibilities Executes software solutions, design, development, and technical troubleshooting with ability to think beyond routine or conventional approaches to build solutions or break down technical problems. Creates secure and high-quality production code and maintains algorithms that run synchronously within integrated systems. Produces architecture and design artifacts for complex applications while being accountable for ensuring design constraints are met by software code development. Gathers, analyzes, synthesizes, and develops visualizations and reporting from large, diverse data sets in service of continuous improvement of software applications and systems. Proactively identifies hidden problems and patterns in data and uses these insights to drive improvements to coding hygiene and system architecture. Contributes to software engineering communities of practice and events that explore new and emerging technologies. Adds to team culture of diversity, equity, inclusion, and respect. Required Qualifications, Capabilities, And Skills Formal training or certification on software engineering concepts and 2+ years applied experience. Expertise in UI development using React, Node.js, and the Jest framework. Hands-on experience in building or enhancing UI frameworks and contributing to Single Page Application (SPA) development. Thorough understanding of application security. 
Especially around authentication and authorization. Hands-on experience in building and consuming RESTful APIs using Java. AWS development experience: EAC, Terraform, cloud design patterns, distributed streaming (Kafka), OpenSearch, S3, and PostgreSQL. Deployed complex, highly available, scalable systems and resilient apps on the AWS cloud. Serve as a team member in the delivery of high-quality, full-stack UI solutions using React JS, Jest, Java, and cloud-based technologies while actively contributing to the code base. Hands-on practical experience in system design, application development, testing, and operational stability Experience in developing, debugging, and maintaining code in a large corporate environment with one or more modern programming languages and database querying languages. Demonstrated knowledge of software applications and technical processes within a technical discipline (e.g., cloud, artificial intelligence, machine learning, mobile, etc.)
Posted 1 day ago
2.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Description You’re ready to gain the skills and experience needed to grow within your role and advance your career — and we have the perfect software engineering opportunity for you. As a Software Engineer II at JPMorgan Chase within the Infrastructure Platforms - Production Services, you are part of an agile team that works to enhance, design, and deliver the software components of the firm’s state-of-the-art technology products in a secure, stable, and scalable way. As an emerging member of a software engineering team, you execute software solutions through the design, development, and technical troubleshooting of multiple components within a technical product, application, or system, while gaining the skills and experience needed to grow within your role. Job Responsibilities Executes standard software solutions, design, development, and technical troubleshooting with ability to think beyond routine or conventional approaches to build solutions or break down technical problems Writes secure and high-quality code using the syntax of at least one programming language with limited guidance Designs, develops, codes, and troubleshoots with consideration of upstream and downstream systems and technical implications Applies knowledge of tools within the Software Development Life Cycle toolchain to improve the value realized by automation Applies technical troubleshooting to break down solutions and solve technical problems of basic complexity Gathers, analyzes, and draws conclusions from large, diverse data sets to identify problems and contribute to decision-making in service of secure, stable application development Learns and applies system processes, methodologies, and skills for the development of secure, stable code and systems Adds to team culture of diversity, opportunity, inclusion, and respect Required Qualifications, Capabilities, And Skills Formal training or certification on software engineering concepts and 2+ years applied experience Advanced in Java 
programming language Strong technical skills in developing microservices using Java, Spring Boot, and PL/SQL Experience in Java (Core & EE, Spring Boot, Spring MVC, Spring Cloud) and Oracle Database Strong skills in object-oriented analysis and design (OOAD), design principles, and design patterns, with the ability to troubleshoot and debug an application for any application issues Hands-on practical experience in system design, application development, testing, and operational stability Experience in developing, debugging, and maintaining code in a large corporate environment with one or more modern programming languages and database querying languages Demonstrable ability to code in one or more languages Experience across the whole Software Development Life Cycle Exposure to agile methodologies such as CI/CD, Application Resiliency, and Security Preferred Qualifications, Capabilities, And Skills Experience with cloud infrastructure and solutions (AWS preferred) Experience with UI development using Angular, TypeScript, HTML, CSS, and other JS-driven web frameworks
Posted 1 day ago
10.0 years
0 Lacs
Kochi, Kerala, India
On-site
At EY, we’re all in to shape your future with confidence. We’ll help you succeed in a globally connected powerhouse of diverse teams and take your career wherever you want it to go. Join EY and help to build a better working world. Data Engineering Manager Objectives and Purpose The Data Engineering Manager leads large-scale solution architecture design and optimisation to provide streamlined insights to partners throughout the business. This individual leads a team of mid-level and senior data engineers, partnering with visualization teams on data quality and troubleshooting needs. The Data Engineering Manager will: Implement data processes for the data warehouse and internal systems Lead a team of Junior and Senior Data Engineers in executing data processes and providing quality, timely data management Manage data architecture and design ETL processes Clean, aggregate, and organize data from disparate sources and transfer it to data warehouses. Lead development, testing, and maintenance of data pipelines and platforms to enable data quality to be utilized within business dashboards and tools. Support team members and direct reports in refining and validating data sets. Create, maintain, and support the data platform and infrastructure that enables the analytics front-end; this includes the testing, maintenance, construction, and development of architectures such as high-volume, large-scale data processing and databases with proper verification and validation processes. Your Key Responsibilities Lead the design, development, optimization, and maintenance of data architecture and pipelines adhering to ETL principles and business goals. Develop and maintain scalable data pipelines, and build out new integrations using AWS-native technologies and Databricks to support increases in data sources, volume, and complexity. Define data requirements, gather and mine large-scale structured and unstructured data, and validate data by running various data tools in the Big Data environment. 
Lead ad hoc data analysis, support standardization and customization, and develop the mechanisms to ingest, analyze, validate, normalize, and clean data. Write unit/integration/performance test scripts and perform data analysis required to troubleshoot data-related issues. Implement processes and systems to drive data reconciliation and monitor data quality, ensuring production data is always accurate and available for key stakeholders, downstream systems, and business processes. Lead the evaluation, implementation, and deployment of emerging tools and processes for analytic data engineering to improve productivity. Develop and deliver communication and education plans on analytic data engineering capabilities, standards, and processes. Solve complex data problems to deliver insights that help achieve business objectives. Partner with Business Analysts and Enterprise Architects to develop technical architectures for strategic enterprise projects and initiatives. Coordinate with Data Scientists, visualization developers, and other data consumers to understand data requirements, and design solutions that enable advanced analytics, machine learning, and predictive modelling. Collaborate with AI/ML engineers to create data products for analytics and data scientist team members to improve productivity. Advise, consult, mentor, and coach other data and analytic professionals on data standards and practices, promoting the values of learning and growth. Foster a culture of sharing, re-use, design for scale, stability, and operational efficiency of data and analytical solutions. To qualify for the role, you must have the following: Preferred Skillsets Bachelor's degree in Engineering, Computer Science, Data Science, or related field 10+ years of experience in software development, data engineering, ETL, and analytics reporting development. Expert in building and maintaining data and system integrations using dimensional data modelling and optimized ETL pipelines. 
- Experience designing and developing ETL pipelines using ETL tools such as IICS, DataStage, Ab Initio, Talend, etc.
- Advanced experience with modern data architectures and frameworks such as data mesh, data fabric, and data product design
- Experience designing data integration frameworks capable of supporting multiple data sources, consisting of both structured and unstructured data
- Proven track record of designing and implementing complex data solutions
- Demonstrated understanding and experience using: data engineering programming languages (e.g., Python, SQL), distributed data frameworks (e.g., Spark), cloud platform services (AWS preferred), and relational databases
- Working knowledge of DevOps (GitHub/GitLab, etc.) with continuous integration
- AWS knowledge of services such as Lambda, DMS, Step Functions, S3, EventBridge, CloudWatch, Aurora RDS, or related AWS ETL services
- Knowledge of data lakes, data warehouses, and Databricks/Delta Lakehouse architecture
- Deep understanding of database architecture, data modelling concepts, and administration
- Hands-on experience with Spark Structured Streaming for building real-time ETL pipelines
- Proficiency in programming languages (e.g., SQL, Python, PySpark) to design, develop, maintain, and optimize data architecture and pipelines that fit business goals
- Extracting, transforming, and loading data from multiple external and internal sources, using Databricks Lakehouse/Data Lake concepts, into a single consistent source to serve business users and data visualization needs
- Leveraging continuous integration and delivery principles to automate code deployment to elevated environments using GitHub Actions
- Excellent written and verbal communication skills, including storytelling and interacting effectively with multifunctional teams and other strategic partners
- Strong organizational skills with the ability to manage multiple projects simultaneously, operating as a leading member across globally distributed teams
- Strong problem-solving and troubleshooting skills
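As a minimal illustration of the Python/SQL ETL proficiency the listing asks for (a sketch only; the table names, columns, and cleansing rule are invented for the example and use SQLite rather than a production warehouse), a tiny extract-transform-load step might look like:

```python
import sqlite3

def run_etl(conn):
    """Extract raw orders, normalize the string amounts to cents, load a clean table."""
    cur = conn.cursor()
    # Extract: pull the raw, string-typed staging rows.
    rows = cur.execute("SELECT order_id, amount_text FROM staging_orders").fetchall()
    # Transform: cast amounts to integer cents, dropping rows that fail validation.
    clean = []
    for order_id, amount_text in rows:
        try:
            clean.append((order_id, int(round(float(amount_text) * 100))))
        except (TypeError, ValueError):
            pass  # a real pipeline would route rejects to a quarantine table
    # Load: idempotent replace of the target table, so reruns are safe.
    cur.execute("DROP TABLE IF EXISTS orders")
    cur.execute("CREATE TABLE orders (order_id INTEGER PRIMARY KEY, amount_cents INTEGER)")
    cur.executemany("INSERT INTO orders VALUES (?, ?)", clean)
    conn.commit()
    return len(clean)

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE staging_orders (order_id INTEGER, amount_text TEXT)")
conn.executemany("INSERT INTO staging_orders VALUES (?, ?)",
                 [(1, "19.99"), (2, "oops"), (3, "5")])
print(run_etl(conn))  # 2 rows loaded; the malformed row is skipped
```

The same extract/transform/load shape scales up in Spark or Databricks; only the I/O layer and parallelism change.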
- Lead and oversee the code review process within the data engineering team to ensure high-quality, efficient, and maintainable code, while optimizing for performance and scalability
- Ability to work in a fast-paced environment and adapt to changing business priorities
- Identify and implement strategies to optimize AWS/Databricks cloud costs, ensuring efficient and cost-effective use of cloud resources
- Understanding of Databricks Unity Catalog for effective data governance and implementing robust access control mechanisms is highly advantageous

Good To Have Skillsets
- Master's degree in engineering, specializing in Computer Science, Data Science, or a related field
- Demonstrated understanding and experience using: CDK, the IICS data integration tool, and job orchestration tools such as Tidal, Airflow, or similar
- Knowledge of NoSQL databases
- Experience in a global working environment
- Databricks Certified Data Engineer Professional
- AWS Certified Data Engineer - Associate

EY | Building a better working world
EY is building a better working world by creating new value for clients, people, society and the planet, while building trust in capital markets. Enabled by data, AI and advanced technology, EY teams help clients shape the future with confidence and develop answers for the most pressing issues of today and tomorrow. EY teams work across a full spectrum of services in assurance, consulting, tax, strategy and transactions. Fueled by sector insights, a globally connected, multi-disciplinary network and diverse ecosystem partners, EY teams can provide services in more than 150 countries and territories.
Posted 1 day ago
10.0 years
0 Lacs
Kanayannur, Kerala, India
On-site
Data Engineering Manager, EY (role description identical to the EY listing above)
Posted 1 day ago
2.0 years
0 Lacs
Kochi, Kerala, India
On-site
Job Overview
Responsible for study build and design, edit specifications, and system configurations, and accountable for the associated study design components. This role collaborates with various stakeholders: DTL, programmers, the validation team, vendors, statisticians, and client representatives. In addition to project deliveries, the role is also responsible for project financials from a programming shared services perspective.

Essential Functions
- Interpret the study protocol.
- Design and update the eCRF using third-party or in-house CDMS tools, in alignment with industry standards such as SDTM and CDASH as applicable.
- Create and update the Edit Specification Document.
- Generate specifications for EDC build components (e.g., Rights and Roles, System Settings, and Home Page).
- Complete the Study Authorization Form and Trial Capacity Request Form (InForm).
- Attend the Pre-Design Meeting, Online Screen Review Meeting, and Unblinded Data Review Meeting.
- Attend and present comments at the Internal Design Review Meeting. May lead the Online Screen Review Meeting.
- Facilitate the internal Edit Specification Review Meeting and lead the discussions regarding the Edit Specification Document.
- Design the database to collect LLRR data within the InForm database and ensure the Rights and Roles document provides appropriate access for entry and updates.
- Communicate any project risks to the Data Team Lead, including the potential for missing a timeline in the Data Management Project Plan.
- Escalate potential quality issues.
- Ensure the completion and documentation of all project-specific training, as well as staying current with required Standard Operating Procedures.
- Review build timelines and provide input as applicable.
- Review the QIP for own projects, identify any out-of-scope activities, and inform relevant parties.
- Handle multiple study design projects at the same time, potentially working on projects across multiple platforms.
- Identify areas for process improvement on an ongoing basis.
- Actively take part in and contribute to assigned process improvement initiatives, in addition to providing suggestions for continuous improvement of processes.

All responsibilities are essential job functions unless noted as nonessential (N).

Qualifications
- Bachelor's degree in Science, Computer Science, Information Technology, or Technology (required)
- 2-4 years of relevant core technical designer experience, with 7+ years of total experience (required)

IQVIA is a leading global provider of clinical research services, commercial insights and healthcare intelligence to the life sciences and healthcare industries. We create intelligent connections to accelerate the development and commercialization of innovative medical treatments to help improve patient outcomes and population health worldwide. Learn more at https://jobs.iqvia.com
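The edit specifications mentioned above define machine-checkable rules that raise queries when clinical data looks wrong. As a hedged illustration only (the field names, range limits, and rule wording are invented, not drawn from any real edit specification or CDMS), one range check and one cross-field consistency check might be sketched as:

```python
from datetime import date

def check_visit_record(record):
    """Run simple edit checks on one eCRF visit record; return fired query texts."""
    queries = []
    # Range check: systolic blood pressure plausibility window.
    sbp = record.get("systolic_bp")
    if sbp is not None and not (60 <= sbp <= 250):
        queries.append(f"systolic_bp {sbp} outside plausible range 60-250")
    # Consistency check: visit date cannot precede informed consent.
    visit, consent = record.get("visit_date"), record.get("consent_date")
    if visit and consent and visit < consent:
        queries.append("visit_date precedes consent_date")
    return queries

record = {
    "systolic_bp": 300,
    "visit_date": date(2024, 1, 5),
    "consent_date": date(2024, 2, 1),
}
print(check_visit_record(record))  # both checks fire for this record
```

In an EDC system such checks are authored declaratively from the Edit Specification Document rather than hand-coded, but the fire-a-query-per-violated-rule shape is the same.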
Posted 1 day ago
10.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
Data Engineering Manager, EY (role description identical to the EY listing above)
Posted 1 day ago
3.0 - 5.0 years
0 Lacs
Mysore, Karnataka, India
On-site
Introduction
A career in IBM Consulting is rooted in long-term relationships and close collaboration with clients across the globe. You'll work with visionaries across multiple industries to improve the hybrid cloud and AI journey for the most innovative and valuable companies in the world. Your ability to accelerate impact and make meaningful change for your clients is enabled by our strategic partner ecosystem and our robust technology platforms across the IBM portfolio, including Software and Red Hat. Curiosity and a constant quest for knowledge serve as the foundation to success in IBM Consulting. In your role, you'll be encouraged to challenge the norm, investigate ideas outside of your role, and come up with creative solutions resulting in groundbreaking impact for a wide network of clients. Our culture of evolution and empathy centers on long-term career growth and development opportunities in an environment that embraces your unique skills and experience.

Your Role And Responsibilities
As a Software Developer, you'll participate in many aspects of the software development lifecycle, such as design, code implementation, testing, and support. You will create software that enables your clients' hybrid-cloud and AI journeys. You'll have the opportunity to work with the latest technologies, ensuring the applications delivered are high performing, highly available, responsive, and maintainable.

Your Primary Responsibilities Include
- Analytical problem-solving and solution enhancement: Analyze, validate, and propose improvements for existing failures, with the support of the architect and technical leader.
- Comprehensive engagement across process phases: Involvement in every step of the process, from design and development through testing, releasing changes, and troubleshooting where necessary, providing great customer service.
- Strategic stakeholder engagement and innovative coding solutions: Drive key discussions with your stakeholders and analyze the current landscape for opportunities to operate and code creative solutions.

Preferred Education
Master's Degree

Required Technical And Professional Expertise
- Minimum 3-5 years of experience in architecting and planning SAP upgrade and database migration projects
- Proven track record of delivering SAP upgrades, migrations, and other Basis-related projects
- Demonstrated consulting background in providing both technical delivery and advisory services in a senior role
- Real-world experience in SAP HANA Basis and BTP administration
- Experience in backup/restore/recovery of SAP/Oracle installations, server monitoring, and optimization techniques

Preferred Technical And Professional Experience
- Certification as an SAP OS (Operating System)/DB (Database) Migration Consultant or Technology Associate - SAP Landscape Transformation 2.0
- Experience working in implementation, upgrade, maintenance, and post-production support projects
- HANA Migration Basis (ECC
Posted 1 day ago
0 years
0 Lacs
Thiruvananthapuram, Kerala, India
On-site
Location: Trivandrum

About us
At Arbor, we're on a mission to transform the way schools work for the better. We believe in a future of work in schools where being challenged doesn't mean being burnt out and overworked. Where data guides progress without overwhelming staff. And where everyone working in a school is reminded why they got into education every day. Our MIS and school management tools are already making a difference in over 7,000 schools and trusts, giving time and power back to staff, turning data into clear, actionable insights, and supporting happier working days. At the heart of our brand is a recognition that the challenges schools face today aren't just about efficiency, outputs and productivity, but about creating happier working lives for the people who drive education every day: the staff. We want to make schools more joyful places to work, as well as learn.

About the role
We are looking for a smart and hard-working Associate Product Engineer to join our Engineering team and help us contribute towards the development of the company's core systems and processes, along with contributing to auxiliary services as required. The remit and focus of the role is to specialise in PHP application and web development, while actively participating in the creation of solutions that support various systems where needed. The emphasis for this role is on learning, collaborating with the team, and gaining proficiency in PHP and web development. It's a broad and exciting role, so we're looking for someone up for a challenge: if you're an enthusiastic and collaborative candidate, this is the role for you.
Core responsibilities
- Actively contribute to the successful implementation of relevant features, enhancing the core systems and platforms
- Collaborate with the team to improve product lead time, ensuring efficient delivery
- Play a role in identifying and resolving bugs and issues within the system, actively participating in problem-solving efforts
- Assist in the creation of clear and comprehensive technical documentation for the system, fostering a knowledge-sharing culture within the team
- Collaborate closely with other engineers to help design solutions for feature requests
- Develop and implement appropriate tests around new features, ensuring thorough test coverage and reliability
- Write clean, well-documented code using standard design patterns and methodologies, emphasizing maintainability and readability
- Actively participate in code reviews and pair programming sessions with colleagues, fostering a collaborative and learning-oriented environment

Requirements
About you
- Fundamental understanding of object-oriented languages and software development
- Basic knowledge of relational database technologies
- A positive and proactive attitude to problem solving
- Enthusiasm for learning and developing skills in a collaborative environment
- A team player, willing to muck in and help others when needed; a driven personality who asks questions and actively participates in discussions
- Good written and spoken English so you can present your ideas

Bonus skills
- Understanding of software engineering principles, such as SOLID, DRY, etc.
- Familiarity with Scrum methodology or other agile development processes
- Basic awareness of PHP
- Familiarity with software best practices such as Refactoring, Clean Code, Domain-Driven Design, Test-Driven Development, etc.

Benefits
What we offer
The chance to work alongside a team of hard-working, passionate people in a role where you'll see the impact of your work every day.
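The test-driven, clean-code practices listed above apply in any language; as a small hedged sketch in Python rather than PHP (the `slugify` helper is invented for the example, not an Arbor API), the tests are written first to pin down the behaviour, then the simplest implementation is added until they pass:

```python
import re
import unittest

def slugify(title):
    """Turn a page title into a URL slug: lowercase, hyphen-separated alphanumerics."""
    words = re.findall(r"[a-z0-9]+", title.lower())
    return "-".join(words)

class SlugifyTest(unittest.TestCase):
    # In TDD these cases exist before slugify does; each one documents
    # a behaviour the implementation must satisfy.
    def test_basic(self):
        self.assertEqual(slugify("Hello World"), "hello-world")

    def test_punctuation_stripped(self):
        self.assertEqual(slugify("Arbor: MIS & tools!"), "arbor-mis-tools")

# Run with: python -m unittest <module>
```

The PHP equivalent would use PHPUnit with the same red-green-refactor rhythm.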
We also offer:
- Flexible work environment (3 days work from office)
- Group Term Life Insurance paid out at 3x Annual CTC (Arbor India)
- 32 days holiday (plus Arbor holidays), made up of 25 days annual leave plus 7 extra company-wide days given over Easter, summer and Christmas
- Work time: 9.30 am to 6 pm (8.5 hours only)
- Compensation: 100% fixed salary disbursement with no variable component

Interview process
- Phone screen
- 1st stage: assessment with a task
- 2nd stage
- Final round

We are committed to a fair and comfortable recruitment process, so if you require any reasonable adjustments during your application or interview process, please reach out to a member of the team at careers@arbor-education.com. Our commitment is also backed by our partnership with the neurodiversity consultancy Lexxic, who provide us with training, support and advice.

Arbor Education is an equal opportunities organisation. Our goal is for Arbor to be a workplace which represents, celebrates and supports people from all backgrounds, and which gives them the tools they need to thrive, whatever their ambitions may be, so we support and promote diversity and equality, and actively encourage applications from people of all backgrounds.

Refer a friend
Know someone else who would be good for this role? You can refer a friend, family member or colleague; if they are offered a role with Arbor, we will say thank you with a voucher valued at up to £200! Simply email: careers@arbor-education.com

Please note: We are unable to provide visa sponsorship at this time.
Posted 1 day ago
7.0 years
0 Lacs
Gurugram, Haryana, India
Remote
About The Role: Grade Level (for internal use): 09

S&P Global Mobility
The Role: Senior Salesforce Consultant

About The Role: We are seeking a highly skilled and experienced Senior Salesforce Consultant to join our team. In this role, you will lead the design, development, and implementation of Salesforce solutions that align with business goals. You will work closely with stakeholders, business analysts, and technical teams to deliver high-impact CRM solutions, drive user adoption, and ensure best practices in Salesforce architecture and delivery.

The Team: The Salesforce development team within Carfax of S&P Mobility is responsible for designing, building, and optimizing scalable solutions on the Salesforce platform to meet business needs and improve user experiences. The team thrives on collaboration, continuous learning, and innovation, often working cross-functionally to deliver high-impact features.

The Impact: The Salesforce developer role directly contributes to business growth by streamlining operations, automating key processes, and enabling data-driven decision-making through tailored Salesforce solutions. Salesforce developers help the business stay competitive in the market by rapidly adapting to client needs and industry trends through scalable, efficient technology.

What’s In It For You:
- High-impact work: Contribute to mission-critical projects that shape business strategy and directly influence client experiences across global markets.
- Professional growth: Gain continuous learning opportunities through hands-on development, certifications, and exposure to the latest Salesforce technologies and tools.
- Strategic exposure: Collaborate with cross-functional teams, including senior stakeholders and policy-makers, gaining insight into high-level decision-making.
- Global reach: Work in a dynamic, international environment that offers the chance to develop scalable solutions used across multiple regions and industries.

Key Responsibilities:
- Collaborate with business stakeholders to understand requirements and translate them into scalable Salesforce solutions.
- Design and implement Salesforce configurations, customizations, and integrations.
- Provide technical solutions and establish best practices across integration, application development, deployment, testing (both unit and system), and iterative improvements.
- Mentor junior consultants and developers on Salesforce best practices.
- Conduct workshops, discovery sessions, and training to support project delivery and user adoption.
- Create detailed documentation including technical designs, data models, and process flows.
- Provide expert guidance on Salesforce products such as Sales Cloud, Service Cloud, and/or CPQ.
- Manage project timelines, deliverables, and stakeholder communication.
- Ensure data integrity, security, and compliance within the Salesforce platform.
- Stay current on Salesforce releases, features, and industry trends.
- Explore new Salesforce offerings, work with product experts to explore new solutions, demo to the Agilent business team, and prepare the roadmap for future growth.

Required Qualifications:
- Bachelor's/Master's degree in Computer Science, Information Systems, or a related field.
- 7+ years of hands-on experience with Salesforce CRM implementation and consulting.
- Strong understanding of Salesforce architecture, data modeling, and the development lifecycle.
- Proficiency in declarative tools (Flows, Process Builder, Lightning App Builder) and Apex/Visualforce/LWC development.
- Understanding and working knowledge of integrating third-party components with Salesforce using REST/SOAP APIs or Data Loader.
- Strong hands-on experience utilizing Salesforce Apex, Visualforce, Lightning Web Components, SOQL/SOSL, and DML for customization and development.
- Experience completing multiple end-to-end Salesforce.com implementation projects requiring integration into legacy and other ERP systems using Salesforce APIs.
- Experience building applications with no-code/low-code tools such as Flow Builder and Process Builder.
- Experience with object modeling in Salesforce and an understanding of fundamental database concepts.
- Experience with Agile methodologies, JIRA, and deployment tools.
- Ability to multi-task and handle fast-paced situations.
- Excellent oral and written communication skills.
- Ability to be highly productive, both working alone and in close collaboration within a team, using good judgment to make effective decisions with appropriate risk mitigation.
- Experience with integration tools (Informatica, etc.) and APIs.
- Ability to handle Salesforce admin-side implementations and configurations.
- Salesforce certifications such as Salesforce Certified Administrator, Platform Developer I & II, and Salesforce Certified Consultant (Sales/Service Cloud).
- Excellent communication and stakeholder management skills.
- Proven ability to manage multiple projects and deliver high-quality work on time.

What We Offer:
- Competitive salary and performance bonuses.
- Flexible working arrangements (remote/hybrid).
- Ongoing learning and certification support.
- Dynamic, collaborative team environment.
- Opportunities to work on high-impact Salesforce projects.

S&P Global delivers essential intelligence that powers decision making. We provide the world’s leading organizations with the right data, connected technologies and expertise they need to move ahead. As part of our team, you’ll help solve complex challenges that equip businesses, governments and individuals with the knowledge to adapt to a changing economic landscape. S&P Global Mobility turns invaluable insights captured from automotive data to help our clients understand today’s market, reach more customers, and shape the future of automotive mobility.
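As a quick illustration of the SOQL and REST-API integration skills listed in the qualifications above, the sketch below builds and issues a Salesforce REST API query from Python. The instance URL, access token, and API version are hypothetical placeholders; a real integration would obtain credentials through OAuth rather than hard-coding them.

```python
import json
import urllib.request
from urllib.parse import quote

# Hypothetical values -- a real integration would obtain these via an OAuth flow.
INSTANCE_URL = "https://example.my.salesforce.com"
ACCESS_TOKEN = "00D-hypothetical-token"
API_VERSION = "v60.0"

def build_query_url(soql: str) -> str:
    """Build the standard Salesforce REST endpoint URL for a SOQL query."""
    return f"{INSTANCE_URL}/services/data/{API_VERSION}/query?q={quote(soql)}"

def fetch_accounts(limit: int = 5) -> dict:
    """Run a simple SOQL query against the Salesforce REST query resource."""
    url = build_query_url(f"SELECT Id, Name FROM Account LIMIT {limit}")
    req = urllib.request.Request(
        url, headers={"Authorization": f"Bearer {ACCESS_TOKEN}"}
    )
    # Network call; requires a real instance and valid credentials.
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

print(build_query_url("SELECT Id FROM Account"))
```

The same pattern (URL-encoded SOQL against the `/services/data/<version>/query` resource with a bearer token) underlies most lightweight third-party integrations; SOAP APIs and Data Loader cover the bulk and legacy cases.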
About S&P Global Mobility

At S&P Global Mobility, we provide invaluable insights derived from unmatched automotive data, enabling our customers to anticipate change and make decisions with conviction. Our expertise helps them to optimize their businesses, reach the right consumers, and shape the future of mobility. We open the door to automotive innovation, revealing the buying patterns of today and helping customers plan for the emerging technologies of tomorrow. For more information, visit www.spglobal.com/mobility.

What’s In It For You?

Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology: the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress.

Our People: We're more than 35,000 strong worldwide, so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all, from finding new ways to measure sustainability, to analyzing energy transition across the supply chain, to building workflow solutions that make it easy to tap into insight and apply it. We are changing the way people see things and empowering them to make an impact on the world we live in. We’re committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We’re constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference.
Our Values: Integrity, Discovery, Partnership

At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals.

Benefits: We take care of you, so you can take care of business. We care about our people. That’s why we provide everything you and your career need to thrive at S&P Global.

Our Benefits Include:
- Health & Wellness: Health care coverage designed for the mind and body.
- Flexible Downtime: Generous time off helps keep you energized for your time on.
- Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills.
- Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs.
- Family Friendly Perks: It’s not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families.
- Beyond the Basics: From retail discounts to referral incentive awards, small perks can make a big difference.

For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries

Global Hiring And Opportunity At S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets.
Recruitment Fraud Alert: If you receive an email from a spglobalind.com domain or any other regionally based domains, it is a scam and should be reported to reportfraud@spglobal.com. S&P Global never requires any candidate to pay money for job applications, interviews, offer letters, “pre-employment training” or for equipment/delivery of equipment. Stay informed and protect yourself from recruitment fraud by reviewing our guidelines, fraudulent domains, and how to report suspicious activity here.

Equal Opportunity Employer: S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to: EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person.

US Candidates Only: The EEO is the Law Poster http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision - https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf

20 - Professional (EEO-2 Job Categories-United States of America), IFTECH202.1 - Middle Professional Tier I (EEO Job Group), SWP Priority – Ratings - (Strategic Workforce Planning)

Job ID: 318709
Posted On: 2025-08-07
Location: Gurgaon, Haryana, India