10.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Who we are
Mindtickle is the market-leading revenue productivity platform that combines on-the-job learning and deal execution to get more revenue per rep. Mindtickle is recognized as a market leader by top industry analysts and is ranked by G2 as the #1 sales onboarding and training product. We're honoured to be recognized as a Leader in the first-ever Forrester Wave™: Revenue Enablement Platforms, Q3 2024!

Job Snapshot
We're looking for a Senior Technical Program Manager (TPM) to join our Technical Solutions team and lead strategic, customer-facing solution projects. This is a highly experienced, individual contributor role that combines the technical expertise of a Solutions Architect with the disciplined execution of a seasoned TPM. You'll work directly with enterprise customers, internal engineering, product, and services teams to deliver complex technical programs that generate measurable business value. If you enjoy bringing clarity to chaos, translating business needs into scalable technical solutions, and owning delivery from start to finish - we want to hear from you.

What's in it for you?
- Be the technical owner and advisor for enterprise customer engagements - designing, guiding, and delivering scalable solutions.
- Own timelines, risk mitigation, and stakeholder alignment across all levels, fostering transparent communication around shared outcomes and customer expectations.
- Drive end-to-end execution of technical programs across integrations, data pipelines, custom reporting, and automation workflows.
- Manage cross-functional delivery across Engineering, Product, and Services to ensure high-quality outcomes.
- Communicate with precision - from technical deep dives to executive-level status updates.
- Collaborate with Sales and Pre-Sales for scoping, and with Customer Success to drive post-go-live adoption.
- Serve as the primary TPM interface for strategic enterprise customers - leading technical planning, progress reviews, and risk discussions, and ensuring alignment among diverse stakeholder groups.
- Define, implement, and continuously evolve solution delivery standards, customer onboarding playbooks, and escalation management procedures to enable consistent, scalable service excellence.
- Build and institutionalize reusable frameworks and tools to enable predictable, repeatable execution in a fast-paced, dynamic customer-facing delivery environment.
- Act as the "connective tissue" across the services organization, bridging strategy with execution and aligning efforts between Professional Services (PS), Managed Services (MS), Customer Success Managers (CSMs), and Support teams.
- Lead operational excellence initiatives focused on process optimization, automation, and scalability to support growing customer demands and business complexity.
- Define and track key delivery metrics and KPIs (e.g., SLA adherence, escalation resolution time, customer satisfaction, throughput) with a focus on continuous improvement and scale readiness.
- Bring strong "T-shaped" technical expertise - a deep understanding of relevant technologies, architectures, and integration points - to effectively evaluate technical feasibility, risks, and solutions.
- Operate with an enterprise mindset, addressing complex, multi-stakeholder environments, compliance requirements, and large-scale deployments.
- Deliver executive-ready communications on delivery status, risk posture, and critical decisions to senior leadership and customer executives.

We'd love to hear from you, if you bring:
- 10+ years of total experience in technical program management roles.
- Demonstrated success in similar TPM roles within enterprise technology companies, managing strategic, technical, and customer-facing delivery in services or solutions teams.
- Proven ability to operate effectively at the intersection of technology, process, and customer engagement in complex, matrixed organizations.
- Proven ability to lead enterprise programs with customer-facing responsibilities.
- A strong technical foundation in APIs, integrations, SSO, ETL tools, databases, scripting, and cloud architectures.
- Skill in driving large-scale initiatives involving cross-functional teams, multiple stakeholders, and high business impact.
- Exceptional project leadership, stakeholder management, and communication skills.
- A sharp business and product mindset - you understand the "why," not just the "how."
- Comfort with ambiguity, ownership, and fast-paced environments.

Our culture & accolades
As an organization, it's our priority to create a highly engaging and rewarding workplace. We offer tons of awesome perks and many opportunities for growth. Our culture reflects our employees' globally diverse backgrounds, our commitment to our customers and each other, and a passion for excellence. We live up to our values, DAB: Delight your customers, Act as a Founder, and Better Together.

Mindtickle is proud to be an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, colour, religion, sex, national origin, disability, protected veteran status, or any other characteristic protected by law.

Your Right to Work - In compliance with applicable laws, all persons hired will be required to verify identity and eligibility to work in the respective work locations and to complete the required employment eligibility verification document form upon hire.
Posted 6 days ago
8.0 years
0 Lacs
India
On-site
Job Title: SAP Data Lead – TM & EWM

Job Summary:
We are seeking a skilled SAP Data Lead to drive data strategy and execution for a large-scale Brownfield migration from SAP ECC to S/4HANA 2023 Private Cloud Edition (AWS) under the RISE with SAP framework. The ideal candidate will have deep experience in data migration and data governance, with a strong functional understanding of the SAP TM and EWM modules.

Key Responsibilities:
- Own and lead the end-to-end data migration lifecycle for the TM and EWM modules during the ECC to S/4HANA conversion.
- Define and implement the data strategy, including data cleansing, harmonization, mapping, and transformation.
- Collaborate with functional teams to understand master and transactional data requirements in TM and EWM.
- Coordinate with business stakeholders to validate legacy data and ensure alignment with S/4HANA data models.
- Utilize SAP tools such as SAP Migration Cockpit, LTMC, LTMOM, and Data Services for migration execution.
- Ensure data quality and integrity through rigorous validation, reconciliation, and audit processes.
- Support cutover planning and execution, including mock runs and go-live readiness.
- Work closely with BASIS and infrastructure teams to manage data loads in Private Cloud environments.
- Provide guidance on data governance, ownership, and ongoing maintenance post-migration.
- Document data migration processes and provide training to key users and stakeholders.

Required Skills & Experience:
- 8+ years of experience in SAP data management, with at least 2 full-cycle S/4HANA migration projects.
- Strong hands-on experience with SAP TM and EWM data structures and business processes.
- Proven expertise in data migration tools and methodologies (SAP Migration Cockpit, LSMW, LTMC, etc.).
- Experience with RISE with SAP and Private Cloud Edition (PCE) deployments.
- Familiarity with SAP ECC and S/4HANA data models, especially in logistics and supply chain domains.
- Excellent analytical, problem-solving, and stakeholder management skills.

Preferred Qualifications:
- SAP certification in Data Management, TM, or EWM.
- Experience with Agile/Scrum methodologies.
- Exposure to SAP BODS, MDG, or third-party ETL tools.
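The posting's emphasis on "rigorous validation, reconciliation, and audit processes" comes down to comparing the legacy dataset with the migrated one. A minimal, hedged sketch of that principle in plain Python (the actual work would use SAP's own tooling such as the Migration Cockpit; the document numbers, field names, and fingerprinting approach here are invented for illustration):

```python
import hashlib

def table_fingerprint(rows, key_fields):
    """Order-independent fingerprint: XOR of per-row SHA-256 hashes over key fields."""
    fp = 0
    for row in rows:
        key = "|".join(str(row[f]) for f in key_fields)
        fp ^= int(hashlib.sha256(key.encode()).hexdigest(), 16)
    return fp

def reconcile(legacy_rows, target_rows, key_fields):
    """Compare row counts and key fingerprints between legacy and migrated data."""
    return {
        "row_count_match": len(legacy_rows) == len(target_rows),
        "key_fingerprint_match": table_fingerprint(legacy_rows, key_fields)
                                 == table_fingerprint(target_rows, key_fields),
    }

legacy = [{"doc": "100001", "plant": "P100"}, {"doc": "100002", "plant": "P200"}]
migrated = [{"doc": "100002", "plant": "P200"}, {"doc": "100001", "plant": "P100"}]
print(reconcile(legacy, migrated, ["doc", "plant"]))
# {'row_count_match': True, 'key_fingerprint_match': True}
```

Because the fingerprint is an XOR of per-row hashes, it is insensitive to load order, which is useful when the target system returns rows in a different sequence than the legacy extract.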
Posted 6 days ago
6.0 years
0 Lacs
India
On-site
Job Description:

Responsibilities:
- Develop and implement data models and algorithms to solve complex business problems.
- Utilize Databricks to manage and analyse large datasets efficiently.
- Collaborate with cross-functional teams to understand business requirements and deliver data-driven insights.
- Design and build scalable data pipelines and ETL processes.
- Perform data exploration, preprocessing, and feature engineering.
- Conduct statistical analysis and machine learning model development.
- Communicate findings and insights to stakeholders through data visualization and reports.
- Stay current with industry trends and best practices in data science and big data technologies.

Requirements:
- Minimum 6 years of experience as a Data Scientist required.
- Proven experience as a Data Scientist or similar role.
- Proficiency with Databricks and its ecosystem.
- Strong programming skills in Python, R, or Scala.
- Experience with big data technologies such as Apache Spark and Databricks.
- Knowledge of SQL and experience with relational databases.
- Familiarity with cloud platforms (e.g., AWS, Azure, Google Cloud).
- Strong analytical and problem-solving skills.
- Excellent communication and teamwork abilities.
- Bachelor's degree in Data Science, Computer Science, Statistics, or a related field (or equivalent experience).

Preferred Qualifications:
- Advanced degree (Master's or Ph.D.) in a relevant field.
- Experience with machine learning frameworks (e.g., TensorFlow, PyTorch).
- Knowledge of data visualization tools (e.g., Tableau, Power BI).
- Familiarity with version control systems (e.g., Git).
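The "preprocessing and feature engineering" responsibility above typically means transformations like standardizing numeric features and encoding categorical ones. A minimal stdlib sketch of both (in a Databricks setting this would normally be done with Spark or MLlib transformers; the sample features here are invented):

```python
from statistics import mean, pstdev

def zscore_standardize(values):
    """Standardize a numeric feature to zero mean and unit variance (z-score)."""
    mu, sigma = mean(values), pstdev(values)
    return [(v - mu) / sigma for v in values]

def one_hot(values):
    """One-hot encode a categorical feature; categories are sorted for determinism."""
    categories = sorted(set(values))
    return [[1 if v == c else 0 for c in categories] for v in values]

ages = [20, 30, 40]            # hypothetical numeric feature
plans = ["free", "pro", "free"]  # hypothetical categorical feature
print(zscore_standardize(ages))
print(one_hot(plans))  # [[1, 0], [0, 1], [1, 0]]
```

The same two operations scale to cluster-sized data via Spark's `StandardScaler` and `OneHotEncoder`; the logic is identical.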
Posted 6 days ago
5.0 years
0 Lacs
India
On-site
Coursera was launched in 2012 by Andrew Ng and Daphne Koller with a mission to provide universal access to world-class learning. It is now one of the largest online learning platforms in the world, with 183 million registered learners as of June 30, 2025. Coursera partners with over 350 leading university and industry partners to offer a broad catalog of content and credentials, including courses, Specializations, Professional Certificates, and degrees. Coursera's platform innovations enable instructors to deliver scalable, personalized, and verified learning experiences to their learners. Institutions worldwide rely on Coursera to upskill and reskill their employees, citizens, and students in high-demand fields such as GenAI, data science, technology, and business. Coursera is a Delaware public benefit corporation and a B Corp.

Join us in our mission to create a world where anyone, anywhere can transform their life through access to education. We're seeking talented individuals who share our passion and drive to revolutionize the way the world learns.

At Coursera, we are committed to building a globally diverse team and are thrilled to extend employment opportunities to individuals in any country where we have a legal entity. We require candidates to possess eligible working rights and have a compatible timezone overlap with their team to facilitate seamless collaboration.

Coursera has a commitment to enabling flexibility and workspace choices for employees. Our interviews and onboarding are entirely virtual, providing a smooth and efficient experience for our candidates. As an employee, we enable you to select your main way of working, whether it's from home, one of our offices or hubs, or a co-working space near you.

Job Overview:
Does architecting high-quality and scalable data pipelines powering business-critical applications excite you? How about working with cutting-edge technologies alongside some of the brightest and most collaborative individuals in the industry? Join us, in our mission to bring the best learning to every corner of the world!

We're looking for a passionate and talented individual with a keen eye for data to join the Data Engineering team at Coursera! Data Engineering plays a crucial role in building a robust and reliable data infrastructure that enables data-driven decision-making, as well as various data analytics and machine learning initiatives within Coursera. In addition, Data Engineering today owns many external-facing data products that drive revenue and boost partner and learner satisfaction.

You firmly believe in Coursera's potential to make a significant impact on the world, and align with our core values:
- Learners first: Champion the needs, potential, and progress of learners everywhere.
- Play for team Coursera: Excel as an individual and win as a team. Put Coursera's mission and results before personal goals.
- Maximize impact: Increase leverage by focusing on things that produce bigger results with less effort.
- Learn, change, and grow: Move fast, take risks, innovate, and learn quickly. Invite and offer feedback with respect, courage, and candor.
- Love without limits: Celebrate the diversity and dignity of every one of our employees, learners, customers, and partners.

Your Responsibilities
- Architect scalable data models and construct high-quality ETL pipelines that act as the backbone of our core data lake, with cutting-edge technologies such as Airflow, DBT, Databricks, Redshift, and Spark. Your work will lay the foundation for our data-driven culture.
- Design, build, and launch self-serve analytics products. Your creations will empower our internal and external customers, providing them with rich insights to make informed decisions.
- Be a technical leader for the team. Your guidance in technical and architectural designs for major team initiatives will inspire others. Help shape the future of Data Engineering at Coursera and foster a culture of continuous learning and growth.
- Partner with data scientists, business stakeholders, and product engineers to define, curate, and govern high-fidelity data.
- Develop new tools and frameworks in collaboration with other engineers. Your innovative solutions will enable our customers to understand and access data more efficiently, while adhering to high standards of governance and compliance.
- Work cross-functionally with product managers, engineers, and business teams to enable major product and feature launches.

Your Skills
- 5+ years of experience in data engineering with expertise in data architecture and pipelines
- Strong programming skills in Python
- Proficiency with relational databases, data modeling, and SQL
- Experience with big data technologies (e.g., Hive, Spark, Presto)
- Familiarity with batch and streaming architectures preferred
- Hands-on experience with some of: AWS, Databricks, Delta Lake, Airflow, DBT, Redshift, Datahub, Elementary
- Knowledge of data governance and compliance best practices
- Ability to communicate technical concepts clearly and concisely
- Independence and passion for innovation and learning new technologies

If this opportunity interests you, you might like these courses on Coursera:
- Big Data Specialization
- Data Warehousing for Business Intelligence
- IBM Data Engineering Professional Certificate

Coursera is an Equal Employment Opportunity Employer and considers all qualified applicants without regard to race, color, religion, sex, sexual orientation, gender identity, age, marital status, national origin, protected veteran status, disability, or any other legally protected class.

If you are an individual with a disability and require a reasonable accommodation to complete any part of the application process, please contact us at accommodations@coursera.org.

For California Candidates, please review our CCPA Applicant Notice here.
For our Global Candidates, please review our GDPR Recruitment Notice here.
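The ETL responsibilities described in this posting follow the standard extract-transform-load pattern. A minimal, hedged sketch in plain Python, with an in-memory SQLite table standing in for a warehouse such as Redshift (the event fields, statuses, and table name are invented for illustration):

```python
import sqlite3

def run_etl(raw_events, conn):
    """Minimal batch ETL: filter raw events (transform) and load into a warehouse table."""
    # Transform: keep completed enrollments only, normalize field names
    rows = [(e["user"], e["course"]) for e in raw_events if e.get("status") == "completed"]
    # Load
    conn.execute("CREATE TABLE IF NOT EXISTS completions (user_id TEXT, course_id TEXT)")
    conn.executemany("INSERT INTO completions VALUES (?, ?)", rows)
    conn.commit()
    return len(rows)  # rows loaded this run

conn = sqlite3.connect(":memory:")
events = [
    {"user": "u1", "course": "ml-101", "status": "completed"},
    {"user": "u2", "course": "ml-101", "status": "enrolled"},
]
print(run_etl(events, conn))  # 1
```

In a production stack the same shape would be split across Airflow tasks (extract, transform, load) with DBT handling downstream SQL models; the sketch only shows the data flow.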
Posted 6 days ago
5.0 years
0 Lacs
Gurugram, Haryana, India
On-site
The BI Data Engineer is a key role within the Enterprise Data team. We are looking for an expert Azure data engineer with deep data engineering, ADF integration, and database development experience. This is a unique opportunity to be involved in delivering leading-edge business analytics using the latest cutting-edge BI tools, such as cloud-based databases, self-service analytics, and leading visualisation tools, enabling the company's aim to become a fully digital organisation.

Job Description:

Key Responsibilities:
- Build enterprise data engineering and integration solutions using the latest Azure platform: Azure Data Factory, Azure SQL Database, Azure Synapse, and Azure Fabric
- Develop enterprise ETL and integration routines using ADF
- Evaluate emerging data engineering technologies, standards, and capabilities
- Partner with business stakeholders, product managers, and data scientists to understand business objectives and translate them into technical solutions
- Work with DevOps, engineering, and operations teams to implement CI/CD pipelines and ensure smooth deployment of data engineering solutions

Required Skills And Experience

Technical Expertise:
- Expertise in the Azure platform, including Azure Data Factory, Azure SQL Database, Azure Synapse, and Azure Fabric
- Exposure to Databricks and lakehouse architectures & technologies
- Extensive knowledge of data modeling, ETL processes, and data warehouse design principles
- Experience in machine learning and AI services in Azure

Professional Experience:
- 5+ years of experience in database development using SQL
- 5+ years of integration and data engineering experience
- 5+ years of experience using Azure SQL DB, ADF, and Azure Synapse
- 2+ years of experience using Power BI
- Comprehensive understanding of data modelling
- Relevant certifications in data engineering, machine learning, or AI

Key Competencies:
- Expertise in data engineering and database development
- Familiarity with the Microsoft Fabric technologies, including OneLake, Lakehouse, and Data Factory
- Strong understanding of data governance, compliance, and security frameworks
- Proven ability to drive innovation in data strategy and cloud solutions
- A deep understanding of business intelligence workflows and the ability to align technical solutions with them
- Strong database design skills, including an understanding of both normalised-form and dimensional-form databases
- In-depth knowledge and experience of data-warehousing strategies and techniques, e.g., Kimball data warehousing
- Experience in cloud-based data integration tools like Azure Data Factory
- Experience in Azure DevOps or JIRA is a plus
- Experience working with finance data is highly desirable
- Familiarity with agile development techniques and objectives

Location: Pune
Brand: Dentsu
Time Type: Full time
Contract Type: Permanent
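The ETL and integration routines this role builds in ADF commonly rely on the high-watermark pattern for incremental loads: each run extracts only rows modified since the previous run, then advances the watermark. A minimal sketch in plain Python (ADF implements this with Lookup and Copy activities against a watermark table; the field names and ISO date strings here are invented for illustration):

```python
def incremental_extract(source_rows, watermark):
    """High-watermark incremental load: pick up only rows modified after the last run,
    and return the new watermark to persist for the next run."""
    new_rows = [r for r in source_rows if r["modified"] > watermark]
    new_watermark = max((r["modified"] for r in new_rows), default=watermark)
    return new_rows, new_watermark

source = [
    {"id": 1, "modified": "2024-01-01"},
    {"id": 2, "modified": "2024-02-01"},
    {"id": 3, "modified": "2024-03-01"},
]
batch, wm = incremental_extract(source, "2024-01-15")
print([r["id"] for r in batch], wm)  # [2, 3] 2024-03-01
```

ISO-8601 date strings compare correctly as plain strings, which is why the sketch needs no date parsing; a real pipeline would persist the watermark between runs.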
Posted 6 days ago
5.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Talent Worx is thrilled to announce an exciting opportunity for the roles of Snowflake and Spark Developers! Join us in revolutionizing the data analytics landscape as we partner with one of the Big 4 firms in India.

What impact will you make?
Your contributions will play a vital role in shaping our clients' success stories by utilizing innovative technologies and frameworks. Envision a dynamic culture that supports inclusion, collaboration, and exceptional performance. With us, you will discover unrivaled opportunities to accelerate your career and achieve your goals.

The Team
In our Analytics & Cognitive (A&C) practice, you will find a dedicated team committed to unlocking the value hidden within large datasets. Our globally connected network ensures that our clients gain actionable insights that support fact-driven decision-making, leveraging advanced techniques including big data, cloud computing, cognitive capabilities, and machine learning.

Work you will do
As a key player in our organization, you will contribute directly to enhancing our clients' competitive positioning and performance with innovative and sustainable solutions. We expect you to collaborate closely with our teams and clients to deliver outstanding results across various projects.

Requirements
- 5+ years of relevant experience in Spark and Snowflake with practical experience in at least one project implementation
- Strong experience in developing ETL pipelines and data processing workflows using Spark
- Expertise in Snowflake architecture, including data loading and unloading processes, table structures, and virtual warehouses
- Proficiency in writing complex SQL queries in Snowflake for data transformation and analysis
- Experience with data integration tools and techniques, ensuring the seamless ingestion of data
- Familiarity with building and monitoring data pipelines in a cloud environment
- Exposure to Agile methodology and tools like Jira and Confluence
- Strong analytical and problem-solving skills, with meticulous attention to detail
- Excellent communication and interpersonal skills to foster collaboration with clients and team members
- Ability to travel as required by project demands

Qualifications
- Snowflake certification or equivalent qualification is a plus
- Prior experience working with both Snowflake and Spark in a corporate setting
- Formal education in Computer Science, Information Technology, or a related field
- Proven track record of working with cross-functional teams

Benefits
- Work with one of the Big 4 firms in India
- Healthy work environment
- Work-life balance
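The "complex SQL queries for data transformation and analysis" requirement above is about shaping raw tables into analytical aggregates. A hedged sketch of the pattern, using Python's built-in SQLite as a stand-in engine (Snowflake SQL adds warehouse-specific features on top, but the aggregation logic is the same; the orders table and columns are invented):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE orders (customer TEXT, amount REAL, region TEXT);
INSERT INTO orders VALUES ('a', 100, 'EU'), ('b', 250, 'EU'), ('c', 80, 'US');
""")

# A typical transformation query: aggregate revenue per region, highest first.
query = """
SELECT region, SUM(amount) AS revenue
FROM orders
GROUP BY region
ORDER BY revenue DESC
"""
print(conn.execute(query).fetchall())  # [('EU', 350.0), ('US', 80.0)]
```

In Snowflake the same query would typically run against a virtual warehouse and feed a downstream reporting view.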
Posted 6 days ago
6.0 years
4 - 6 Lacs
Hyderābād
On-site
We are seeking a Senior Data Engineer for our Marketing team in Thomson Reuters. Design and develop our data transformation initiatives as we build the data foundation to drive our marketing strategy, enhancing our internal and external customer experiences and personalization. This is a mission-critical role with substantial scope, complexity, and executive visibility, and a large opportunity for impact. You will play a critical role in ensuring that customer data is effectively managed and utilized to drive business insights, facilitating informed decision-making and helping Thomson Reuters rapidly scale our digital customer experiences.

About the Role
In this role as a Senior Data Engineer, you will:
- Independently own and manage assigned projects and meet deadlines, clearly communicating progress and barriers to your manager and stakeholders.
- Serve as a visible Subject Matter Expert on our Customer Data Platform, maintaining up-to-date awareness of industry trends, cutting-edge technologies, and best practices on relevant topics including unified customer profiles, deterministic and probabilistic matching, identity graphs, data enrichment, etc.
- Design and implement data ingestion pipelines to collect and ingest customer data into the Customer Data Platform from various sources. This involves setting up data pipelines, APIs, and ETL (Extract, Transform, Load) processes.
- Create and design data models, schemas, and database structures in Snowflake and the Customer Data Platform.
- Carry out comprehensive data analysis from various system sources to yield enhanced insights into customer behavior and preferences.
- Gather and analyze data from various touchpoints, including online interactions, transactional systems, and customer feedback channels, creating a comprehensive customer profile that presents a 360-degree view.
- Ensure the launch of new data, segmentation, and profile capabilities, as well as evolutions of the platform, go smoothly. This includes testing, post-launch monitoring, and overall setup for long-term success.
- Collaborate with marketers and other stakeholders to understand their data needs and translate those needs into technical requirements.
- Actively identify and propose innovations in data practices that evolve capabilities, improve efficiency or standardization, and better support stakeholders.

Shift Timings: 2 PM to 11 PM (IST). Work from office for 2 days in a week (Mandatory).

About You
You're a fit for the role of Senior Data Engineer if your background includes:
- Bachelor's or master's degree in data science, business, technology, or an equivalent field.
- Strong data engineering background with 6+ years of experience working on large data transformation projects related to customer data platforms, identity resolution, and identity graphs.
- Solid foundation in SQL and familiarity with other query engines, along with hands-on experience with Snowflake, AWS Cloud, DBT, and real-time APIs.
- Expertise in using Presto for querying data across multiple sources and Digdag for workflow management, including the ability to create, schedule, and monitor data workflows.
- Proficiency in configuring and implementing an industry-leading customer data platform, including data integration, segmentation, and activations, is a must.
- Experience using marketing data sources such as CRM (especially Salesforce), marketing automation platforms (especially Eloqua), and web tracking (Adobe Analytics) is a plus.
- Exposure to Gen AI, capable of leveraging AI solutions to address complex data challenges.
- Excellent oral, written, and visual (PowerPoint slides) communication skills, especially in breaking down complex information into understandable pieces, telling stories with data, and translating technical concepts for non-technical audiences.
- Strong ability to organize, prioritize, and complete tasks with a high attention to detail, even in the face of ambiguity and environmental barriers.
- Knowledge of marketing or digital domains and of the professional services industry (especially legal, tax, and accounting) is a plus.
- Experience working in iterative development and a solid grasp of agile practices.

#LI-GS2

What's in it For You?
- Hybrid Work Model: We've adopted a flexible hybrid working environment (2-3 days a week in the office depending on the role) for our office-based roles while delivering a seamless experience that is digitally and physically connected.
- Flexibility & Work-Life Balance: Flex My Way is a set of supportive workplace policies designed to help manage personal and professional responsibilities, whether caring for family, giving back to the community, or finding time to refresh and reset. This builds upon our flexible work arrangements, including work from anywhere for up to 8 weeks per year, empowering employees to achieve a better work-life balance.
- Career Development and Growth: By fostering a culture of continuous learning and skill development, we prepare our talent to tackle tomorrow's challenges and deliver real-world solutions. Our Grow My Way programming and skills-first approach ensures you have the tools and knowledge to grow, lead, and thrive in an AI-enabled future.
- Industry Competitive Benefits: We offer comprehensive benefit plans including flexible vacation, two company-wide Mental Health Days off, access to the Headspace app, retirement savings, tuition reimbursement, employee incentive programs, and resources for mental, physical, and financial wellbeing.
- Culture: Globally recognized, award-winning reputation for inclusion and belonging, flexibility, work-life balance, and more. We live by our values: Obsess over our Customers, Compete to Win, Challenge (Y)our Thinking, Act Fast / Learn Fast, and Stronger Together.
- Social Impact: Make an impact in your community with our Social Impact Institute. We offer employees two paid volunteer days off annually and opportunities to get involved with pro-bono consulting projects and Environmental, Social, and Governance (ESG) initiatives.
- Making a Real-World Impact: We are one of the few companies globally that helps its customers pursue justice, truth, and transparency. Together, with the professionals and institutions we serve, we help uphold the rule of law, turn the wheels of commerce, catch bad actors, report the facts, and provide trusted, unbiased information to people all over the world.

About Us
Thomson Reuters informs the way forward by bringing together the trusted content and technology that people and organizations need to make the right decisions. We serve professionals across legal, tax, accounting, compliance, government, and media. Our products combine highly specialized software and insights to empower professionals with the data, intelligence, and solutions needed to make informed decisions, and to help institutions in their pursuit of justice, truth, and transparency. Reuters, part of Thomson Reuters, is a world-leading provider of trusted journalism and news.

We are powered by the talents of 26,000 employees across more than 70 countries, where everyone has a chance to contribute and grow professionally in flexible work environments. At a time when objectivity, accuracy, fairness, and transparency are under attack, we consider it our duty to pursue them. Sound exciting? Join us and help shape the industries that move society forward. As a global business, we rely on the unique backgrounds, perspectives, and experiences of all employees to deliver on our business goals.
To ensure we can do that, we seek talented, qualified employees in all our operations around the world regardless of race, color, sex/gender, including pregnancy, gender identity and expression, national origin, religion, sexual orientation, disability, age, marital status, citizen status, veteran status, or any other protected classification under applicable law. Thomson Reuters is proud to be an Equal Employment Opportunity Employer providing a drug-free workplace. We also make reasonable accommodations for qualified individuals with disabilities and for sincerely held religious beliefs in accordance with applicable law. More information on requesting an accommodation here. Learn more on how to protect yourself from fraudulent job postings here. More information about Thomson Reuters can be found on thomsonreuters.com.
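The deterministic matching and unified-profile work this role centers on reduces to a connected-components problem: records from different systems that share an identifier collapse into one profile. A hedged union-find sketch in plain Python (real CDPs also apply probabilistic matching and richer identity graphs; the record IDs and email keys here are invented for illustration):

```python
def resolve_identities(records):
    """Deterministic identity resolution: records sharing any email
    collapse into one unified profile, via union-find."""
    parent = {}

    def find(x):
        parent.setdefault(x, x)
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    def union(a, b):
        parent[find(a)] = find(b)

    email_owner = {}
    for rec in records:
        for email in rec["emails"]:
            if email in email_owner:
                union(rec["id"], email_owner[email])  # shared key: merge profiles
            else:
                email_owner[email] = rec["id"]

    clusters = {}
    for rec in records:
        clusters.setdefault(find(rec["id"]), []).append(rec["id"])
    return sorted(sorted(c) for c in clusters.values())

records = [
    {"id": "crm-1", "emails": ["pat@example.com"]},
    {"id": "web-7", "emails": ["pat@example.com", "p.lee@example.com"]},
    {"id": "crm-2", "emails": ["sam@example.com"]},
]
print(resolve_identities(records))  # [['crm-1', 'web-7'], ['crm-2']]
```

Each output cluster corresponds to one unified, 360-degree customer profile; adding further match keys (phone, device ID) only requires extending the shared-key loop.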
Posted 6 days ago
0 years
3 - 6 Lacs
Hyderābād
On-site
Company Description
Experian is a global data and technology company, powering opportunities for people and businesses around the world. We help to redefine lending practices, uncover and prevent fraud, simplify healthcare, and create marketing solutions, all using our unique combination of data, analytics and software. We also assist millions of people to realise their financial goals and help them save time and money. We operate across a range of markets, from financial services to healthcare, automotive, agribusiness, insurance, and many more industry segments. We invest in people and new advanced technologies to unlock the power of data. As a FTSE 100 Index company listed on the London Stock Exchange (EXPN), we have a team of 22,500 people across 32 countries. Our corporate headquarters are in Dublin, Ireland. Learn more at experianplc.com

Job Description
We are looking for a Senior Software Engineer to join our Ascend Cloud Foundation Platform team.

Background:
We unlock the power of data to create opportunities for consumers, businesses and society. At life's big moments - from buying a home or car, to sending a child to university, to growing a business exponentially by connecting it with new customers - we empower consumers and our clients to manage their data with confidence so they can maximize every opportunity. We require a senior software engineer in Hyderabad, India to work alongside our UK colleagues to deliver business outcomes for the UK&I region. You will join an established agile technical team, where you will work with the Lead Engineer and Product Owner to help develop the consumer data attributes, and work with data analytics to validate the accuracy of the calculations whilst ensuring that you work to the highest technical standards.

Key responsibilities:
- Design, develop, and maintain scalable and efficient data pipelines and ETL processes to extract, transform, and load data from various sources into our data lake or warehouse.
- Collaborate with cross-functional teams including data scientists, analysts, and software engineers to understand data requirements, define data models, and implement solutions that meet business needs.
- Ensure the security, integrity, and quality of data throughout the data lifecycle, implementing best practices for data governance, encryption, and access control.
- Develop and maintain data infrastructure components such as data warehouses, data lakes, and data processing frameworks, leveraging cloud services (e.g., AWS, Azure, GCP) and containerization technologies (e.g., Docker, Kubernetes).
- Implement monitoring, logging, and alerting mechanisms to ensure the reliability and availability of data pipelines and systems, and to proactively identify and address issues.
- Work closely with stakeholders to understand business requirements, prioritize tasks, and deliver solutions in a timely manner within an Agile working environment.
- Collaborate with the risk, security, and compliance teams to ensure adherence to regulatory requirements (e.g., GDPR, PCI DSS) and industry standards related to data privacy and security.
- Stay updated on emerging technologies, tools, and best practices in the field of data engineering, and propose innovative solutions to improve efficiency, performance, and scalability.
- Mentor and coach junior engineers, fostering a culture of continuous learning and professional development within the team.
- Participate in code reviews, design discussions, and other Agile ceremonies to promote collaboration, transparency, and continuous improvement.
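The monitoring and alerting responsibility above usually starts with freshness checks: flag a pipeline whose last successful run has breached its SLA. A minimal sketch in plain Python (production systems would emit to a metrics backend rather than a log; the SLA figure and function name are invented for illustration):

```python
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("pipeline")

def check_freshness(last_success_age_minutes, sla_minutes=60):
    """Alerting rule: warn when the last successful pipeline run breaches the SLA."""
    if last_success_age_minutes > sla_minutes:
        log.warning("SLA breach: last success %s min ago (SLA %s min)",
                    last_success_age_minutes, sla_minutes)
        return "ALERT"
    log.info("Pipeline healthy: last success %s min ago", last_success_age_minutes)
    return "OK"

print(check_freshness(45))   # OK
print(check_freshness(120))  # ALERT
```

The same rule shape extends to row-count and schema-drift checks; the key design choice is returning a status the scheduler can act on rather than only logging.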
Qualifications
Qualified to degree, HND or HNC standard in a software engineering and/or data engineering discipline, or able to demonstrate equivalent commercial experience.

Required skills/experience:
- Experience of the full development lifecycle
- Strong communication skills, with the ability to explain solutions to technical and non-technical audiences
- Writes clean, scalable and re-usable code that implements SOLID principles and common design patterns where applicable, and adheres to published coding standards
- Excellent attention to detail, with the ability to analyse, investigate and compare large data sets when required
- 3 or more years of programming using Scala
- 2 or more years of programming using Python
- Some experience of using Terraform to provision and deploy cloud services and components
- Experience of developing on Apache Spark
- Experience of developing with AWS cloud services, including (but not limited to) AWS Glue, S3, Step Functions, Lambdas, EventBridge and SQS
- BDD/TDD experience
- Jenkins CI/CD experience
- Application lifecycle management tools: Bitbucket and Jira
- Performing pull request reviews
- Understanding of Agile methodologies
- Automated testing tools

Advantageous experience:
- Mentoring or coaching junior engineers
- Cloud solution architecture
- Document databases
- Relational databases
- Experience with container technologies (e.g. Kubernetes)

We would also consider alternative skills and experience:
- Java (rather than Scala)
- Google Cloud or Microsoft Azure (rather than AWS)
- Azure Pipelines or TeamCity (rather than Jenkins)
- GitHub (rather than Bitbucket)
- Azure DevOps (rather than Jira)
- CloudFormation (rather than Terraform)

Additional Information
Our uniqueness is that we celebrate yours. Experian's culture and people are important differentiators. We take our people agenda very seriously and focus on what matters: DEI, work/life balance, development, authenticity, collaboration, wellness, reward & recognition, volunteering... the list goes on.
Experian's people-first approach is award-winning: World's Best Workplaces™ 2024 (Fortune Global Top 25), Great Place To Work™ in 24 countries, and Glassdoor Best Places to Work 2024, to name a few. Check out Experian Life on social or our Careers Site and Glassdoor to understand why.

Experian is proud to be an Equal Opportunity and Affirmative Action employer. Innovation is a critical part of Experian's DNA and practices, and our diverse workforce drives our success. Everyone can succeed at Experian and bring their whole self to work, irrespective of their gender, ethnicity, religion, color, sexuality, physical ability or age. If you have a disability or special need that requires accommodation, please let us know at the earliest opportunity.

Benefits
Experian cares for employees' work-life balance, health, safety and wellbeing. In support of this endeavor, we offer best-in-class family well-being benefits, enhanced medical benefits and paid time off.

Experian Careers – Creating a better tomorrow together
Find out what it's like to work for Experian by clicking here.
Posted 6 days ago
5.0 - 7.0 years
4 - 10 Lacs
Hyderābād
On-site
Description
The U.S. Pharmacopeial Convention (USP) is an independent scientific organization that collaborates with the world's top authorities in health and science to develop quality standards for medicines, dietary supplements, and food ingredients. USP's fundamental belief that Equity = Excellence manifests in our core value of Passion for Quality through our more than 1,300 hard-working professionals across twenty global locations, delivering the mission to strengthen the supply of safe, quality medicines and supplements worldwide.

At USP, we value inclusivity for all. We recognize the importance of building an organizational culture with meaningful opportunities for mentorship and professional growth. From the standards we create, the partnerships we build, and the conversations we foster, we affirm the value of Diversity, Equity, Inclusion, and Belonging in building a world where everyone can be confident of quality in health and healthcare.

USP is proud to be an equal employment opportunity employer (EEOE) and affirmative action employer. We are committed to creating an inclusive environment in all aspects of our work – an environment where every employee feels fully empowered and valued irrespective of, but not limited to, race, ethnicity, physical and mental abilities, education, religion, gender identity and expression, life experience, sexual orientation, country of origin, regional differences, work experience, and family status. We are committed to working with and providing reasonable accommodation to individuals with disabilities.

Brief Job Overview
The Digital & Innovation group at USP is seeking Full Stack Developers with programming skills in cloud technologies to build innovative digital products. We are seeking someone who understands the power of digitization and can help drive an amazing digital experience for our customers.

How will YOU create impact here at USP?
In this role at USP, you contribute to USP's public health mission of increasing equitable access to high-quality, safe medicine and improving global health through public standards and related programs. In addition, as part of our commitment to our employees, Global People and Culture, in partnership with the Equity Office, regularly invests in the professional development of all people managers. This includes training in inclusive management styles and other competencies necessary to ensure engaged and productive work environments.

The Sr. Software Engineer/Software Engineer has the following responsibilities:
- Build scalable applications/platforms using cutting-edge cloud technologies.
- Constantly review and upgrade the systems based on governance principles and security policies.
- Participate in code reviews, architecture discussions, and agile development processes to ensure high-quality, maintainable, and scalable code.
- Document and communicate technical designs, processes, and solutions to both technical and non-technical stakeholders.

Who is USP Looking For?
The successful candidate will have a demonstrated understanding of our mission, commitment to excellence through inclusive and equitable behaviors and practices, and the ability to quickly build credibility with stakeholders, along with the following competencies and experience:

Education
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field

Experience
- Sr. Software Engineer: 5-7 years of experience in software development, with a focus on cloud computing
- Software Engineer: 2-4 years of experience in software development, with a focus on cloud computing
- Strong knowledge of cloud platforms (e.g., AWS, Azure, Google Cloud) and services, including compute, storage, networking, and security
- Extensive knowledge of Java Spring Boot applications and design principles
- Strong programming skills in languages such as Python
- Good experience with AWS/Azure services, such as EC2, S3, IAM, Lambda, RDS, DynamoDB, API Gateway, and CloudFormation
- Knowledge of cloud architecture patterns, best practices, and security principles
- Familiarity with data pipeline/ETL/orchestration tools, such as Apache NiFi, AWS Glue, or Apache Airflow
- Good experience with front-end technologies like React.js/Node.js
- Strong experience in microservices and automated testing practices
- Experience leading initiatives related to continuous improvement or implementation of new technologies
- Works independently on most deliverables
- Strong analytical and problem-solving skills, with the ability to develop creative solutions to complex problems
- Ability to manage multiple projects and priorities in a fast-paced, dynamic environment

Additional Desired Preferences
- Experience with scientific chemistry nomenclature, or prior work experience in life sciences, chemistry, or hard sciences, or a degree in sciences
- Experience with pharmaceutical datasets and nomenclature
- Experience with containerization technologies, such as Docker and Kubernetes, is a plus
- Experience working with knowledge graphs
- Ability to explain complex technical issues to a non-technical audience
- Self-directed and able to handle multiple concurrent projects and prioritize tasks independently
- Able to make tough decisions when trade-offs are required to deliver results
- Strong communication skills required: verbal, written, and interpersonal

Supervisory Responsibilities: No

Benefits
USP provides benefits to protect yourself and your family today and tomorrow. From company-paid time off and comprehensive healthcare options to retirement savings, you can have peace of mind that your personal and financial well-being is protected.
Posted 6 days ago
10.0 years
5 - 10 Lacs
Hyderābād
Remote
Join Amgen's Mission to Serve Patients
If you feel like you're part of something bigger, it's because you are. At Amgen, our shared mission – to serve patients – drives all that we do. It is key to our becoming one of the world's leading biotechnology companies. We are global collaborators who achieve together – researching, manufacturing, and delivering ever-better products that reach over 10 million patients worldwide. It's time for a career you can be proud of.

Specialist IS Software Engineer

Live

What you will do
Let's do this. Let's change the world. In this vital role, we are looking for a creative and technically skilled Specialist IS Software Engineer (Data Management Lead). This role will be responsible for leading data management initiatives, collaborating across business, IT, and data governance teams. The ideal candidate will have extensive experience in configuring and implementing Collibra products, an established track record of building high-quality data governance and data quality solutions, and strong hands-on design and engineering skills. The candidate must also possess strong analytical and communication skills.

As a Collibra Lead Developer, you will play a key role in the design, implementation, and management of our Collibra Data Governance and Data Quality platform. You will work closely with stakeholders across the organization to ensure the successful deployment of data governance processes, solutions, and best practices.

- Building and integrating information systems to meet the company's needs.
- Design and implement data governance frameworks, policies, and procedures within Collibra.
- Configure, implement, and maintain Collibra Data Quality Center to support enterprise-wide data quality initiatives.
- Lead the implementation and configuration of the Collibra Data Governance platform.
- Develop, customize, and maintain Collibra workflows, dashboards, and business rules.
- Collaborate with data stewards, data owners, and business analysts to understand data governance requirements and translate them into technical solutions.
- Provide technical expertise and support to business users and IT teams on Collibra Data Quality functionalities.
- Collaborate with data engineers and architects to implement data quality solutions within data pipelines and data warehouses.
- Participate in data quality improvement projects, identifying root causes of data issues and implementing corrective actions.
- Integrate Collibra with other enterprise data management systems (e.g., data catalogs, BI tools, data lakes).
- Provide technical leadership and mentoring to junior developers and team members.
- Troubleshoot and resolve issues with the Collibra environment and data governance processes.
- Assist with training and enablement of business users on Collibra platform features and functionalities.
- Stay up to date with new releases, features, and best practices in Collibra and data governance.

Basic Qualifications:
- Master's degree in computer science & engineering preferred, with 10+ years of software development experience; OR Bachelor's degree in computer science & engineering preferred, with 10+ years of software development experience
- Proven experience (7+ years) in data governance or data management roles
- Strong experience with the Collibra Data Governance platform, including design, configuration, and development
- Hands-on experience with Collibra workflows, the rules engine, and data stewardship processes
- Experience with integrations between Collibra and other data management tools
- Proficiency in SQL and scripting languages (e.g., Python, JavaScript)
- Strong problem-solving and troubleshooting skills
- Excellent communication and collaboration skills to work with both technical and non-technical stakeholders
- Self-starter with strong communication and collaboration skills to work effectively with cross-functional teams
- Excellent problem-solving skills and attention to detail
- Domain knowledge of the life sciences industry
- Recent experience working in a Scaled Agile environment with Agile tools, e.g. Jira, Confluence, etc.

Preferred Qualifications:
- Deep expertise in the Collibra platform, including Data Governance and Data Quality.
- In-depth knowledge of data governance principles, data stewardship processes, data quality concepts, and data profiling and validation methodologies, techniques, and best practices.
- Hands-on experience in implementing and configuring Collibra Data Governance and Collibra Data Quality, including developing metadata ingestion, data quality rules, scorecards, and workflows.
- Strong experience in configuring and connecting to various data sources for metadata, data lineage, data profiling and data quality.
- Experience integrating data management capabilities (MDM, reference data).
- Good experience with Azure cloud services, Azure data technologies and Databricks.
- Solid understanding of relational database concepts and ETL processes.
- Proficient use of tools, techniques, and manipulation, including programming languages (Python, PySpark, SQL, etc.), for data profiling and validation.
- Data modeling with tools like Erwin, and knowledge of insurance industry standards (e.g., ACORD) and insurance data (policy, claims, underwriting, etc.).
- Familiarity with data visualization tools like Power BI.

Good to Have Skills:
- Willingness to work on AI applications
- Experience with popular large language models

Soft Skills:
- Excellent analytical and troubleshooting skills.
- Strong verbal and written communication skills.
- Ability to work effectively with global, remote teams.
- High degree of initiative and self-motivation.
- Ability to manage multiple priorities successfully.
- Team-oriented, with a focus on achieving team goals.
- Strong presentation and public speaking skills.
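Collibra defines data quality rules through its own platform; underneath, such rules reduce to profiling checks like completeness and validity scores against a threshold. A pure-Python sketch of that idea (the column values and the 0.95 threshold are illustrative, and this is not Collibra's API):

```python
# Hypothetical profiling check of the kind a data quality rule encodes:
# completeness and validity scores for one column, flagged against a threshold.
def profile_column(values, is_valid, threshold=0.95):
    non_null = [v for v in values if v is not None]
    completeness = len(non_null) / len(values)          # share of non-null values
    validity = sum(1 for v in non_null if is_valid(v)) / len(non_null)
    return {
        "completeness": completeness,
        "validity": validity,
        "passed": completeness >= threshold and validity >= threshold,
    }

# Example: policy IDs must be non-null and start with "P" (illustrative rule).
policy_ids = ["P001", "P002", None, "P003", "BAD"]
report = profile_column(policy_ids, is_valid=lambda v: v.startswith("P"))
print(report["passed"])  # False: completeness 0.8 and validity 0.75 miss the threshold
```

A DQ platform adds scheduling, lineage, and scorecards on top of checks of exactly this shape.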
Thrive

What you can expect of us
As we work to develop treatments that take care of others, we also work to care for our teammates' professional and personal growth and well-being. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards.

Apply now for a career that defies imagination
In our quest to serve patients above all else, Amgen is the first to imagine, and the last to doubt. Join us. careers.amgen.com

Amgen is an Equal Opportunity employer and will consider you without regard to your race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or disability status. We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request an accommodation.
Posted 6 days ago
14.0 years
0 Lacs
Hyderābād
On-site
We are seeking a highly experienced Salesforce Architect with over 14 years of expertise in designing and implementing scalable Salesforce solutions. The ideal candidate will possess a deep understanding of Salesforce platform capabilities, architecture best practices, and enterprise application integration. You will play a pivotal role in defining architectural roadmaps, ensuring optimal performance, and leading technical teams to deliver business-critical solutions.

Key Responsibilities:

1. Architecture and Design
- Define and design scalable Salesforce architecture, ensuring alignment with business goals and IT strategy.
- Lead the technical design process and ensure compliance with architectural standards, security policies, and governance frameworks.
- Evaluate and select appropriate Salesforce tools, technologies, and APIs to build robust solutions.
- Develop and maintain architectural blueprints and technical documentation.

2. Solution Implementation and Integration
- Lead end-to-end Salesforce implementation projects, including configuration, custom development, and integration with enterprise applications.
- Define and implement data models, security models, and sharing rules across Salesforce platforms.
- Design and oversee the integration of Salesforce with other enterprise systems such as ERP, marketing automation, and custom applications using APIs, middleware, and integration tools.

3. Technical Leadership and Governance
- Provide technical leadership to development teams, ensuring adherence to Salesforce best practices and coding standards.
- Collaborate with stakeholders, business analysts, and product owners to translate business requirements into scalable technical solutions.
- Conduct code reviews, troubleshoot performance issues, and provide guidance on optimizing Salesforce implementations.

4. Security and Compliance
- Ensure Salesforce solutions comply with security standards, data privacy regulations, and industry best practices.
- Implement role-based security, object-level permissions, and data encryption to protect sensitive information.

5. Continuous Improvement and Innovation
- Stay up to date with Salesforce product releases, new features, and industry trends.
- Drive the adoption of new tools and technologies to enhance Salesforce platform efficiency and performance.

Required Skills and Experience:

Technical Skills:
- Strong expertise in Salesforce Service Cloud, Experience Cloud, Sales Cloud, and Marketing Cloud.
- Proficiency in Apex, Agentforce, Visualforce, Aura, Lightning Web Components (LWC), SOQL, and SOSL.
- Experience with Salesforce API integrations (REST/SOAP), middleware, and ETL tools.
- Hands-on experience with CI/CD pipelines, version control (Git), and deployment tools (Copado).
- Knowledge of data migration strategies and tools such as Data Loader, MuleSoft, and Informatica.

Architectural Expertise:
- Strong understanding of Salesforce architecture patterns, multi-org strategy, and governance models.
- Expertise in designing multi-cloud solutions and integrating Salesforce with enterprise systems.
- Experience with Salesforce DevOps, release management, and sandbox management.

Certifications:
- Salesforce Certified Technical Architect (CTA) (preferred, or willingness to pursue)
- Salesforce Certified Application Architect
- Salesforce Certified System Architect
- Other relevant certifications (e.g., Platform Developer II, Integration Architecture Designer)

Soft Skills:
- Strong leadership and mentorship skills to guide development teams.
- Excellent communication and collaboration skills to engage with business and technical stakeholders.
- Ability to manage multiple projects and prioritize tasks effectively.
- Analytical mindset with problem-solving capabilities.

Education: Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.

Preferred Qualifications:
- Experience working in Agile environments with a strong understanding of Agile delivery frameworks (Scrum/SAFe).
- Hands-on experience with Salesforce Einstein Analytics, CPQ, and Field Service Lightning is a plus.

Work Environment:
- Opportunity to work on cutting-edge Salesforce implementations and enterprise-level solutions.
- Work from office per policy guidelines, working with teams across EU, APAC and US time zones.

Education: Bachelor's degree in computer science, engineering, information systems and/or equivalent formal training or work experience. Relevant Master's degree, TOGAF certification and SAFe Agile certification strongly preferred.

Experience: Eight (8) years equivalent work experience in an information technology or engineering environment, with direct responsibility for strategy formulation and solution/technical architecture, as well as designing, architecting, developing, implementing and monitoring efficient and effective solutions to diverse and complex business problems.

Knowledge, Skills and Abilities:
- Fluency in English
- Accuracy & Attention to Detail
- Influencing & Persuasion
- Planning & Organizing
- Problem Solving
- Project Management

FedEx was built on a philosophy that puts people first, one we take seriously. We are an equal opportunity/affirmative action employer and we are committed to a diverse, equitable, and inclusive workforce in which we enforce fair treatment, and provide growth opportunities for everyone. All qualified applicants will receive consideration for employment regardless of age, race, color, national origin, genetics, religion, gender, marital status, pregnancy (including childbirth or a related medical condition), physical or mental disability, or any other characteristic protected by applicable laws, regulations, and ordinances.

Our Company
FedEx is one of the world's largest express transportation companies and has consistently been selected as one of the top 10 World's Most Admired Companies by "Fortune" magazine.
Every day FedEx delivers for its customers with transportation and business solutions, serving more than 220 countries and territories around the globe. We can serve this global network due to our outstanding team of FedEx team members, who are tasked with making every FedEx experience outstanding.

Our Philosophy
The People-Service-Profit philosophy (P-S-P) describes the principles that govern every FedEx decision, policy, or activity. FedEx takes care of our people; they, in turn, deliver the impeccable service demanded by our customers, who reward us with the profitability necessary to secure our future. The essential element in making the People-Service-Profit philosophy such a positive force for the company is where we close the circle, returning these profits back into the business and investing back in our people. Our success in the industry is attributed to our people. Through our P-S-P philosophy, we have a work environment that encourages team members to be innovative in delivering the highest possible quality of service to our customers. We care for their well-being and value their contributions to the company.

Our Culture
Our culture is important for many reasons, and we intentionally bring it to life through our behaviors, actions, and activities in every part of the world. The FedEx culture and values have been a cornerstone of our success and growth since we began in the early 1970s. While other companies can copy our systems, infrastructure, and processes, our culture makes us unique and is often a differentiating factor as we compete and grow in today's global marketplace.
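As an illustration of the Salesforce REST integration experience listed above: Salesforce exposes SOQL queries over REST at `/services/data/<version>/query`. A small sketch of building such a request URL (the instance URL and API version below are placeholders; real values come from the OAuth login response):

```python
from urllib.parse import urlencode

def soql_query_url(instance_url: str, soql: str, api_version: str = "v58.0") -> str:
    # Salesforce serves SOQL via GET /services/data/<version>/query?q=<URL-encoded SOQL>
    return f"{instance_url}/services/data/{api_version}/query?{urlencode({'q': soql})}"

url = soql_query_url("https://example.my.salesforce.com",
                     "SELECT Id, Name FROM Account WHERE Industry = 'Banking'")
print(url)
```

An actual integration would attach an `Authorization: Bearer <access token>` header and page through the `nextRecordsUrl` field of the JSON response.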
Posted 6 days ago
4.0 years
0 Lacs
Hyderābād
On-site
About Us:
Location - Hyderabad, India
Department - Product R&D
Level - Support
Working Pattern - Work from office
Benefits - Benefits at Ideagen
Salary - this will be discussed at the next stage of the process; if you do have any questions, please feel free to reach out!

As a Level 2 Software Engineer, you will build high-quality, innovative, and fully performing integration solutions that comply with coding standards and technical design. You will contribute to the design and implementation of several integrations across different Ideagen modules. You will take ownership of the entire B2B integration lifecycle, from preliminary planning through requirements gathering, design, development, documentation, testing, deployment, and ongoing maintenance of integration solutions. Finally, you will contribute positively within the Agile development team and demonstrate an enthusiastic 'can-do' attitude.

Responsibilities:
- Design, develop, and manage integration solutions, building robust data integration pipelines using BizTalk and C#/.NET technologies.
- Develop and support B2B solutions using BizTalk Server 2020, with excellent knowledge of BizTalk artifacts such as schemas, maps, pipelines, orchestrations, and adapters.
- Utilize XML and XSLT extensively for data transformation and enrichment between systems.
- Deploy integration solutions to the BizTalk Server console.
- Configure and manage secure file transfers (SFTP, FTPS) for data exchange.
- Implement and support data exchange using HL7 and other industry standards.
- Develop and optimize T-SQL queries, stored procedures, and scripts to support data processing and transformation.
- Design and maintain end-to-end solutions that connect with several vendors and partners, establishing secure, scalable integration frameworks using BizTalk adapters and .NET code.
- Create robust ETL workflows using tools like SSIS, Azure Data Factory, or other ETL platforms.
- Design and consume RESTful and SOAP APIs for real-time and batch data integration.
- Develop custom connectors and middleware when needed.
- Troubleshoot integration solutions and ensure timely delivery.
- Monitor and support integration flows, ensuring error handling, logging, and alerts are in place.
- Develop and maintain technical design documents for all integration processes and solutions.
- Collaborate with multiple cross-functional teams, including Dev, QA, Infra and business teams, to understand customers' technical requirements and deliver robust solutions.
- Work within an Agile development team, using e.g. the Scrum framework.
- Provide unit tests to support and validate any development work undertaken.
- Perform tasks with limited supervision, exercising substantial independent judgment within scope.

Skills and Experience:
- A minimum of 4 years of hands-on experience in a data integration role is highly preferred.
- Primary skills: BizTalk Server 2020, C#, .NET, SQL Server 2017, exposure to REST APIs.
- Secondary skills: SSIS (good to have).
- A proven ability to deliver end-to-end integration solutions with BizTalk Server 2020, T-SQL and REST APIs.
- Demonstrated proficiency in writing T-SQL queries, stored procedures, and scripts for efficient data processing and transformation.
- Experience deploying solutions to the BizTalk Server console.
- Experience using XML and XSLT to develop robust solutions.
- Hands-on experience with BizTalk artifacts such as schemas, maps, pipelines, orchestrations, and adapters for developing real-time B2B solutions.
- Strong understanding of C#/.NET and experience with its usage in integration solutions.
- Solid understanding of RESTful APIs and experience with their integration.
- Experience using source control, preferably Bitbucket and Git.
- Exceptional communication and presentation skills in English, both verbal and written, are essential for this role.
- Ability to write unit test cases and perform unit testing.
- Ability to create technical design documents with optimal design.
- Understanding of Agile software development methodologies/frameworks such as Scrum.

Desirable:
- Database development experience, preferably SQL Server and MongoDB.
- Exposure to ETL platforms like SSIS and Azure Data Factory.
- Exposure to AWS/Azure and Postman.

About Ideagen
Ideagen is the invisible force behind many things we rely on every day - from keeping airplanes soaring in the sky, to ensuring the food on our tables is safe, to helping doctors and nurses care for the sick. So, when you think of Ideagen, think of it as the silent teammate that's always working behind the scenes to help those people who make our lives safer and better. Every day, millions of people are kept safe using Ideagen software. We have offices all over the world, including America, Australia, Malaysia and India, with people doing lots of different and exciting jobs. We're building a future-ready team, and AI is part of how we work smarter. If you're curious, adaptable and open to using AI to improve how you work, you'll thrive at Ideagen!

What is next?
If your application meets the requirements for this role, our Talent Acquisition team will be in touch to guide you through the next steps. To ensure a flexible and inclusive process, please let us know if you require any reasonable adjustments by contacting us at recruitment@ideagen.com. All matters will be treated with strict confidence. At Ideagen, we value the importance of work-life balance and welcome candidates seeking flexible or part-time working arrangements. If this is something you are interested in, please let us know during the application process. Enhance your career and make the world a safer place! #LI-FullTime
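Monitoring integration flows with error handling, logging and alerts, as described above, typically rests on a retry-with-backoff pattern. BizTalk adapters provide retry behaviour natively; a language-neutral sketch of the same idea in Python (the flaky partner endpoint is simulated):

```python
import logging, time

logging.basicConfig(level=logging.WARNING)
log = logging.getLogger("integration")

def call_with_retry(fn, attempts=3, base_delay=0.01):
    # Retry transient failures with exponential backoff, log each failure,
    # and re-raise after the final attempt so an alert can fire upstream.
    for attempt in range(1, attempts + 1):
        try:
            return fn()
        except ConnectionError as exc:
            log.warning("attempt %d/%d failed: %s", attempt, attempts, exc)
            if attempt == attempts:
                raise
            time.sleep(base_delay * 2 ** (attempt - 1))

# Simulated partner endpoint that fails twice, then succeeds.
calls = {"n": 0}
def flaky_endpoint():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("partner endpoint unavailable")
    return {"status": "delivered"}

print(call_with_retry(flaky_endpoint)["status"])  # delivered (on the third attempt)
```

The key design choice is to distinguish transient faults (retry) from permanent ones (fail fast and alert), which is exactly the split a BizTalk orchestration's exception handling encodes.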
Posted 6 days ago
3.0 years
6 - 8 Lacs
Hyderābād
On-site
- 3+ years of experience analyzing and interpreting data with Redshift, Oracle, NoSQL, etc.
- Experience with data visualization using Tableau, QuickSight, or similar tools
- Experience with data modeling, warehousing and building ETL pipelines
- Experience with statistical analysis packages such as R, SAS and MATLAB
- Experience using SQL to pull data from a database or data warehouse, and scripting experience (Python) to process data for modeling

The ShipTech BI team is looking for a smart and ambitious individual to support developing the operational reporting structure in Amazon Logistics. The potential candidate will support the analysis, improvement and creation of metrics and dashboards for Transportation by Amazon. In addition, they will work with internal customers at all levels of the organization – Operations, Customer Service, HR, Technology, Operational Research. The potential candidate will enjoy the challenges and rewards of working in a fast-growing organization. This is a high-visibility position.

As an Amazon Data Business Intelligence Engineer you will be working in one of the world's largest and most complex data warehouse environments. You should have deep expertise in the design, creation, management, and business use of extremely large datasets. You should have excellent business and communication skills, and be able to work with business owners to develop and define key business questions and to build data sets that answer those questions. You should be expert at designing, implementing, and operating stable, scalable, low-cost solutions to flow data from production systems into the data warehouse and into end-user-facing applications. You should be able to work with business customers to understand their business requirements and implement reporting solutions. Above all, you should be passionate about bringing large datasets together to answer business questions and drive change.
Key Responsibilities:
- Design automated solutions for recurrent reporting (daily/weekly/monthly).
- Design automated processes for in-depth analysis databases.
- Design automated data control processes.
- Collaborate with the software development team to build the designed solutions.
- Learn, publish, analyze and improve management information dashboards, operational business metrics decks and key performance indicators.
- Improve tools and processes, scale existing solutions, and create new solutions as required based on stakeholder needs.
- Provide in-depth analysis to management with the support of the accounting, finance, transportation and supply chain teams.
- Participate in annual budgeting and forecasting efforts.
- Perform monthly variance analysis and identify risks & opportunities.
Additional qualifications:
- Experience with AWS solutions such as EC2, DynamoDB, S3, and Redshift
- Experience in data mining, ETL, etc., and using databases in a business environment with large-scale, complex datasets
Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.
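The SQL-pull-plus-Python-scripting pattern this posting asks for could be sketched as follows. This is a minimal, illustrative example only: `sqlite3` stands in for a Redshift/Oracle warehouse, and the `shipments` table and its columns are hypothetical, not a real Amazon schema.

```python
import sqlite3

# Hypothetical stand-in for a warehouse: table and column names are
# illustrative only; a real pipeline would connect to Redshift/Oracle.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE shipments (ship_id INTEGER, region TEXT, transit_days REAL);
    INSERT INTO shipments VALUES
        (1, 'APAC', 4.0), (2, 'APAC', 6.0), (3, 'EMEA', 3.0), (4, 'EMEA', 5.0);
""")

# Pull aggregated data with SQL...
rows = conn.execute("""
    SELECT region, AVG(transit_days) AS avg_transit, COUNT(*) AS n
    FROM shipments
    GROUP BY region
    ORDER BY region
""").fetchall()

# ...then post-process in Python into a shape a dashboard or model could consume.
metrics = {region: {"avg_transit": avg, "n": n} for region, avg, n in rows}
print(metrics)
```

The division of labor shown here (aggregation pushed into SQL, final shaping done in Python) is the usual approach when the warehouse holds far more rows than the script should ever pull down.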
Posted 6 days ago
2.0 - 3.0 years
0 Lacs
Telangana
On-site
Role: ML Engineer (Associate / Senior)
Experience: 2-3 Years (Associate); 4-5 Years (Senior)
Mandatory Skills: Python / MLOps / Docker and Kubernetes / FastAPI or Flask / CI/CD / Jenkins / Spark / SQL / RDB / Cosmos / Kafka / ADLS / API / Databricks
Other Skills: Azure / LLMOps / ADF / ETL
Location: Bangalore
Notice Period: less than 60 days
Job Description:
We are seeking a talented and passionate Machine Learning Engineer to join our team and play a pivotal role in developing and deploying cutting-edge machine learning solutions. You will work closely with other engineers and data scientists to bring machine learning models from proof of concept to production, ensuring they deliver real-world impact and solve critical business challenges.
Collaborate with data scientists, model developers, software engineers, and other stakeholders to translate business needs into technical solutions.
Experience deploying ML models to production.
Create high-performance real-time inferencing APIs and batch inferencing pipelines to serve ML models to stakeholders.
Integrate machine learning models seamlessly into existing production systems.
Continuously monitor and evaluate model performance, and retrain models automatically or periodically.
Streamline existing ML pipelines to increase throughput.
Identify and address security vulnerabilities in existing applications proactively.
Design, develop, and implement machine learning models, preferably for insurance-related applications.
Well versed in the Azure ecosystem.
Knowledge of NLP and generative AI techniques; relevant experience will be a plus.
Knowledge of machine learning algorithms and libraries (e.g., TensorFlow, PyTorch) will be a plus.
Stay up to date on the latest advancements in machine learning and contribute to ongoing innovation within the team.
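The batch-inferencing pipeline mentioned above can be sketched in a few lines. This is a toy illustration, not a production design: the "model" is any callable that scores a batch of feature rows, and in a real deployment it would be a trained model reading records from Kafka or ADLS rather than an in-memory list.

```python
from typing import Callable, Iterable, List

def batch_inference(model: Callable[[List[list]], List[float]],
                    records: Iterable[list],
                    batch_size: int = 2) -> List[float]:
    """Score records in fixed-size batches to bound memory and amortize
    per-call model overhead. Names here are illustrative only."""
    scores, batch = [], []
    for row in records:
        batch.append(row)
        if len(batch) == batch_size:   # flush a full batch
            scores.extend(model(batch))
            batch = []
    if batch:                          # flush the remainder
        scores.extend(model(batch))
    return scores

# Toy "model": scores each row by summing its features.
def toy_model(batch):
    return [sum(row) for row in batch]

print(batch_inference(toy_model, [[1, 2], [3, 4], [5, 6]]))  # [3, 7, 11]
```

The same batching skeleton underlies real-time serving too; a FastAPI or Flask endpoint would simply call the model on a batch of size one per request.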
Posted 6 days ago
10.0 years
3 - 5 Lacs
Hyderābād
On-site
If you are a current employee who is interested in applying to this position, please navigate to the internal Careers site to apply. Disclaimer: MarketStar is committed to ensuring integrity and transparency in our recruitment practices. We DO NOT charge any fees at any stage of the recruitment process. In case you receive any unsolicited requests for payments, please report to
Posted 6 days ago
4.0 years
4 - 6 Lacs
Hyderābād
On-site
Overview: We have an exciting role heading our creative studio for one of Omnicom’s largest advertising agencies. This leadership role will require you to lead and drive world-class advertising, creative and studio deliverables, working with global brands and agency leaders. This role is responsible overall for production, practice and people management.
About Omnicom Global Solutions
Omnicom Global Solutions (OGS) is an agile innovation hub of Omnicom Group, a leading global marketing and corporate communications company. Guided by the principles of Knowledge, Innovation, and Transformation, OGS is designed to deliver scalable, customized, and contextualized solutions that meet the evolving needs of our Practice Areas within Omnicom. OGS India plays a key role for our group companies and global agencies by providing stellar products, solutions, and services in the areas of Creative Services, Technology, Marketing Science (Data & Analytics), Advanced Analytics, Market Research, Business Support Services, Media Services, and Project Management. We currently have 4000+ awesome colleagues in OGS India who are committed to solving our clients’ pressing business issues. We are growing rapidly and looking for talented professionals like you to be part of this journey. Let us build this, together!
Responsibilities:
About our Agency: Omnicom Health Shared Services
Omnicom Health Group is the world’s largest and most diverse global healthcare network, pioneering solutions that shape a healthier future for all. At OHG, you’re not just part of a network—you’re part of a movement. Our ambition is to be the case study others aspire to, challenging the status quo and redefining what’s possible. With flagship locations globally, we deliver local expertise and groundbreaking healthcare solutions across consulting, strategy, creative, media, and more. Our 29 specialized companies work seamlessly to drive innovation with precision and impact.
Know more at: https://omnicomhealthgroup.com/
The OGS-OH partnership empowers some of the world’s iconic brands with Knowledge, Innovation, and Transformation. When you join, you become part of a dynamic team that delivers high-impact solutions in the healthcare marketing and communications space. Here’s what makes us unique:
We are a growing community that blends creativity, technology, and data-driven insights to transform healthcare.
Bringing you the best of both worlds – our team partners with key OH strategists while staying rooted in OGS’ culture and values.
Access to top healthcare and biopharmaceutical brands.
Helping you own your career – unlock diverse learning and upskilling opportunities, along with personalized talent development programs.
Empowering you with an inclusive, rewarding, and engaging work environment centred around your well-being.
Qualifications: JD Shared by Agency
Reporting & Insights – Specialist (Subject Matter Expert)
Function: Market Science
Level: SME
Experience Required: 4–6 years of experience in marketing analytics, reporting architecture, data pipeline optimization, or performance intelligence strategy
1. Role Summary
As a Specialist (SME) in the Reporting & Insights team within Market Science, you will serve as a domain expert in building robust reporting frameworks, optimizing data flows, and enabling scalable reporting systems across clients and platforms. You will lead reporting innovations, consult on best practices, and ensure governance across measurement and dashboarding processes. Your expertise will directly influence the development of strategic performance reporting for Omnicom Health clients, ensuring insights are timely, trusted, and actionable.
2. Key Responsibilities
Architect reporting ecosystems using BI tools and advanced analytics workflows.
Standardize KPIs, data definitions, and visualization best practices across clients.
Collaborate with data engineering teams to enhance data warehousing/reporting infrastructure.
Drive adoption of reporting automation, modular dashboards, and scalable templates.
Ensure compliance with data governance, privacy, and client reporting SLAs.
Act as the go-to expert for dashboarding tools, marketing KPIs, and campaign analytics.
Conduct training and peer reviews to improve reporting maturity across teams.
3. Skills & Competencies
Skill / Competency | Proficiency Level | Must-Have / Good-to-Have | Criticality Index
BI Tools Mastery (Power BI, Tableau) | Advanced | Must-Have | High
Data Architecture & ETL | Intermediate | Must-Have | High
Cross-Platform Reporting Logic | Advanced | Must-Have | High
Stakeholder Consulting | Advanced | Must-Have | High
Data Governance & QA | Intermediate | Must-Have | High
Leadership & Influence | Intermediate | Must-Have | Medium
Training & Enablement | Intermediate | Good-to-Have | Medium
4. Day-to-Day Deliverables Will Include
Designing and reviewing dashboards for performance, scalability, and accuracy
Standardizing metrics, filters, and visualizations across platforms and markets
Troubleshooting data discrepancies and establishing QA protocols
Supporting onboarding of new clients or business units into the reporting framework
Publishing playbooks and SOPs on reporting automation and delivery standards
Conducting stakeholder walkthroughs and enablement sessions
5. Key Attributes for Success in This Role
Strategic thinker with a hands-on approach to reporting and automation
High attention to detail and process consistency
Confident in translating business needs into scalable BI solutions
Adaptable to changing client needs, tools, and data environments
Collaborative, yet assertive in driving reporting excellence
6. Essential Tools/Platforms & Certifications
Tools: Power BI, Advanced Excel, Redshift, Alteryx (basics)
Certifications: Power BI/Tableau Professional, Data Engineering/ETL certifications – Preferred
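The "troubleshooting data discrepancies and establishing QA protocols" deliverable above often starts with a simple reconciliation check between a source extract and the dashboard dataset. A minimal sketch, with all KPI names and numbers purely illustrative:

```python
def reconcile(source: dict, dashboard: dict, tolerance: float = 0.01) -> list:
    """Compare KPI values between a source extract and a dashboard dataset;
    flag anything missing or off by more than the relative tolerance."""
    issues = []
    for kpi, src_val in source.items():
        dash_val = dashboard.get(kpi)
        if dash_val is None:
            issues.append(f"{kpi}: missing from dashboard")
        elif src_val and abs(src_val - dash_val) / abs(src_val) > tolerance:
            issues.append(f"{kpi}: source={src_val} dashboard={dash_val}")
    return issues

# Hypothetical KPI snapshots: clicks are ~8.8% apart, so they get flagged.
source = {"impressions": 120000, "clicks": 3400}
dashboard = {"impressions": 120000, "clicks": 3100}
print(reconcile(source, dashboard))
```

In practice such checks are run on a schedule after each dashboard refresh, and the flagged list feeds the QA protocol rather than being eyeballed ad hoc.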
Posted 6 days ago
12.0 - 15.0 years
2 - 4 Lacs
Hyderābād
Remote
Join Amgen's Mission to Serve Patients
If you feel like you’re part of something bigger, it’s because you are. At Amgen, our shared mission—to serve patients—drives all that we do. It is key to our becoming one of the world’s leading biotechnology companies. We are global collaborators who achieve together—researching, manufacturing, and delivering ever-better products that reach over 10 million patients worldwide. It’s time for a career you can be proud of.
Principal IS Architect
Live
What you will do
Let’s do this. Let’s change the world. In this vital role, we are seeking a visionary and technically exceptional Principal IS Architect to lead the design and development of enterprise-wide intelligent search solutions. This is a senior-level IT professional who designs and oversees the implementation of robust and scalable data and AI solutions, often utilizing the Java programming language and related technologies. The role requires a strong understanding of both data architecture principles and AI/ML concepts, along with expertise in Java development and cloud platforms. You’ll lead by example—mentoring engineers, setting standards, and driving the technical vision for our next-generation search capabilities. This person will also be responsible for defining the roadmap for products. They will work closely with development teams and act as a bridge between product owners and development teams to perform proofs of concept on provided designs and technology, develop reusable components, etc.
This is a senior role in the organization which, along with a team of other architects, will help design the future state of technology at Amgen India.
Design and Strategy: Responsibilities include developing and maintaining foundational architecture for data and AI initiatives, defining the technical roadmap, and translating business requirements into technical specifications.
Data Architecture: This involves designing and implementing data models, database designs, and ETL processes, as well as leading the design of scalable data architectures. The role also includes establishing best practices for data management and ensuring data security and compliance.
AI Architecture and Implementation: Key tasks include architecting and overseeing the implementation of AI/ML frameworks and solutions, potentially with a focus on generative AI models, and defining processes for AI/ML development and MLOps.
Develop end-to-end solution architectures for data-driven and AI-focused applications, ensuring alignment with business objectives and technology strategy.
Lead architecture design efforts across data pipelines, machine learning models, AI applications, and analytics platforms in our Gap Data Platform area.
Collaborate closely with business partners, product managers, data scientists, software engineers, and the broader Global Technology Solutions teams in vetting solution designs and delivering business value.
Provide technical leadership and mentoring in data engineering and AI best practices.
Evaluate and recommend emerging data technologies, AI techniques, and cloud services to enhance business capabilities.
Ensure the scalability, performance, and security of data and AI architectures.
Establish and maintain architectural standards, including patterns and guidelines for data and AI projects.
Create architecture artifacts (concept, system, and data architecture) for data and AI projects/initiatives.
Create and oversee an architecture center of excellence for the data and AI area to coach and mentor resources working in this area.
Set technical direction, best practices, and coding standards for search engineering across the organization.
Review designs, mentor senior and mid-level engineers, and champion architecture decisions aligned with product goals and compliance needs.
Own performance, scalability, observability, and reliability of search services in production.
Resolve technical problems as they arise.
Provide technical guidance and mentorship to junior developers.
Continually research current and emerging technologies and propose changes where needed.
Assess the business impact of particular technical choices.
Provide updates to stakeholders on product development processes, costs, and budgets.
Work closely with Information Technology professionals within the company to ensure hardware is available for projects and working properly.
Work closely with project management teams to successfully monitor the progress of initiatives.
Current understanding of best practices regarding system security measures.
Positive outlook in meeting challenges and working to a high level.
Advanced understanding of business analysis techniques and processes.
Account for possible project challenges and constraints, including risks, time, resources and scope.
Possesses strong rapid prototyping skills and can quickly translate concepts into working code.
Take ownership of complex software projects from conception to deployment.
Manage software delivery scope, risk, and timeline.
Participate in both front-end and back-end development using cloud technology.
Develop innovative solutions using generative AI technologies.
Define and implement robust software architectures on the cloud, AWS preferred.
Conduct code reviews to ensure code quality and alignment to best practices.
Create and maintain documentation on software architecture, design, deployment, disaster recovery, and operations.
Identify and resolve technical challenges effectively.
Stay updated with the latest trends and advancements.
Work closely with the product team, business team, and other key partners.
Basic Qualifications:
Master’s degree in computer science & engineering preferred, with 12-15 years of software development experience; OR Bachelor’s degree in computer science & engineering preferred, with 11-15 years of software development experience.
Minimum of 7 years of professional experience in technology, including at least 3 years in a data architecture and AI solution architect role.
Strong expertise in cloud platforms, preferably Azure and GCP, and associated data and AI services.
Proven experience in architecting and deploying scalable data solutions, including data lakes, warehouses, and streaming platforms.
Working knowledge of tools/technologies like Azure Data Factory, Confluent Kafka, Spark, Databricks, BigQuery and Vertex AI.
Deep understanding of AI/ML frameworks and tools such as TensorFlow, PyTorch, Spark ML, or Azure ML.
Preferred Qualifications:
Programming Languages: Proficiency in multiple languages and platforms (e.g., Python, Java, Databricks, Vertex AI) is crucial.
Experienced with API integration, serverless, and microservices architecture.
Proficiency with programming languages like Python, Java, or Scala.
Proficiency with Azure Data Factory, Confluent Kafka, Spark, Databricks, BigQuery and Vertex AI.
Proficiency with AI/ML frameworks and tools such as TensorFlow, PyTorch, Spark ML, or Azure ML.
Solid understanding of data governance, security, privacy, and compliance standards.
Exceptional communication, presentation, and stakeholder management skills.
Experience working in agile project environments.
Good-to-Have Skills:
Willingness to work on AI applications.
Experience with popular large language models.
Experience with the LangChain or LlamaIndex frameworks for language models.
Experience with prompt engineering and model fine-tuning.
Knowledge of NLP techniques for text analysis and sentiment analysis.
Soft Skills:
Excellent analytical and troubleshooting skills.
Strong verbal and written communication skills.
Ability to work effectively with global, remote teams.
High degree of initiative and self-motivation.
Ability to manage multiple priorities successfully.
Team-oriented, with a focus on achieving team goals.
Strong presentation and public speaking skills.
Thrive
What you can expect of us
As we work to develop treatments that take care of others, we also work to care for our teammates’ professional and personal growth and well-being. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards.
Apply now for a career that defies imagination
In our quest to serve patients above all else, Amgen is the first to imagine, and the last to doubt. Join us. careers.amgen.com
Amgen is an Equal Opportunity employer and will consider you without regard to your race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or disability status. We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request an accommodation.
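The data-architecture responsibilities in this posting (designing ETL processes and data models) reduce, at their smallest, to an extract-transform-load step. A minimal illustrative sketch, in which the CSV content, table schema, and filtering rule are all hypothetical, with `sqlite3` standing in for a warehouse:

```python
import csv
import io
import sqlite3

# Extract: a raw feed, here faked as an in-memory CSV with a missing value.
raw = io.StringIO("record_id,region,dose_mg\n1,EU,50\n2,US,75\n3,EU,\n")

# Load target: an in-memory table standing in for a warehouse.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE doses (record_id INTEGER, region TEXT, dose_mg REAL)")

loaded = 0
for row in csv.DictReader(raw):
    if not row["dose_mg"]:          # Transform: drop incomplete records.
        continue
    conn.execute("INSERT INTO doses VALUES (?, ?, ?)",
                 (int(row["record_id"]), row["region"], float(row["dose_mg"])))
    loaded += 1

avg = conn.execute("SELECT AVG(dose_mg) FROM doses").fetchone()[0]
print(loaded, avg)  # 2 62.5
```

An architect's job is to standardize exactly these decisions (what counts as incomplete, where typing is enforced, how failures are logged) so that every pipeline in the portfolio makes them the same way.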
Posted 6 days ago
8.0 years
3 - 6 Lacs
Hyderābād
Remote
Company Description
It all started in sunny San Diego, California in 2004 when a visionary engineer, Fred Luddy, saw the potential to transform how we work. Fast forward to today — ServiceNow stands as a global market leader, bringing innovative AI-enhanced technology to over 8,100 customers, including 85% of the Fortune 500®. Our intelligent cloud-based platform seamlessly connects people, systems, and processes to empower organizations to find smarter, faster, and better ways to work. But this is just the beginning of our journey. Join us as we pursue our purpose to make the world work better for everyone.
Job Description
What you get to do in this role:
Develop and maintain AI-powered internal tools that automate workflows and boost stakeholders’ productivity, with a specialized focus on sales analytics and strategic planning operations.
Build and deliver ETL pipelines for Power BI/Snowflake datasets optimized for LLM consumption, enabling efficient AI-driven analysis and empowering power users.
Collaborate cross-functionally with Data & Analytics teams and Sales Operations teams to identify high-value AI use cases and rapidly prototype AI-enabled utilities that align with business goals.
Transform enterprise data from Power BI and Snowflake into LLM-optimized formats while ensuring data integrity and reliable performance across AI-driven solutions.
Manage the complete AI agent development lifecycle, from ideation and testing through production deployment and user adoption, while implementing continuous integration and documenting best practices.
Champion organizational AI adoption by ensuring seamless system integration, demonstrating clear business value, and maintaining high standards for performance and user experience.
Qualifications
In order to be successful in this role:
Experience in leveraging or critically thinking about how to integrate AI into work processes, decision-making, or problem-solving.
This may include using AI-powered tools, automating workflows, analyzing AI-driven insights, or exploring AI's potential impact on the function or industry.
8+ years of proven track record supporting sales organizations or sales business processes through analytics, automation, or technical solutions.
Demonstrated history of building and deploying AI agents or automation tools in real-world business settings for sales workflow automation.
Proven experience with semantic modeling in Power BI or Snowflake, plus familiarity with transforming sales data models for LLM integration and sales analytics optimization, including data restructuring for LLM integration.
Strong understanding of data engineering, APIs, and cloud-based architecture, with experience in sales data.
Ability to function both independently and as part of cross-functional teams, including sales teams and business stakeholders, in fast-paced environments.
Hands-on experience with rapid prototyping, iterative testing, and agile methodologies, specifically applied to sales tools and business process improvements.
Additional Information
Work Personas
We approach our distributed world of work with flexibility and trust. Work personas (flexible, remote, or required in office) are categories that are assigned to ServiceNow employees depending on the nature of their work and their assigned work location. Learn more here. To determine eligibility for a work persona, ServiceNow may confirm the distance between your primary residence and the closest ServiceNow office using a third-party service.
Equal Opportunity Employer
ServiceNow is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, creed, religion, sex, sexual orientation, national origin or nationality, ancestry, age, disability, gender identity or expression, marital status, veteran status, or any other category protected by law.
In addition, all qualified applicants with arrest or conviction records will be considered for employment in accordance with legal requirements. Accommodations We strive to create an accessible and inclusive experience for all candidates. If you require a reasonable accommodation to complete any part of the application process, or are unable to use this online application and need an alternative method to apply, please contact globaltalentss@servicenow.com for assistance. Export Control Regulations For positions requiring access to controlled technology subject to export control regulations, including the U.S. Export Administration Regulations (EAR), ServiceNow may be required to obtain export control approval from government authorities for certain individuals. All employment is contingent upon ServiceNow obtaining any export license or other approval that may be required by relevant export control authorities. From Fortune. ©2025 Fortune Media IP Limited. All rights reserved. Used under license.
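The "LLM-optimized formats" responsibility in this posting (reshaping Power BI/Snowflake extracts so an LLM or retrieval layer can consume them) often comes down to flattening warehouse rows into compact, self-describing JSON Lines. A minimal sketch; the field names are hypothetical, not a real Snowflake schema:

```python
import json

# Hypothetical sales-analytics rows as they might come out of a warehouse query.
rows = [
    {"account": "Acme", "quarter": "Q1", "pipeline_usd": 250000, "stage": "Commit"},
    {"account": "Globex", "quarter": "Q1", "pipeline_usd": 90000, "stage": "Upside"},
]

def to_jsonl(records):
    # One self-describing JSON object per line: small, parseable chunks that
    # can be embedded, retrieved, or pasted into a prompt individually.
    return "\n".join(json.dumps(r, sort_keys=True) for r in records)

jsonl = to_jsonl(rows)
print(jsonl.splitlines()[0])
```

Keeping each record on its own line (rather than one large JSON array) lets downstream tooling chunk, embed, and retrieve records independently, which is the usual reason this format is preferred for LLM pipelines.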
Posted 6 days ago
2.0 - 6.0 years
3 - 8 Lacs
Hyderābād
On-site
Join Amgen’s Mission of Serving Patients
At Amgen, if you feel like you’re part of something bigger, it’s because you are. Our shared mission—to serve patients living with serious illnesses—drives all that we do. Since 1980, we’ve helped pioneer the world of biotech in our fight against the world’s toughest diseases. With our focus on four therapeutic areas –Oncology, Inflammation, General Medicine, and Rare Disease– we reach millions of patients each year. As a member of the Amgen team, you’ll help make a lasting impact on the lives of patients as we research, manufacture, and deliver innovative medicines to help people live longer, fuller, happier lives. Our award-winning culture is collaborative, innovative, and science-based. If you have a passion for challenges and the opportunities that lie within them, you’ll thrive as part of the Amgen team. Join us and transform the lives of patients while transforming your career.
Associate IS Engineer - Veeva Vault PromoMats_MedComms
What you will do
Let’s do this. Let’s change the world. In this vital role in the Veeva Vault team, you will be responsible for designing, developing, and maintaining software applications and solutions in Amgen’s Vault PromoMats and Vault MedComms that meet business needs, and for ensuring the availability and performance of critical systems and applications. This role involves working closely with product managers, designers, and other engineers to create high-quality, scalable software solutions, automating operations, monitoring system health, and responding to incidents to minimize downtime.
Roles & Responsibilities:
Possesses strong rapid prototyping skills and can quickly translate concepts into working code.
Lead day-to-day operations and maintenance of Amgen’s Vault PromoMats and Vault MedComms and their hosted applications.
Stay updated with the latest trends, advancements and standard processes for the Veeva Vault Platform ecosystem.
Design, develop, and implement applications and modules, including custom reports, SDKs, interfaces, and enhancements.
Analyze and understand the functional and technical requirements of applications, solutions and systems, and translate them into software architecture and design specifications.
Develop and complete unit tests, integration tests, and other testing strategies to ensure the quality of the software, following the IS change control and GxP validation process while exhibiting expertise in risk-based validation methodology.
Work closely with multi-functional teams, including product management, design, and QA, to deliver high-quality software on time.
Maintain detailed documentation of software designs, code, and development processes.
Work on integrating with other systems and platforms to ensure seamless data flow and functionality.
Stay up to date on Veeva Vault features, new releases and standard methodologies around Veeva Platform governance.
What we expect of you
Basic Qualifications and Experience:
Bachelor’s degree and 2 to 6 years of Information Systems experience or a related field.
Functional Skills:
Must-Have Skills:
Experience with Amgen’s Vault PromoMats and Vault MedComms, including Veeva configuration settings and custom builds.
Strong knowledge of information systems and network technologies.
Experience in building configured and custom solutions on the Veeva Vault Platform.
Experience in managing systems and implementing and validating projects in GxP-regulated environments.
Extensive expertise in the SDLC, including requirements, design, testing, data analysis, and creating and managing change controls.
Proficiency in programming languages such as Python, JavaScript, etc.
Solid understanding of software development methodologies, including Agile and Scrum.
Experience with version control systems such as Git.
Good-to-Have Skills:
Familiarity with relational databases (such as MySQL, SQL Server, PostgreSQL, etc.)
Proficiency in programming languages such as Python, JavaScript or other programming languages.
Outstanding written and verbal communication skills, and the ability to translate technical concepts for non-technical audiences.
Experience with ETL tools (Informatica, Databricks).
Experience with API integrations such as MuleSoft.
Solid understanding of and proficiency in writing SQL queries.
Hands-on experience with reporting tools such as Tableau, Spotfire & Power BI.
Professional Certifications:
Veeva Vault Platform Administrator or equivalent Vault certification (Must-Have)
SAFe for Teams (Preferred)
Soft Skills:
Excellent analytical and troubleshooting skills.
Strong verbal and written communication skills.
Ability to work effectively with global, virtual teams.
Team-oriented, with a focus on achieving team goals.
Strong presentation and public speaking skills.
Work Hours:
This position requires you to work a later shift and may be assigned a second- or third-shift schedule. Candidates must be willing and able to work during evening or night shifts, as required. Potential shifts (subject to change based on business requirements): Second Shift: 2:00 pm – 10:00 pm IST; Third Shift: 10:00 pm – 7:00 am IST.
What you can expect of us
As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we’ll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards. Apply now and make a lasting impact with the Amgen team. careers.amgen.com
As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients.
Together, we compete in the fight against serious disease. Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
Posted 6 days ago
4.0 years
8 - 8 Lacs
Hyderābād
On-site
Hyderabad, India | Technology | In-Office | 11047
Job Description
Job Purpose
The Property Data Engineer is responsible for developing and maintaining data conversion programs that transform raw property assessment data into standardized formats based on specifications provided by Property Data Analysts and Senior Analysts. This role requires not only advanced programming and ETL skills but also a deep understanding of the structure, nuances, and business context of assessment data. Even with clear and well-documented conversion instructions, engineers without prior exposure to this domain often face significant challenges in interpreting and transforming the data accurately. The Data Engineer plays a critical role in ensuring the accuracy, efficiency and scalability of the data processing pipelines that support the Assessor Operations.
Responsibilities
Depending on the specific team and role, the Property Data Engineer may be responsible for some or all of the following tasks:
Develop and maintain data conversion programs using C#, Python, JavaScript, and SQL.
Implement ETL workflows using tools such as Pentaho Kettle, SSIS, and internal applications.
Collaborate with Analysts and Senior Analysts to interpret conversion instructions and translate them into executable code.
Troubleshoot and resolve issues identified during quality control reviews.
Recommend and implement automation strategies to improve data processing efficiency.
Perform quality checks on converted data and ensure alignment with business rules and standards.
Contribute to the development of internal tools and utilities to support data transformation tasks.
Maintain documentation for code, workflows, and processes to support team knowledge sharing.
Programming (Skill Level: Advanced to Expert)
Create and maintain conversion programs in SQL and in Visual Studio using C#, Python or JavaScript.
Use JavaScript within Pentaho Kettle workflows and SSIS for data transformation.
Build and enhance in-house tools to support custom data processing needs.
Ensure code is modular, maintainable, and aligned with internal development standards.
Ensure code quality through peer reviews, testing and adherence to development standards.
ETL Execution (Skill Level: Advanced to Expert)
Execute and troubleshoot ETL processes using tools like Kettle, SSIS, and proprietary tools.
Input parameters, execute jobs, and perform quality checks on output files.
Troubleshoot ETL failures and optimize performance.
Recommend and implement automation strategies to improve data processing efficiency and accuracy.
Data File Manipulation (Skill Level: Advanced to Expert)
Work with a wide variety of file formats (CSV, Excel, TXT, XML, etc.) to prepare data for conversion.
Apply advanced techniques to clean, merge, and structure data.
Develop scripts and tools to automate repetitive data preparation tasks.
Ensure data is optimized for downstream ETL and analytical workflows.
Data Analysis (Skill Level: Supportive – Applied)
Leverage prior experience in data analysis to independently review and interpret source data when developing or refining conversion programs.
Analyze data structures, field patterns, and anomalies to improve the accuracy and efficiency of conversion logic.
Use SQL queries, Excel tools, and internal utilities to validate assumptions and enhance the clarity of analyst-provided instructions.
Collaborate with Analysts and Senior Analysts to clarify ambiguous requirements and suggest improvements based on technical feasibility and data behavior.
Conduct targeted research using public data sources (e.g., assessor websites) to resolve data inconsistencies or fill in missing context during development.
Quality Control (Skill Level: Engineer-Level)
Perform initial quality control on converted data outputs before formal review by Associates, Analysts, or Senior Analysts.
- Validate that program output aligns with conversion instructions and meets formatting and structural expectations.
- Use standard scripts, ad-hoc SQL queries, and internal tools to identify and correct discrepancies in the data.
- Address issues identified during downstream QC reviews by updating conversion logic or collaborating with analysts to refine requirements.
- Ensure that all deliverables meet internal quality standards prior to release or further review.

Knowledge and Experience
Minimum Education: Bachelor's degree in Computer Science, Information Systems, Software Engineering, Data Engineering, or a related technical field; or equivalent practical experience in software development or data engineering.
Preferred Education: Bachelor's degree (as above) plus additional coursework or certifications in:
- Data Engineering
- ETL Development
- Cloud Data Platforms (e.g., AWS, Azure, GCP)
- SQL and Database Management
- Programming (C#, Python, JavaScript)

Requirements:
- 4+ years of experience in software development, data engineering, or ETL pipeline development.
- Expert-level proficiency in SQL and in languages such as C#, Python, and JavaScript (typically in Visual Studio).
- Experience with ETL tools such as Pentaho Kettle, SSIS, or similar platforms.
- Strong understanding of data structures, file formats (CSV, Excel, TXT, XML), and data transformation techniques.
- Familiarity with relational databases and SQL for data querying and validation.
- Ability to read and interpret technical documentation and conversion instructions.
- Strong problem-solving skills and attention to detail.
- Ability to work independently and collaboratively in a fast-paced environment.
- Familiarity with property assessment, GIS, tax, or public property records data.

Preferred Skills
- Experience developing and maintaining data conversion programs in Visual Studio.
- Experience with property assessment, GIS, tax, or public records data.
- Experience building internal tools or utilities to support data transformation workflows.
- Knowledge of version control and issue-tracking tools (e.g., Git, Jira) and agile development practices.
- Exposure to cloud-based data platforms or services (e.g., Azure Data Factory, AWS Glue).
- Ability to troubleshoot and optimize ETL performance and data quality.
- Strong written and verbal communication skills for cross-functional collaboration.
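The data file manipulation duties described above (cleaning, standardizing, and structuring raw extracts before conversion) can be sketched in a few lines of Python. This is a minimal, illustrative example using only the standard library; the column names ("Parcel ID", "Assessed Value") and the zero-padded identifier format are hypothetical, since real conversion programs follow the analyst-provided specification.

```python
import csv
import io

def clean_rows(raw_csv: str) -> list[dict]:
    """Normalize header names, trim whitespace, and standardize key fields.

    The columns and the 10-digit parcel-ID width are assumptions for
    illustration; actual conversion specs come from the Analysts.
    """
    reader = csv.DictReader(io.StringIO(raw_csv))
    cleaned = []
    for row in reader:
        # Snake-case the headers and strip stray whitespace from values.
        rec = {k.strip().lower().replace(" ", "_"): (v or "").strip()
               for k, v in row.items()}
        # Zero-pad the parcel identifier so it joins cleanly against
        # other assessment extracts.
        rec["parcel_id"] = rec["parcel_id"].zfill(10)
        # Coerce the assessed value to an integer, treating blanks as 0.
        rec["assessed_value"] = int(rec["assessed_value"] or 0)
        cleaned.append(rec)
    return cleaned

sample = "Parcel ID, Assessed Value\n42, 125000\n105,\n"
print(clean_rows(sample))
```

In practice, the same normalization step would be wrapped in a reusable utility so it can run identically across CSV, TXT, and Excel-derived inputs.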
Posted 6 days ago
7.0 years
5 - 18 Lacs
India
Remote
Role: Senior Appian Developer (Hybrid)
Position Type: Full-Time Contract (40 hrs/week)
Contract Duration: Long Term
Work Schedule: 8 hours/day (Mon-Fri)
Location: Hyderabad, India - Hybrid (3 days/week on site)

What You'll Do:
- Troubleshoot and resolve technical issues related to Appian applications, ensuring minimal downtime and optimal performance.
- Diagnose and fix problems in Talend workflows, focusing on data extraction, transformation, and loading processes.
- Manage and troubleshoot SQL Server databases, ensuring data integrity, performance, and security.
- Troubleshoot and maintain automations built on Power Automate.
- Handle Autosys job scheduling and automation, ensuring smooth execution of batch jobs and workflows.
- Collaborate with cross-functional teams to gather requirements, design solutions, and implement troubleshooting strategies.
- Document and track issues, resolutions, and best practices to improve the overall troubleshooting process.
- Provide technical support during production releases and maintenance windows, working closely with the Operations team.
- Stay up to date with the latest industry trends and best practices in troubleshooting and technical support.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.

Talents Needed for Success:
- Minimum of 7 years of experience in technical troubleshooting and support.
- Proven experience troubleshooting Appian applications, with a strong understanding of Appian architecture and integration patterns.
- Expertise in Talend, including designing and troubleshooting ETL processes.
- Proficiency in SQL Server, including database design, optimization, and performance tuning.
- Experience with Autosys job scheduling and automation, including setting up and managing jobs.
- Strong analytical and problem-solving skills, with the ability to work independently and as part of a team.
- Excellent communication and interpersonal skills, with the ability to collaborate effectively with stakeholders at all levels.

Additional Skills:
- Experience with version control systems (e.g., Git) and collaboration tools (e.g., Jira, Confluence).
- Knowledge of scripting languages such as Python and shell/batch programming is a plus.
- Understanding of Agile processes and methodologies, with experience working in an Agile framework using Scrum.

Job Types: Full-time, Contractual / Temporary
Contract length: 12 months
Pay: ₹543,352.07 - ₹1,855,655.80 per year

Application Question(s):
- How many years of experience do you have troubleshooting Appian applications?
- How much experience do you have in Talend, including designing and troubleshooting ETL processes?
- How much experience do you have with Autosys job scheduling and automation?
- Are you comfortable working 3 days onsite and 2 days remote each week?
- How soon can you join us?

License/Certification: Appian L2 certification (Required)
Location: Hyderabad Jubilee Ho, Hyderabad, Telangana (Required)
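Troubleshooting work of the kind listed above often starts with scanning application and scheduler logs for failure patterns before digging into a specific job. A minimal sketch of that triage step in Python; the log format, job names, and ERROR/status fields here are hypothetical (real Appian and Autosys logs each have their own formats):

```python
import re
from collections import Counter

# Hypothetical log excerpt; real scheduler/application logs differ.
LOG = """\
2024-05-01 02:10:11 INFO  job=NIGHTLY_ETL status=SUCCESS
2024-05-01 02:14:02 ERROR job=MDM_SYNC status=FAILURE reason=timeout
2024-05-01 02:31:40 ERROR job=MDM_SYNC status=FAILURE reason=timeout
2024-05-01 03:02:09 ERROR job=ACCT_LOAD status=FAILURE reason=deadlock
"""

def failure_summary(log_text: str) -> Counter:
    """Count ERROR lines per job so repeat offenders surface first."""
    pattern = re.compile(r"ERROR\s+job=(\w+)")
    return Counter(pattern.findall(log_text))

# Produces a triage list ordered by failure count.
print(failure_summary(LOG).most_common())  # [('MDM_SYNC', 2), ('ACCT_LOAD', 1)]
```

A summary like this helps decide which workflow to investigate first and feeds the issue-tracking documentation the role calls for.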
Posted 6 days ago
0 years
0 Lacs
Telangana
On-site
We are looking for an experienced and motivated Senior Data Engineer to join our dynamic team. The role primarily focuses on MDM and the associated ETL and real-time feed monitoring and support. This engineer will be part of the global L1/L2 production support team, which is split between Chubb Engineering Centers in India and Mexico. Key responsibilities include monitoring ETL processes, handling automated issues, and ensuring compliance with security policies. A good understanding of Informatica MDM and Azure Data Factory is preferred. The ideal candidate will have experience with PowerCenter, MDM, and Azure Data Factory, and will be able to identify and resolve data quality issues, performance bottlenecks, and other ETL-related problems while proactively monitoring production systems.

Responsibilities:
- Monitor ETL jobs, including PowerCenter/IICS, Kafka-based near-real-time updates, and batch processes.
- Troubleshoot production incidents.
- Understand data mapping and data modeling methodologies, including normal form, star, and snowflake, to reduce data redundancy and improve data integrity.
- Maintain knowledge of current and emerging developments/trends for assigned area(s) of responsibility, assess the impact, and collaborate with the Scrum Team and Leadership to
Posted 6 days ago
5.0 - 8.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Cortex is urgently hiring for the role: "Data Engineer"
Experience: 5 to 8 years
Location: Bangalore, Noida, and Hyderabad (Hybrid; 2 days per week in office required)
Notice period: Immediate to 10 days only
Key skills: Candidates must have experience in Python, Kafka Streams, PySpark, and Azure Databricks

Role Overview
We are looking for a highly skilled Data Engineer with expertise in Kafka, Python, and Azure Databricks (preferred) to drive our healthcare data engineering projects. The ideal candidate will have deep experience in real-time data streaming, cloud-based data platforms, and large-scale data processing. This role requires strong technical leadership, problem-solving abilities, and the ability to collaborate with cross-functional teams.

Key Responsibilities
- Lead the design, development, and implementation of real-time data pipelines using Kafka, Python, and Azure Databricks.
- Architect scalable data streaming and processing solutions to support healthcare data workflows.
- Develop, optimize, and maintain ETL/ELT pipelines for structured and unstructured healthcare data.
- Ensure data integrity, security, and compliance with healthcare regulations (HIPAA, HITRUST, etc.).
- Collaborate with data engineers, analysts, and business stakeholders to understand requirements and translate them into technical solutions.
- Troubleshoot and optimize Kafka streaming applications, Python scripts, and Databricks workflows.
- Mentor junior engineers, conduct code reviews, and ensure best practices in data engineering.
- Stay updated on the latest cloud technologies, big data frameworks, and industry trends.

If you are interested, kindly send us your resume by clicking "Easy Apply".
This job is posted by Aishwarya.K, Business HR - Day Recruitment
Cortex Consultants LLC (US) | Cortex Consulting Pvt Ltd (India) | Tcell (Canada)
US | India | Canada
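A real-time pipeline of the kind described above consumes messages from a Kafka topic, applies a transform, and forwards the result downstream. A minimal sketch with the broker replaced by an in-memory list so it runs standalone; a real consumer would poll a client such as confluent-kafka, and the field names and unit-normalization rule here are hypothetical:

```python
import json

# Stand-ins for messages polled from a Kafka topic; real code would use a
# consumer client (e.g., confluent-kafka) and poll in a loop.
raw_messages = [
    '{"patient_id": "p1", "reading": "98.6", "unit": "F"}',
    '{"patient_id": "p2", "reading": "37.1", "unit": "C"}',
]

def transform(message: str) -> dict:
    """Parse one message and normalize the reading to Celsius."""
    rec = json.loads(message)
    value = float(rec["reading"])
    if rec["unit"] == "F":
        # Fahrenheit to Celsius, rounded to one decimal place.
        value = round((value - 32) * 5 / 9, 1)
    return {"patient_id": rec["patient_id"], "celsius": value}

processed = [transform(m) for m in raw_messages]
print(processed)
```

The same transform function can be reused unchanged inside a Databricks or PySpark job, which is one reason to keep parsing and normalization logic separate from the consumer loop.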
Posted 6 days ago
10.0 years
20 - 30 Lacs
Hyderābād
On-site
Job Title: SAP FICO Consultant (Carve-out)
Experience required: 10+ years
Location: Hyderabad
Work mode: Onsite
Availability: Immediate to 15 days

Job Description:
All candidates must have worked on carve-out projects.
- 10+ years of experience in SAP FICO implementation and support.
- At least 2-3 full-lifecycle carve-out or M&A separation projects in an SAP environment.
- Strong understanding of SAP Financial Accounting and Controlling, including:
  - GL, AP, AR, Asset Accounting
  - Cost Center Accounting, Internal Orders, Product Costing, and Profitability Analysis (COPA)
- Experience with SAP S/4HANA is highly desirable.
- Deep knowledge of legal entity structuring, company code creation, and data partitioning.
- Experience with cross-module integration (SD, MM, PP).
- Strong data migration, cleansing, and mapping skills.
- Excellent communication and stakeholder management skills.
- Understanding of compliance (IFRS/GAAP), SOX controls, and audit readiness during separation.

Responsibilities:
- Lead or support the SAP FICO stream in carve-out or divestiture projects, ensuring smooth financial separation and reporting.
- Perform financial impact analysis, legal entity setup, and company code restructuring.
- Design and configure SAP FICO modules (GL, AR, AP, AA, CO, PCA, CCA, COPA) for the new entity or separated business unit.
- Manage data separation, including historical and open financial transactions, master data, and cost objects.
- Work with SAP migration tools (LSMW, BODS, or third-party ETL tools) to extract and transform financial data for the new entity.
- Coordinate closely with the Basis, Security, and SD/MM/PP teams, and with external stakeholders, to ensure a complete functional carve-out.
- Support cutover planning, testing (SIT/UAT), and hypercare phases.
- Provide advisory support on taxation, intercompany transactions, and financial consolidation implications.
- Document business process designs, configurations, and user guides.
Job Types: Full-time, Permanent
Pay: ₹2,000,000.00 - ₹3,000,000.00 per year
Schedule: Day shift
Experience:
- SAP Finance & Controlling: 10 years (Required)
- SAP S/4HANA: 8 years (Required)
- Data migration: 10 years (Required)
- Carve-out projects: 4 years (Required)
- SAP FICO: 10 years (Required)
Location: Hyderabad, Telangana (Preferred)
Work Location: In person
Posted 6 days ago
4.0 years
0 Lacs
Telangana
On-site
We are looking for an experienced and motivated Senior Data Engineer to join our dynamic team. The role primarily focuses on MDM and the associated ETL and real-time feed monitoring and support. This engineer will be part of the global L1/L2 production support team, which is split between Chubb Engineering Centers in India and Mexico. Key responsibilities include monitoring ETL processes, handling automated issues, and ensuring compliance with security policies. A good understanding of Informatica MDM and Azure Data Factory is preferred. The ideal candidate will have experience with PowerCenter, MDM, and Azure Data Factory, and will be able to identify and resolve data quality issues, performance bottlenecks, and other ETL-related problems while proactively monitoring production systems.

Responsibilities:
- Monitor ETL jobs, including PowerCenter/IICS, Kafka-based near-real-time updates, and batch processes.
- Troubleshoot production incidents.
- Understand data mapping and data modeling methodologies, including normal form, star, and snowflake, to reduce data redundancy and improve data integrity.
- Maintain knowledge of current and emerging developments/trends for assigned area(s) of responsibility, assess the impact, and collaborate with the Scrum Team and Leadership to

Qualifications:
- 4-year/bachelor's degree or equivalent work experience (4 years of experience in lieu of a bachelor's).
- 5+ years of experience with ETL development concepts and tools (e.g., PowerCenter and/or IICS, Informatica MDM, Azure Data Factory, Snowflake).
- Experience with Data Warehousing and Business Intelligence concepts and technologies.
- Knowledge of SQL and advanced programming languages such as Python and Java.
- Demonstrated critical-thinking skills and the ability to identify and resolve data quality issues, performance bottlenecks, and other ETL-related problems.
- Experience with Agile methodologies and project-management skills.
- Excellent communication and interpersonal skills.
- 2+ years of experience scheduling jobs using Autosys (or a comparable distributed scheduler).
- 3+ years of experience writing Unix/Linux or Windows scripts in tools such as Perl, shell script, Python, etc.
- 3+ years of experience creating complex technical specifications from business requirements/specifications.
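The production-monitoring responsibilities described above usually boil down to checking which jobs have finished inside their expected window. A minimal sketch of that SLA check; the job records, names, and SLA values are hypothetical, since a real monitor would pull statuses from the scheduler (e.g., Autosys) or the ETL tool's repository:

```python
from datetime import datetime, timedelta

# Hypothetical job-status records; a real monitor would query the
# scheduler or ETL repository for these.
now = datetime(2024, 5, 1, 6, 0)
jobs = [
    {"name": "mdm_batch_load", "started": now - timedelta(hours=3),
     "sla": timedelta(hours=2), "done": False},
    {"name": "kafka_feed_check", "started": now - timedelta(minutes=20),
     "sla": timedelta(hours=1), "done": False},
    {"name": "dq_report", "started": now - timedelta(hours=4),
     "sla": timedelta(hours=1), "done": True},
]

def breached(jobs: list, now: datetime) -> list:
    """Return names of unfinished jobs that have run past their SLA window."""
    return [j["name"] for j in jobs
            if not j["done"] and now - j["started"] > j["sla"]]

print(breached(jobs, now))  # ['mdm_batch_load']
```

Flagged jobs would then be escalated as production incidents, which is the L1/L2 loop this role supports.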
Posted 6 days ago