3.0 - 8.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Notice period: 30 days to immediate. Key skills: GCP, Python, Apache Beam. Role description: 3 to 8 years of overall IT experience, including hands-on experience in Big Data technologies. Mandatory: hands-on experience in Python and PySpark. While Python as a language is practically usable for anything, we are looking for application development, Extract-Transform-Load (ETL), and data lake curation experience using Python. Build PySpark applications using Spark DataFrames in Python, using Jupyter notebooks and the PyCharm IDE. Experience optimizing Spark jobs that process huge volumes of data. Hands-on experience with version control tools like Git. Experience with Amazon's analytics services (Amazon EMR, Amazon Athena, AWS Glue), compute services (AWS Lambda, Amazon EC2), storage (S3), and a few other services such as SNS. Experience or knowledge of bash shell scripting will be a plus. Has built ETL processes to take data, copy it, structurally transform it, etc., involving a wide variety of formats like CSV, TSV, XML, and JSON. Experience working with fixed-width, delimited, and multi-record file formats. Good to have: knowledge of data warehousing concepts (dimensions, facts, schemas — snowflake, star, etc.). Has worked with columnar storage formats (Parquet, Avro, ORC) and is well versed in compression techniques (Snappy, Gzip). Good to have: knowledge of at least one AWS database (Aurora, RDS, Redshift, ElastiCache, DynamoDB). Mandatory Skills: GCP, Apache Spark, Python, SparkSQL, Big Data Hadoop Ecosystem.
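For illustration, here is a minimal PySpark sketch of the ETL and data lake curation work this posting describes: read raw CSV, apply a light transformation, and write Snappy-compressed Parquet. The paths, column names, and filter condition are hypothetical.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-curation").getOrCreate()

# Extract: read raw CSV files (hypothetical bucket and layout)
orders = (spark.read
          .option("header", True)
          .option("inferSchema", True)
          .csv("s3://raw-bucket/orders/*.csv"))

# Transform: deduplicate, derive a date column, drop invalid rows
curated = (orders
           .dropDuplicates(["order_id"])
           .withColumn("order_date", F.to_date("order_ts"))
           .filter(F.col("amount") > 0))

# Load: write columnar Parquet with Snappy compression, partitioned by date
(curated.write
 .mode("overwrite")
 .option("compression", "snappy")
 .partitionBy("order_date")
 .parquet("s3://curated-bucket/orders/"))
```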
Posted 6 hours ago
12.0 years
5 - 10 Lacs
Hyderābād
On-site
Join Amgen’s Mission of Serving Patients At Amgen, if you feel like you’re part of something bigger, it’s because you are. Our shared mission—to serve patients living with serious illnesses—drives all that we do. Since 1980, we’ve helped pioneer the world of biotech in our fight against the world’s toughest diseases. With our focus on four therapeutic areas – Oncology, Inflammation, General Medicine, and Rare Disease – we reach millions of patients each year. As a member of the Amgen team, you’ll help make a lasting impact on the lives of patients as we research, manufacture, and deliver innovative medicines to help people live longer, fuller, happier lives. Our award-winning culture is collaborative, innovative, and science-based. If you have a passion for challenges and the opportunities that lie within them, you’ll thrive as part of the Amgen team. Join us and transform the lives of patients while transforming your career. Senior Manager, Software Development Engineering What you will do Let’s do this. Let’s change the world. In this vital role you will be responsible for designing, developing, and maintaining software applications and solutions that meet business needs, and for ensuring the availability and performance of critical systems and applications. This role involves working closely with product managers, designers, and other engineers to create high-quality, scalable software solutions, automating operations, monitoring system health, and responding to incidents to minimize downtime. Roles & Responsibilities: Provide technical leadership to enhance the culture of innovation, automation, and solving difficult scientific and business challenges. Technical leadership includes providing vision and direction to develop scalable, reliable solutions. Provide leadership to select right-sized and appropriate tools and architectures based on requirements, data source format, and current technologies. Develop, refactor, research, and improve Weave cloud platform capabilities. Understand business drivers and technical needs so our cloud services seamlessly, automatically, and securely provide the best service. Develop data flow pipelines to extract, transform, and load data from various data sources in various forms, including custom ETL pipelines that enable model and product development. Build strong partnerships with stakeholders. Build data products and service processes which perform data transformation, metadata extraction, workload management, and error processing management to ensure high-quality data. Provide clear, well-integrated documentation for delivered solutions and processes. Collaborate with business partners to understand user stories and ensure the technical solution/build can deliver to those needs. Work with multi-functional teams to design and document effective and efficient solutions. Develop change management strategies and assist in their implementation. Mentor junior data engineers on standard methodologies in the industry and in the Amgen data landscape. What we expect of you We are all different, yet we all use our unique contributions to serve patients. Basic Qualifications and Experience: Doctorate degree / Master's degree / Bachelor's degree and 12 to 17 years of Computer Science, IT, or related field experience. Preferred Skills: Must-Have Skills: Superb communication and interpersonal skills, with the ability to work cross-functionally with multi-functional GTM, product, and engineering teams.
Minimum of 10+ years of overall Software Engineer or Cloud Architect experience. Minimum 3+ years in an architecture role using public cloud solutions such as AWS. Experience with the AWS technology stack. Good-to-Have Skills: Familiarity with big data technologies, AI platforms, and cloud-based data solutions. Ability to work effectively across matrixed organizations and lead collaboration between data and AI teams. Passion for technology and customer success, particularly in driving innovative AI and data solutions. Experience working with teams of data scientists, software engineers and business experts to drive insights. Experience with AWS services such as EC2, S3, Redshift/Spectrum, Glue, Athena, RDS, Lambda, and API Gateway. Experience with Big Data technologies (Hadoop, Hive, HBase, Pig, Spark, etc.). Good understanding of relevant data standards and industry trends. Ability to understand new business requirements and prioritize them for delivery. Experience working in the biopharma/life sciences industry. Proficient in one of the coding languages (Python, Java, Scala). Hands-on experience writing SQL using any RDBMS (Redshift, Postgres, MySQL, Teradata, Oracle, etc.). Experience with schema design and dimensional data modeling. Experience with software DevOps CI/CD tools, such as Git, Jenkins, Linux, and shell scripting. Hands-on experience using Databricks/Jupyter or a similar notebook environment. Experience working with GxP systems. Experience working in an agile environment (e.g., user stories, iterative development). Experience working with test-driven development and software test automation. Experience working in a product environment. Good overall understanding of business, manufacturing, and laboratory systems common in the pharmaceutical industry, as well as the integration of these systems through applicable standards. Soft Skills: Excellent analytical and troubleshooting skills. Ability to work effectively with global, virtual teams. High degree of initiative and self-motivation. Ability to handle multiple priorities successfully. Team-oriented, with a focus on achieving team goals. What you can expect of us As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we’ll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards. Apply now and make a lasting impact with the Amgen team. careers.amgen.com As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease. Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
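As a hedged illustration of the dimensional modeling and RDBMS SQL skills this posting lists, the sketch below runs a star-schema query (a fact table joined to two dimensions) against a PostgreSQL-compatible warehouse via psycopg2. The connection details, tables, and columns are invented for the example.

```python
import psycopg2

# Hypothetical connection parameters; real credentials belong in a secrets store
conn = psycopg2.connect(host="warehouse.example.com", dbname="analytics",
                        user="report_user", password="***")

# Star-schema query: fact_sales joined to date and product dimensions
sql = """
    SELECT d.fiscal_quarter,
           p.product_family,
           SUM(f.net_sales) AS total_sales
    FROM   fact_sales  f
    JOIN   dim_date    d ON f.date_key    = d.date_key
    JOIN   dim_product p ON f.product_key = p.product_key
    WHERE  d.fiscal_year = 2024
    GROUP  BY d.fiscal_quarter, p.product_family
    ORDER  BY d.fiscal_quarter;
"""

with conn, conn.cursor() as cur:
    cur.execute(sql)
    for quarter, family, sales in cur.fetchall():
        print(quarter, family, sales)
```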
Posted 7 hours ago
3.0 years
0 Lacs
Hyderābād
On-site
DESCRIPTION The AOP team within Amazon Transportation is looking for an innovative, hands-on and customer-obsessed Business Intelligence Engineer for its Analytics team. The candidate must be detail oriented, have superior verbal and written communication skills, strong organizational skills, excellent technical skills and should be able to juggle multiple tasks at once. The ideal candidate must be able to identify problems before they happen and implement solutions that detect and prevent outages. The candidate must be able to accurately prioritize projects, make sound judgments, work to improve the customer experience and get the right things done. This job requires you to constantly hit the ground running and have the ability to learn quickly. Primary responsibilities include defining the problem and building analytical frameworks to help operations streamline the process, identifying gaps in the existing process by analyzing data and liaising with relevant team(s) to plug them, and analyzing data and metrics and sharing updates with the internal teams. Key job responsibilities 1) Apply multi-domain/process expertise in day-to-day activities and own the end-to-end roadmap. 2) Translate complex or ambiguous business problem statements into analysis requirements and maintain a high bar throughout the execution. 3) Define the analytical approach; review and vet it with stakeholders. 4) Proactively and independently work with stakeholders to construct use cases and associated standardized outputs. 5) Scale data processes and reports; write queries that clients can update themselves; lead work with data engineering for full-scale automation. 6) Have a working knowledge of the data available or needed by the wider business for more complex or comparative analysis. 7) Work with a variety of data sources and pull data using efficient query development that requires less post-processing (e.g., window functions, virt usage). 8) When needed, pull data from multiple similar sources to triangulate on data fidelity. 9) Actively manage the timeline and deliverables of projects, focusing on interactions in the team. 10) Provide program communications to stakeholders. 11) Communicate roadblocks to stakeholders and propose solutions. 12) Represent the team on medium-size analytical projects in your own organization and effectively communicate across teams. A day in the life 1) Solve ambiguous analyses with less well-defined inputs and outputs; drive to the heart of the problem and identify root causes. 2) Have the capability to handle large data sets in analysis through the use of additional tools. 3) Derive recommendations from analysis that significantly impact a department, create new processes, or change existing processes. 4) Understand the basics of test and control comparison; may provide insights through basic statistical measures such as hypothesis testing. 5) Identify and implement optimal communication mechanisms based on the data set and the stakeholders involved. 6) Communicate complex analytical insights and business implications effectively. About the team The AOP (Analytics Operations and Programs) team is on a mission to standardize BI and analytics capabilities and reduce repeat analytics/reporting/BI workload for operations across the IN, AU, BR, MX, SG, AE, EG and SA marketplaces. AOP is responsible for providing visibility into operations performance and implementing programs to improve network efficiency and defect reduction.
The team has a diverse mix of strong engineers, Analysts and Scientists who champion customer obsession. We enable operations to make data-driven decisions through developing near real-time dashboards, self-serve dive-deep capabilities and building advanced analytics capabilities. We identify and implement data-driven metric improvement programs in collaboration (co-owning) with Operations teams. BASIC QUALIFICATIONS 3+ years of analyzing and interpreting data with Redshift, Oracle, NoSQL etc. experience Experience with data visualization using Tableau, Quicksight, or similar tools Experience with data modeling, warehousing and building ETL pipelines Experience in Statistical Analysis packages such as R, SAS and Matlab Experience using SQL to pull data from a database or data warehouse and scripting experience (Python) to process data for modeling PREFERRED QUALIFICATIONS Experience with AWS solutions such as EC2, DynamoDB, S3, and Redshift Experience in data mining, ETL, etc. and using databases in a business environment with large-scale, complex datasets Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.
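This posting highlights window functions as a way to write queries that need less post-processing. A minimal sketch, assuming a hypothetical shipment_events table: the ROW_NUMBER() window keeps only the latest status row per shipment inside the query itself, so no deduplication is needed afterward. SQLite stands in here for Redshift or Oracle; the SQL pattern is standard.

```python
import sqlite3
import pandas as pd

conn = sqlite3.connect("transport.db")  # stand-in for the real warehouse

# Window function: keep only the most recent status event per shipment
sql = """
    SELECT shipment_id, status, event_ts
    FROM (
        SELECT shipment_id, status, event_ts,
               ROW_NUMBER() OVER (
                   PARTITION BY shipment_id
                   ORDER BY event_ts DESC
               ) AS rn
        FROM shipment_events
    )
    WHERE rn = 1;
"""

latest = pd.read_sql(sql, conn)
print(latest.head())
```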
Posted 7 hours ago
4.0 years
0 Lacs
Hyderābād
On-site
DESCRIPTION The Amazon Transportation team is looking for an innovative, hands-on and customer-obsessed Business Analyst for its Analytics team. The candidate must be detail oriented, have superior verbal and written communication skills, strong organizational skills, excellent technical skills and should be able to juggle multiple tasks at once. The ideal candidate must be able to identify problems before they happen and implement solutions that detect and prevent outages. The candidate must be able to accurately prioritize projects, make sound judgments, work to improve the customer experience and get the right things done. This job requires you to constantly hit the ground running and have the ability to learn quickly. Primary responsibilities include defining the problem and building analytical frameworks to help operations streamline the process, identifying gaps in the existing process by analyzing data and liaising with relevant team(s) to plug them, and analyzing data and metrics and sharing updates with the internal teams. Key job responsibilities 1) Apply multi-domain/process expertise in day-to-day activities and own the end-to-end roadmap. 2) Translate complex or ambiguous business problem statements into analysis requirements and maintain a high bar throughout the execution. 3) Define the analytical approach; review and vet it with stakeholders. 4) Proactively and independently work with stakeholders to construct use cases and associated standardized outputs. 5) Scale data processes and reports; write queries that clients can update themselves; lead work with data engineering for full-scale automation. 6) Have a working knowledge of the data available or needed by the wider business for more complex or comparative analysis. 7) Work with a variety of data sources and pull data using efficient query development that requires less post-processing (e.g., window functions, virt usage). 8) When needed, pull data from multiple similar sources to triangulate on data fidelity. 9) Actively manage the timeline and deliverables of projects, focusing on interactions in the team. 10) Provide program communications to stakeholders. 11) Communicate roadblocks to stakeholders and propose solutions. 12) Represent the team on medium-size analytical projects in your own organization and effectively communicate across teams. A day in the life 1) Solve ambiguous analyses with less well-defined inputs and outputs; drive to the heart of the problem and identify root causes. 2) Have the capability to handle large data sets in analysis through the use of additional tools. 3) Derive recommendations from analysis that significantly impact a department, create new processes, or change existing processes. 4) Understand the basics of test and control comparison; may provide insights through basic statistical measures such as hypothesis testing. 5) Identify and implement optimal communication mechanisms based on the data set and the stakeholders involved. 6) Communicate complex analytical insights and business implications effectively. About the team The AOP (Analytics Operations and Programs) team is on a mission to standardize BI and analytics capabilities and reduce repeat analytics/reporting/BI workload for operations across the IN, AU, BR, MX, SG, AE, EG and SA marketplaces. AOP is responsible for providing visibility into operations performance and implementing programs to improve network efficiency and defect reduction. The team has a diverse mix of strong engineers, Analysts and Scientists who champion customer obsession.
We enable operations to make data-driven decisions through developing near real-time dashboards, self-serve dive-deep capabilities and building advanced analytics capabilities. We identify and implement data-driven metric improvement programs in collaboration (co-owning) with Operations teams. BASIC QUALIFICATIONS 4+ years of analyzing and interpreting data with Redshift, Oracle, NoSQL etc. experience Experience with data visualization using Tableau, Quicksight, or similar tools Experience with data modeling, warehousing and building ETL pipelines Experience with AWS solutions such as EC2, DynamoDB, S3, and Redshift Experience using SQL to pull data from a database or data warehouse and scripting experience (Python) to process data for modeling Experience developing and presenting recommendations of new metrics allowing better understanding of the performance of the business 4+ years of ecommerce, transportation, finance or related analytical field experience PREFERRED QUALIFICATIONS Experience in Statistical Analysis packages such as R, SAS and Matlab Experience in data mining, ETL, etc. and using databases in a business environment with large-scale, complex datasets Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.
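Both Amazon analytics postings above mention test-and-control comparison through basic hypothesis testing. A minimal sketch with SciPy, using invented sample data: a two-sample t-test checks whether the difference between the control and treated groups is statistically significant.

```python
from scipy import stats

# Hypothetical daily defect rates for a control group and a treated group
control = [4.1, 3.8, 4.5, 4.0, 3.9, 4.2, 4.4]
treated = [3.2, 3.5, 3.1, 3.6, 3.4, 3.3, 3.0]

# Welch's two-sample t-test (does not assume equal variances)
t_stat, p_value = stats.ttest_ind(control, treated, equal_var=False)

print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("Reject the null hypothesis: the treatment likely moved the metric.")
```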
Posted 7 hours ago
5.0 years
2 - 3 Lacs
Hyderābād
On-site
Category: Business Consulting, Strategy and Digital Transformation Main location: India, Andhra Pradesh, Hyderabad Position ID: J0725-0862 Employment Type: Full Time Position Description: Job Title: Data Engineer | Experience Level: 5+ Years | Location: Hyderabad Job Summary We are looking for a seasoned and innovative Senior Data Engineer to join our dynamic data team. This role is ideal for professionals with a strong foundation in data engineering, coupled with hands-on experience in machine learning workflows, statistical analysis, and big data technologies. You will play a critical role in building scalable data pipelines, enabling advanced analytics, and supporting data science initiatives. Proficiency in Python is essential, and experience with PySpark is a strong plus. Key Responsibilities Data Pipeline Development: Design and implement scalable, high-performance ETL/ELT pipelines using Python and PySpark. ML & Statistical Integration: Collaborate with data scientists to integrate machine learning models and statistical analysis into data workflows. Data Modeling: Create and optimize data models (relational, dimensional, and columnar) to support analytics and ML use cases. Big Data Infrastructure: Manage and optimize data platforms such as Snowflake, Redshift, BigQuery, and Databricks. Performance Tuning: Monitor and enhance the performance of data pipelines and queries. Data Governance: Ensure data quality, integrity, and compliance through robust governance practices. Cross-functional Collaboration: Partner with analysts, scientists, and product teams to translate business needs into technical solutions. Automation & Monitoring: Automate data workflows and implement monitoring and alerting systems. Mentorship: Guide junior engineers and promote best practices in data engineering and ML integration. Innovation: Stay current with emerging technologies in data engineering, ML, and analytics. Required Qualifications Bachelor’s or Master’s degree in Computer Science, Data Science, Engineering, or a related field. 5+ years of experience in data engineering with a strong focus on Python and big data tools. Solid understanding of machine learning concepts and statistical analysis techniques. Proficiency in SQL and Python; experience with PySpark is highly desirable. Experience with cloud platforms (AWS, Azure, or GCP) and data tools (e.g., Glue, Data Factory, Dataflow). Familiarity with data warehousing and lakehouse architectures. Knowledge of data modeling techniques (e.g., star schema, snowflake schema). Experience with version control systems like Git. Strong problem-solving skills and ability to work in a fast-paced environment. Excellent communication and collaboration skills. Skills: English, Data Engineering, Python, SQLite, Statistical Analysis What you can expect from us: Together, as owners, let’s turn meaningful insights into action. Life at CGI is rooted in ownership, teamwork, respect and belonging. Here, you’ll reach your full potential because… You are invited to be an owner from day 1 as we work together to bring our Dream to life. That’s why we call ourselves CGI Partners rather than employees. We benefit from our collective success and actively shape our company’s strategy and direction. Your work creates value. You’ll develop innovative solutions and build relationships with teammates and clients while accessing global capabilities to scale your ideas, embrace new opportunities, and benefit from expansive industry and technology expertise.
You’ll shape your career by joining a company built to grow and last. You’ll be supported by leaders who care about your health and well-being and provide you with opportunities to deepen your skills and broaden your horizons. Come join our team—one of the largest IT and business consulting services firms in the world.
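To make the data governance and monitoring responsibilities above concrete, here is a hedged pandas sketch of a data-quality gate a pipeline might run before publishing a batch. The rules and column names are hypothetical, not a prescribed standard.

```python
import pandas as pd

def quality_gate(df: pd.DataFrame) -> list:
    """Return a list of data-quality violations; an empty list means the batch passes."""
    problems = []
    if df["customer_id"].isna().any():
        problems.append("null customer_id values")
    if df["order_id"].duplicated().any():
        problems.append("duplicate order_id values")
    if (df["amount"] < 0).any():
        problems.append("negative amounts")
    return problems

# Invented sample batch with deliberate defects
batch = pd.DataFrame({
    "customer_id": [1, 2, None],
    "order_id": [10, 11, 11],
    "amount": [99.0, -5.0, 42.0],
})

violations = quality_gate(batch)
if violations:
    print("Batch rejected:", "; ".join(violations))  # alert or fail the pipeline here
else:
    print("Batch passed all checks")
```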
Posted 7 hours ago
7.0 years
0 Lacs
India
On-site
About Us: MatchMove is a leading embedded finance platform that empowers businesses to embed financial services into their applications. We provide innovative solutions across payments, banking-as-a-service, and spend/send management, enabling our clients to drive growth and enhance customer experiences. Are You The One? As a Technical Lead Engineer - Data, you will architect, implement, and scale our end-to-end data platform built on AWS S3, Glue, Lake Formation, and DMS. You will lead a small team of engineers while working cross-functionally with stakeholders from fraud, finance, product, and engineering to enable reliable, timely, and secure data access across the business. You will champion best practices in data design, governance, and observability, while leveraging GenAI tools to improve engineering productivity and accelerate time to insight. You will contribute to: Owning the design and scalability of the data lake architecture for both streaming and batch workloads, leveraging AWS-native services. Leading the development of ingestion, transformation, and storage pipelines using AWS Glue, DMS, Kinesis/Kafka, and PySpark. Structuring and evolving data into open table formats (Apache Iceberg, Delta Lake) to support real-time and time-travel queries for downstream services. Driving data productization, enabling API-first and self-service access to curated datasets for fraud detection, reconciliation, and reporting use cases. Defining and tracking SLAs and SLOs for critical data pipelines, ensuring high availability and data accuracy in a regulated fintech environment. Collaborating with InfoSec, SRE, and Data Governance teams to enforce data security, lineage tracking, access control, and compliance (GDPR, MAS TRM). Using Generative AI tools to enhance developer productivity — including auto-generating test harnesses, schema documentation, transformation scaffolds, and performance insights. Mentoring data engineers, setting technical direction, and ensuring delivery of high-quality, observable data pipelines. Responsibilities: Architect scalable, cost-optimized pipelines across real-time and batch paradigms, using tools such as AWS Glue, Step Functions, Airflow, or EMR. Manage ingestion from transactional sources using AWS DMS, with a focus on schema drift handling and low-latency replication. Design efficient partitioning, compression, and metadata strategies for Iceberg or Hudi tables stored in S3 and cataloged with Glue and Lake Formation. Build data marts, audit views, and analytics layers that support both machine-driven processes (e.g., fraud engines) and human-readable interfaces (e.g., dashboards). Ensure robust data observability with metrics, alerting, and lineage tracking via OpenLineage or Great Expectations. Lead quarterly reviews of data cost, performance, schema evolution, and architecture design with stakeholders and senior leadership. Enforce version control, CI/CD, and infrastructure-as-code practices using GitOps and tools like Terraform. Requirements: At least 7 years of experience in data engineering. Deep hands-on experience with the AWS data stack: Glue (Jobs & Crawlers), S3, Athena, Lake Formation, DMS, and Redshift Spectrum. Expertise in designing data pipelines for real-time, streaming, and batch systems, including schema design, format optimization, and SLAs. Strong programming skills in Python (PySpark) and advanced SQL for analytical processing and transformation.
Proven experience managing data architectures using open table formats (Iceberg, Delta Lake, Hudi) at scale. Understanding of stream processing with Kinesis/Kafka and orchestration via Airflow or Step Functions. Experience implementing data access controls, encryption policies, and compliance workflows in regulated environments. Ability to integrate GenAI tools into data engineering processes to drive measurable productivity and quality gains — with strong engineering hygiene. Demonstrated ability to lead teams, drive architectural decisions, and collaborate with cross-functional stakeholders. Brownie Points: Experience working in a PCI DSS or any other central bank regulated environment with audit logging and data retention requirements. Experience in the payments or banking domain, with use cases around reconciliation, chargeback analysis, or fraud detection. Familiarity with data contracts, data mesh patterns, and data as a product principles. Experience using GenAI to automate data documentation, generate data tests, or support reconciliation use cases. Exposure to performance tuning and cost optimization strategies in AWS Glue, Athena, and S3. Experience building data platforms for ML/AI teams or integrating with model feature stores. MatchMove Culture: We cultivate a dynamic and innovative culture that fuels growth, creativity, and collaboration. Our fast-paced fintech environment thrives on adaptability, agility, and open communication. We focus on employee development, supporting continuous learning and growth through training programs, learning on the job and mentorship. We encourage speaking up, sharing ideas, and taking ownership. Embracing diversity, our team spans across Asia, fostering a rich exchange of perspectives and experiences. Together, we harness the power of fintech and e-commerce to make a meaningful impact on people's lives. Personal Data Protection Act: By submitting your application for this job, you are authorizing MatchMove to: collect and use your personal data, and to disclose such data to any third party with whom MatchMove or any of its related corporation has service arrangements, in each case for all purposes in connection with your job application, and employment with MatchMove; and retain your personal data for one year for consideration of future job opportunities (where applicable).
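As a hedged sketch of the open-table-format work this role centers on: create a day-partitioned Apache Iceberg table with Spark SQL, upsert change records with MERGE, and run a time-travel query. The catalog, table, and column names are assumptions, and the exact time-travel syntax varies across Spark/Iceberg versions (the form below targets Spark 3.3+).

```python
from pyspark.sql import SparkSession

# Assumes a Spark session already configured with an Iceberg catalog named `lake`
spark = SparkSession.builder.appName("iceberg-sketch").getOrCreate()

# Day-partitioned Iceberg table for transaction events
spark.sql("""
    CREATE TABLE IF NOT EXISTS lake.payments.txn_events (
        txn_id   BIGINT,
        status   STRING,
        amount   DECIMAL(18, 2),
        event_ts TIMESTAMP
    )
    USING iceberg
    PARTITIONED BY (days(event_ts))
""")

# Upsert change records (e.g. captured by DMS/Kafka) into the table
spark.sql("""
    MERGE INTO lake.payments.txn_events t
    USING staged_changes s
    ON t.txn_id = s.txn_id
    WHEN MATCHED THEN UPDATE SET *
    WHEN NOT MATCHED THEN INSERT *
""")

# Time travel: read the table as of an earlier snapshot id
spark.sql(
    "SELECT * FROM lake.payments.txn_events VERSION AS OF 4348291547658943875"
).show()
```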
Posted 7 hours ago
10.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Acuity Knowledge Partners (Acuity) is a leading provider of bespoke research, analytics and technology solutions to the financial services sector, including asset managers, corporate and investment banks, private equity and venture capital firms, hedge funds and consulting firms. Its global network of over 6,000 analysts and industry experts, combined with proprietary technology, supports more than 600 financial institutions and consulting companies to operate more efficiently and unlock their human capital, driving revenue higher and transforming operations. Acuity is headquartered in London and operates from 10 locations worldwide. The company fosters a diverse, equitable and inclusive work environment, nurturing talent, regardless of race, gender, ethnicity or sexual orientation. Acuity was established as a separate business from Moody’s Corporation in 2019, following its acquisition by Equistone Partners Europe (Equistone). In January 2023, funds advised by global private equity firm Permira acquired a majority stake in the business from Equistone, which remains invested as a minority shareholder. For more information, visit www.acuitykp.com Position Title: Associate Director (Senior Architect – Data). Department: IT. Location: Gurgaon/Bangalore. Job Summary The Enterprise Data Architect will enhance the company's strategic use of data by designing, developing, and implementing data models for enterprise applications and systems at conceptual, logical, business area, and application layers. This role advocates data modeling methodologies and best practices. We seek a skilled Data Architect with deep knowledge of data architecture principles, extensive data modeling experience, and the ability to create scalable data solutions. Responsibilities include developing and maintaining enterprise data architecture, ensuring data integrity, interoperability, security, and availability, with a focus on ongoing digital transformation projects. Key Responsibilities Strategy & Planning Develop and deliver long-term strategic goals for data architecture vision and standards in conjunction with data users, department managers, clients, and other key stakeholders. Create short-term tactical solutions to achieve long-term objectives and an overall data management roadmap. Establish processes for governing the identification, collection, and use of corporate metadata; take steps to assure metadata accuracy and validity. Establish methods and procedures for tracking data quality, completeness, redundancy, and improvement. Conduct data capacity planning, life cycle, duration, usage requirements, feasibility studies, and other tasks. Create strategies and plans for data security, backup, disaster recovery, business continuity, and archiving. Ensure that data strategies and architectures are aligned with regulatory compliance. Develop a comprehensive data strategy in collaboration with different stakeholders that aligns with the transformational projects’ goals. Ensure effective data management throughout the project lifecycle. Acquisition & Deployment Ensure the success of enterprise-level application rollouts (e.g., ERP, CRM, HCM, FP&A). Liaise with vendors and service providers to select the products or services that best meet company goals. Operational Management Assess and determine governance, stewardship, and frameworks for managing data across the organization. Develop and promote data management methodologies and standards. Document information products from business processes and create data entities. Create entity relationship diagrams to show the digital thread across the value streams and enterprise. Create data normalization across all systems and databases to ensure there is a common definition of data entities across the enterprise. Document enterprise reporting needs and develop the data strategy to enable a single source of truth for all reporting data. Address the regulatory compliance requirements of each country and ensure our data is secure and compliant. Select and implement the appropriate tools, software, applications, and systems to support data technology goals. Oversee the mapping of data sources, data movement, interfaces, and analytics, with the goal of ensuring data quality. Collaborate with project managers and business unit leaders for all projects involving enterprise data. Address data-related problems regarding systems integration, compatibility, and multiple-platform integration. Act as a leader and advocate of data management, including coaching, training, and career development to staff. Develop and implement key components as needed to create testing criteria to guarantee the fidelity and performance of data architecture. Document the data architecture and environment to maintain a current and accurate view of the larger data picture. Identify and develop opportunities for data reuse, migration, or retirement. Data Architecture Design: Develop and maintain the enterprise data architecture, including data models, databases, data warehouses, and data lakes. Design and implement scalable, high-performance data solutions that meet business requirements. Data Governance: Establish and enforce data governance policies and procedures as agreed with stakeholders. Maintain data integrity, quality, and security within Finance, HR and other such enterprise systems. Data Migration: Oversee the data migration process from legacy systems to the new systems being put in place. Define and manage data mappings, cleansing, transformation, and validation to ensure accuracy and completeness. Master Data Management: Devise processes to manage master data (e.g., customer, vendor, product information) to ensure consistency and accuracy across enterprise systems and business processes. Provide data management (create, update and delimit) methods to ensure master data is governed. Stakeholder Collaboration: Collaborate with various stakeholders, including business users and other system vendors, to understand data requirements. Ensure the enterprise system meets the organization's data needs. Training and Support: Provide training and support to end-users on data entry, retrieval, and reporting within the candidate enterprise systems. Promote user adoption and proper use of data. Data Quality Assurance: Implement data quality assurance measures to identify and correct data issues. Ensure the Oracle Fusion and other enterprise systems contain reliable and up-to-date information. Reporting and Analytics: Facilitate the development of reporting and analytics capabilities within Oracle Fusion and other systems. Enable data-driven decision-making through robust data analysis. Continuous Improvement: Continuously monitor and improve data processes and the data capabilities of Oracle Fusion and other systems. Leverage new technologies for enhanced data management to support evolving business needs.
Technology and Tools: Oracle Fusion Cloud. Data modeling tools (e.g., ER/Studio, ERwin). ETL tools (e.g., Informatica, Talend, Azure Data Factory). Data pipelines: understanding of data pipeline tools like Apache Airflow and AWS Glue. Database management systems: Oracle Database, MySQL, SQL Server, PostgreSQL, MongoDB, Cassandra, Couchbase, Redis, Hadoop, Apache Spark, Amazon RDS, Google BigQuery, Microsoft Azure SQL Database, Neo4j, OrientDB, Memcached. Data governance tools (e.g., Collibra, Informatica Axon, Oracle EDM, Oracle MDM). Reporting and analytics tools (e.g., Oracle Analytics Cloud, Power BI, Tableau, Oracle BIP). Hyperscalers / cloud platforms (e.g., AWS, Azure). Big data technologies such as Hadoop, HDFS, MapReduce, and Spark. Cloud platforms such as Amazon Web Services (including RDS, Redshift, and S3), Microsoft Azure services like Azure SQL Database and Cosmos DB, and experience in Google Cloud Platform services such as BigQuery and Cloud Storage. Programming languages (e.g., Java, J2EE, EJB, .NET, WebSphere). SQL: strong SQL skills for querying and managing databases. Python: proficiency in Python for data manipulation and analysis. Java: knowledge of Java for building data-driven applications. Data Security and Protocols: understanding of data security protocols and compliance standards. Key Competencies Qualifications: Education: Bachelor’s degree in Computer Science, Information Technology, or a related field; Master’s degree preferred. Experience: 10+ years overall and at least 7 years of experience in data architecture, data modeling, and database design. Proven experience with data warehousing, data lakes, and big data technologies. Expertise in SQL and experience with NoSQL databases. Experience with cloud platforms (e.g., AWS, Azure) and related data services. Experience with Oracle Fusion or similar ERP systems is highly desirable. Skills: Strong understanding of data governance and data security best practices. Excellent problem-solving and analytical skills. Strong communication and interpersonal skills. Ability to work effectively in a collaborative team environment. Leadership experience with a track record of mentoring and developing team members. Excellent documentation and presentation skills. Good knowledge of applicable data privacy practices and laws. Certifications: Relevant certifications (e.g., Certified Data Management Professional, AWS Certified Big Data – Specialty) are a plus. Behavioral: A self-starter, an excellent planner and executor and, above all, a good team player. Excellent communication and interpersonal skills are a must. Must possess organizational skills, including multi-task capability, priority setting and meeting deadlines. Ability to build collaborative relationships and effectively leverage networks to mobilize resources. Initiative to learn the business domain is highly desirable. Enjoys a dynamic, constantly evolving environment and requirements.
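Since this role expects familiarity with pipeline tools like Apache Airflow and AWS Glue, a minimal Airflow 2.x DAG sketch follows: a daily extract-transform-load chain. The task bodies and names are placeholders, not a real pipeline.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull source data")          # placeholder for real extraction logic

def transform():
    print("clean and model the data")  # placeholder

def load():
    print("publish to the warehouse")  # placeholder

with DAG(
    dag_id="daily_enterprise_etl",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",   # `schedule_interval` on Airflow versions before 2.4
    catchup=False,
) as dag:
    t1 = PythonOperator(task_id="extract", python_callable=extract)
    t2 = PythonOperator(task_id="transform", python_callable=transform)
    t3 = PythonOperator(task_id="load", python_callable=load)

    t1 >> t2 >> t3
```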
Posted 7 hours ago
0 years
0 Lacs
Gurugram, Haryana, India
Remote
Every day, tens of millions of people come to Roblox to explore, create, play, learn, and connect with friends in 3D immersive digital experiences – all created by our global community of developers and creators. At Roblox, we’re building the tools and platform that empower our community to bring any experience that they can imagine to life. Our vision is to reimagine the way people come together, from anywhere in the world, and on any device. We’re on a mission to connect a billion people with optimism and civility, and are looking for amazing talent to help us get there. A career at Roblox means you’ll be working to shape the future of human interaction, solving unique technical challenges at scale, and helping to create safer, more civil shared experiences for everyone. Roblox Operating System (ROS) is our internal productivity platform that governs how Roblox operates as a company. Through an integrated suite of tools, ROS shapes how we make talent and personnel decisions, plan and organize work, discover knowledge, and scale efficiently. We are seeking a Senior Data Engineer to enhance our data posture and architecture, synchronizing data across vital third-party systems like Workday, Greenhouse, GSuite, and JIRA, as well as our internal Roblox OS application database. Our Roblox OS app suite encompasses internal tools and third-party applications for People Operations, Talent Acquisition, Budgeting, Roadmapping, and Business Analytics. We envision an integrated platform that streamlines processes while providing employees and leaders with the information they need to support the business. This is a new team in our Roblox India location, working closely with data scientists & analysts, product & engineering, and other stakeholders in India & US. You will report to the Engineering Manager of the Roblox OS Team in your local location and collaborate with Roblox internal teams globally. Work Model: This role is based in Gurugram and follows a hybrid structure — 3 days from the office (Tuesday, Wednesday & Thursday) and 2 days work from home. Shift Time: 2:00pm - 10:30pm IST (Cabs will be provided) You Will Design and Build Scalable Data Pipelines: Architect, develop, and maintain robust, scalable data pipelines using orchestration frameworks like Airflow to synchronize data between internal systems. Implement and Optimize ETL Processes: Apply a strong understanding of ETL (Extract, Transform, Load) processes and best practices for seamless data integration and transformation. Develop Data Solutions with SQL: Utilize your proficiency in SQL and relational databases (e.g., PostgreSQL) for advanced querying, data modeling, and optimizing data solutions. Contribute to Data Architecture: Actively participate in data architecture and implementation discussions, ensuring data integrity and efficient data transposition. Manage and optimize data infrastructure, including databases, cloud storage solutions, and API endpoints. Write High-Quality Code: Focus on developing clear, readable, testable, modular, and well-monitored code for data manipulation, automation, and software development with a strong emphasis on data integrity. Troubleshoot and Optimize Performance: Apply excellent analytical and problem-solving skills to diagnose data issues and optimize pipeline performance. Collaborate Cross-Functionally: Work effectively with cross-functional teams, including data scientists, analysts, and business stakeholders, to translate business needs into technical data solutions.
Ensure Data Governance and Security: Implement data anonymization and pseudonymization techniques to protect sensitive data, and contribute to master data management (MDM) concepts including data quality, lineage, and governance frameworks. You Have Data Engineering Expertise: At least 6+ years of proven experience designing, building, and maintaining scalable data pipelines, coupled with a strong understanding of ETL processes and best practices for data integration. Database and Data Warehousing Proficiency: Deep proficiency in SQL and relational databases (e.g., PostgreSQL), and familiarity with at least one cloud-based data warehouse solution (e.g., Snowflake, Redshift, BigQuery). Technical Acumen: Strong scripting skills for data manipulation and automation. Familiarity with data streaming platforms (e.g., Kafka, Kinesis), and knowledge of containerization (e.g., Docker) and cloud infrastructure (e.g., AWS, Azure, GCP) for deploying and managing data solutions. Data & Cloud Infrastructure Management: Experience with managing and optimizing data infrastructure, including databases, cloud storage solutions, and configuring API endpoints. Software Development Experience: Experience in software development with a focus on data integrity and transposition, and a commitment to writing clear, readable, testable, modular, and well-monitored code. Problem-Solving & Collaboration Skills: Excellent analytical and problem-solving abilities to troubleshoot complex data issues, combined with strong communication and collaboration skills to work effectively across teams. Passion for Data: A genuine passion for working with large amounts of data from various sources, understanding the critical impact of data quality on company strategy at an executive level. Adaptability: Ability to thrive and deliver results in a fast-paced environment with competing priorities. Roles that are based in an office are onsite Tuesday, Wednesday, and Thursday, with optional presence on Monday and Friday (unless otherwise noted). Roblox provides equal employment opportunities to all employees and applicants for employment and prohibits discrimination and harassment of any type without regard to race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state or local laws. Roblox also provides reasonable accommodations for all candidates during the interview process.
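The governance bullet above calls out pseudonymization for protecting sensitive records when syncing HR systems. A stdlib-only sketch, with simplified key handling for illustration: a keyed HMAC-SHA256 maps an employee email to a stable, irreversible token, so joins across systems still work without exposing the raw value.

```python
import hashlib
import hmac

# In production the key would come from a secrets manager, never source code
SECRET_KEY = b"rotate-me-regularly"

def pseudonymize(value: str) -> str:
    """Deterministically map a sensitive value to a stable, irreversible token."""
    digest = hmac.new(SECRET_KEY, value.lower().encode("utf-8"), hashlib.sha256)
    return digest.hexdigest()[:16]  # truncated for readability in joins and logs

# The same input always yields the same token after normalization
print(pseudonymize("jane.doe@example.com"))
print(pseudonymize("Jane.Doe@example.com"))  # identical output
```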
Posted 7 hours ago
1.0 years
0 Lacs
Hyderābād
Remote
DESCRIPTION Want to join the Earth’s most customer-centric company? Do you like to dive deep to understand problems? Are you someone who likes to challenge the status quo? Do you strive to excel at goals assigned to you? If yes, we have opportunities for you. Global Operations – Artificial Intelligence (GO-AI) at Amazon is looking to hire candidates who can excel in a fast-paced dynamic environment. Are you somebody who likes to use and analyze big data to drive business decisions? Do you enjoy converting data into insights that will be used to enhance customer decisions worldwide for business leaders? Do you want to be part of the data team which measures the pulse of innovative machine-vision-based projects? If your answer is yes, join our team. GO-AI is looking for a motivated individual with strong skills and experience in resource utilization planning, process optimization and execution of scalable and robust operational mechanisms, to join the GO-AI Ops DnA team. In this position you will be responsible for supporting our sites to build solutions for the rapidly expanding GO-AI team. The role requires the ability to work with a variety of key stakeholders across job functions with multiple sites. We are looking for an entrepreneurial and analytical program manager, who is passionate about their work, understands how to manage service levels across multiple skills/programs, and who is willing to move fast and experiment often. Key job responsibilities Design and develop highly available dashboards and metrics using SQL and Excel/Tableau. Execute high-priority (i.e., cross-functional, high-impact) projects to create robust, scalable analytics solutions and frameworks with the help of Analytics/BIE managers. Work closely with internal stakeholders such as business teams, engineering teams, and partner teams and align them with respect to your focus area. Create and maintain comprehensive business documentation including user stories, acceptance criteria, and process flows that help the BIE understand the context for developing ETL processes and visualization solutions. Perform user acceptance testing and business validation of delivered dashboards and reports, ensuring that BIE-created solutions meet actual operational needs and can be effectively utilized by site managers and operations teams. Monitor business performance metrics and operational KPIs to proactively identify emerging analytical requirements, working with BIEs to rapidly develop solutions that address real-time operational challenges in the dynamic AI-enhanced fulfillment environment. About the team The Global Operations – Artificial Intelligence (GO-AI) team remotely handles exceptions in Amazon Robotic Fulfillment Centers globally. GO-AI seeks to complement automated vision-based decision-making technologies by providing remote human support for the subset of tasks which require higher cognitive ability and cannot be processed through automated decision making with high confidence. This team provides end-to-end solutions through inbuilt competencies of Operations and strong central specialized teams to deliver programs at Amazon scale. It is operating multiple programs including Nike IDS, Proteus, Sparrow and other new initiatives in partnership with global technology and operations teams.
BASIC QUALIFICATIONS Experience defining requirements and using data and metrics to draw business insights Knowledge of SQL Knowledge of data visualization tools such as Quick Sight, Tableau, Power BI or other BI packages Knowledge of Python, VBA, Macros, Selenium scripts 1+ year of experience working in Analytics / Business Intelligence environment with prior experience of design and execution of analytical projects PREFERRED QUALIFICATIONS Experience in using AI tools Experience in Amazon Redshift and other AWS technologies for large datasets Analytical mindset and ability to see the big picture and influence others Detail-oriented and must have an aptitude for solving unstructured problems. The role will require the ability to extract data from various sources and to design/construct/execute complex analyses to finally come up with data/reports that help solve the business problem Good oral, written and presentation skills combined with the ability to be part of group discussions and explaining complex solutions Ability to apply analytical, computer, statistical and quantitative problem solving skills is required Ability to work effectively in a multi-task, high volume environment Ability to be adaptable and flexible in responding to deadlines and workflow fluctuations Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.
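For the dashboarding duties above, here is a hedged sketch that pulls per-site daily metrics with SQL and pivots them into a site-by-day KPI grid with pandas, the kind of table an Excel or Tableau extract could consume. SQLite stands in for the real warehouse, and the table and column names are invented.

```python
import sqlite3
import pandas as pd

conn = sqlite3.connect("go_ai_ops.db")  # stand-in for Redshift or similar

# Pull per-site daily exception-handling counts
df = pd.read_sql("""
    SELECT site, DATE(handled_ts) AS day, COUNT(*) AS exceptions_handled
    FROM   exception_tasks
    GROUP  BY site, DATE(handled_ts)
""", conn)

# Pivot into a dashboard-friendly grid: one row per site, one column per day
kpi = df.pivot_table(index="site", columns="day",
                     values="exceptions_handled", fill_value=0)
print(kpi)
```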
Posted 7 hours ago
15.0 years
0 Lacs
Gurgaon
On-site
DESCRIPTION At AWS, we are looking for a Delivery Practice Manager with a successful record of leading enterprise customers through a variety of transformative projects involving IT Strategy, distributed architecture, and hybrid cloud operations. AWS Global Services includes experts from across AWS who help our customers design, build, operate, and secure their cloud environments. Customers innovate with AWS Professional Services, upskill with AWS Training and Certification, optimize with AWS Support and Managed Services, and meet objectives with AWS Security Assurance Services. Our expertise and emerging technologies include AWS Partners, AWS Sovereign Cloud, AWS International Product, and the Generative AI Innovation Center. You’ll join a diverse team of technical experts in dozens of countries who help customers achieve more with the AWS cloud. Professional Services engage in a wide variety of projects for customers and partners, providing collective experience from across the AWS customer base, and are obsessed with strong success for the Customer. Our team collaborates across the entire AWS organization to bring access to product and service teams, to get the right solution delivered and drive feature innovation based upon customer needs. Key job responsibilities - Engage customers - collaborate with enterprise sales managers to develop strong customer and partner relationships and build a growing business in a geographic territory, driving AWS adoption in key markets and accounts. - Drive infrastructure engagements - including short on-site projects proving the value of AWS services to support new distributed computing models. - Coach and teach - collaborate with AWS field sales, pre-sales, training and support teams to help partners and customers learn and use AWS services such as Amazon Databases – RDS/Aurora/DynamoDB/Redshift, Amazon Elastic Compute Cloud (EC2), Amazon Simple Storage Service (S3), AWS Identity and Access Management (IAM), etc. - Deliver value - lead high quality delivery of a variety of customized engagements with partners and enterprise customers in the commercial and public sectors. - Lead great people - attract top IT architecture talent to build high performing teams of consultants with superior technical depth and customer relationship skills. - Be a customer advocate - work with AWS engineering teams to convey partner and enterprise customer feedback as input to AWS technology roadmaps. - Build organization assets - identify patterns and implement solutions that can be leveraged across the customer base. Improve productivity through tooling and process improvements. About the team Diverse Experiences AWS values diverse experiences. Even if you do not meet all of the qualifications and skills listed in the job description, we encourage candidates to apply. If your career is just starting, hasn’t followed a traditional path, or includes alternative experiences, don’t let it stop you from applying. Why AWS? Amazon Web Services (AWS) is the world’s most comprehensive and broadly adopted cloud platform. We pioneered cloud computing and never stopped innovating — that’s why customers from the most successful startups to Global 500 companies trust our robust suite of products and services to power their businesses. Inclusive Team Culture AWS values curiosity and connection. Our employee-led and company-sponsored affinity groups promote inclusion and empower our people to take pride in what makes us unique.
Our inclusion events foster stronger, more collaborative teams. Our continual innovation is fueled by the bold ideas, fresh perspectives, and passionate voices our teams bring to everything we do. Mentorship & Career Growth We’re continuously raising our performance bar as we strive to become Earth’s Best Employer. That’s why you’ll find endless knowledge-sharing, mentorship and other career-advancing resources here to help you develop into a better-rounded professional. Work/Life Balance We value work-life harmony. Achieving success at work should never come at the expense of sacrifices at home, which is why we strive for flexibility as part of our working culture. When we feel supported in the workplace and at home, there’s nothing we can’t achieve in the cloud. BASIC QUALIFICATIONS Bachelor’s degree in Information Science / Information Technology, Computer Science, Engineering, Mathematics, Physics, or a related field. 15+ years of IT implementation and/or delivery experience, with 5+ years working in an IT Professional Services and/or consulting organization, and 5+ years of direct people management leading a team of consultants. Deep understanding of cloud computing, adoption strategy, and transition challenges. Experience managing a consulting practice or teams responsible for KRAs. Ability to travel to client locations to deliver professional services as needed. PREFERRED QUALIFICATIONS Demonstrated ability to think strategically about business, product, and technical challenges. Vertical industry sales and delivery experience of contemporary services and solutions. Experience with design of modern, scalable delivery models for technology consulting services. Business development experience including complex agreements with integrators and ISVs. International sales and delivery experience with global F500 enterprise customers and partners. Direct people management experience leading a team of at least 20, or manager-of-manager experience in a consulting practice. Use of AWS services in distributed environments with Microsoft, IBM, Oracle, HP, SAP, etc. Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.
Posted 7 hours ago
3.0 years
3 - 10 Lacs
Chennai
On-site
DESCRIPTION Are you passionate about solving business challenges at a global scale? Amazon Employee Services is looking for an experienced Business Analyst to join the Retail Business Services team and help unlock insights which take our business to the next level. The candidate will be excited about understanding and implementing new and repeatable processes to improve our employee global work authorization experiences. They will do this by partnering with key stakeholders, staying curious, and digging deep into the business challenges to identify insights that help us define standards and scale this program globally. They will be comfortable delivering/presenting these recommended solutions by retrieving and integrating artifacts in a format that is immediately useful to improve the business decision-making process. This role requires an individual with excellent analytical abilities as well as outstanding business acumen. The candidate knows and values our customers (internal and external) and will work back from the customer to create structured processes for global expansions of work authorization, and help integrate new countries/new acquisitions into the existing program. They are experts in partnering and earning trust with operations/business leaders to drive these key business decisions.
Responsibilities: Own the development and maintenance of new and existing artifacts focused on analysis of requirements, metrics, and reporting dashboards. Partner with operations/business teams to consult, develop and implement KPIs, automated reporting/process solutions, and process improvements to meet business needs. Enable effective decision making by retrieving and aggregating data from multiple sources and compiling it into a digestible and actionable format. Prepare and deliver business requirements reviews to the senior management team regarding progress and roadblocks. Participate in strategic and tactical planning discussions. Design, develop and maintain scaled, automated, user-friendly systems, reports, dashboards, etc. that will support our business needs. Excellent writing skills, to create artifacts easily digestible by business and tech partners.
Key job responsibilities Design and develop highly available dashboards and metrics using SQL and Excel/Tableau/QuickSight Understand the requirements of stakeholders and map them to the data sources/data warehouse Own the delivery and backup of periodic metrics and dashboards to the leadership team Draw inferences and conclusions, create dashboards and visualizations of processed data, and identify trends and anomalies Execute high-priority (i.e., cross-functional, high-impact) projects to improve operations performance with the help of Operations Analytics managers Perform business analysis and data queries using appropriate tools Work closely with internal stakeholders such as business teams, engineering teams, and partner teams and align them with respect to your focus area
BASIC QUALIFICATIONS 3+ years of Excel or Tableau (data manipulation, macros, charts and pivot tables) experience Experience defining requirements and using data and metrics to draw business insights Experience with SQL or ETL Knowledge of data visualization tools such as QuickSight, Tableau, Power BI or other BI packages 1+ years of tax, finance or a related analytical field experience
PREFERRED QUALIFICATIONS Experience in Amazon Redshift and other AWS technologies Experience creating complex SQL queries joining multiple datasets, and ETL/DW concepts Experience in Scala and PySpark
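For a concrete flavor of the "retrieve, aggregate, and compile into a digestible format" responsibility above, here is a minimal, hedged pandas sketch; the file, column, and metric names are hypothetical, not taken from the role.

```python
# Minimal sketch (pandas assumed available): aggregating two hypothetical
# extracts into a single KPI summary of the kind a dashboard consumes.
import pandas as pd

# Hypothetical inputs: a case-level extract and a country reference file.
cases = pd.read_csv("work_auth_cases.csv", parse_dates=["opened_date"])
countries = pd.read_csv("country_reference.csv")

# Join, then compute a cycle-time metric per case.
df = cases.merge(countries, on="country_code", how="left")
df["cycle_days"] = (pd.Timestamp.today() - df["opened_date"]).dt.days

# Digestible, dashboard-ready pivot: case volume and median cycle time by region/status.
summary = (
    df.pivot_table(index="region", columns="status",
                   values="cycle_days", aggfunc=["count", "median"])
      .round(1)
)
summary.to_csv("kpi_summary.csv")  # hand-off to Excel/Tableau/QuickSight
```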
Posted 7 hours ago
7.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
We Are #Hiring: DevOps Engineer with Database Expertise! 🕐 Availability: Immediate Joiners Only 📍 Location: Hyderabad (Hybrid), India 📌 Experience: 7+ years Are you a DevOps Engineer with a passion for database management and cloud infrastructure? Do you have experience working with MySQL, Redshift, PostgreSQL, and other modern databases like MongoDB, Cassandra, and Redis? If yes, we have an exciting opportunity for you! 🔹 What You’ll Do: ✅ Design, implement, and maintain CI/CD pipelines for seamless deployment. ✅ Optimize and administer cloud infrastructure (AWS, GCP, or Azure) for scalability. ✅ Manage, secure, and optimize database performance across multiple platforms. ✅ Automate infrastructure provisioning and ensure high availability. ✅ Collaborate with development & data teams to enhance performance. 🔹 What We’re Looking For: 📌 7 to 8 years of DevOps experience with a strong focus on database administration. 📌 Hands-on expertise with MySQL, Redshift, PostgreSQL, and NoSQL databases. 📌 Proficiency in cloud platforms (AWS preferred), Docker, Kubernetes, Terraform, and Python scripting. 📌 Strong knowledge of security, compliance, and database scaling techniques. If you or someone you know is a great fit, let’s connect! Drop your resume to pranay.t@grexinnovation.com 📩
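As one small, hedged example of the automation-plus-databases blend this post asks for, here is a Python connectivity check across PostgreSQL and Redshift (both speak the Postgres wire protocol), assuming psycopg2 is installed and DSNs are supplied via environment variables; all target names and variables are hypothetical.

```python
# Minimal availability-check sketch; DSN environment variables are hypothetical.
import os
import psycopg2  # works for both PostgreSQL and Redshift endpoints

TARGETS = {
    "postgres_primary": os.environ["PG_DSN"],
    "redshift_warehouse": os.environ["REDSHIFT_DSN"],
}

def check(name: str, dsn: str) -> bool:
    """Open a connection, run a trivial query, and report success or failure."""
    try:
        with psycopg2.connect(dsn, connect_timeout=5) as conn:
            with conn.cursor() as cur:
                cur.execute("SELECT 1")
                cur.fetchone()
        print(f"[OK]   {name}")
        return True
    except Exception as exc:
        print(f"[FAIL] {name}: {exc}")
        return False

if __name__ == "__main__":
    results = [check(n, d) for n, d in TARGETS.items()]
    raise SystemExit(0 if all(results) else 1)  # non-zero exit for CI/monitoring hooks
```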
Posted 7 hours ago
8.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Job Purpose: We are looking for a highly skilled and experienced Data Engineering professional to lead our data engineering team. The ideal candidate will possess a strong technical background, strong project management abilities, and excellent client handling/stakeholder management skills. This role requires a strategic thinker who can drive the design, development and implementation of data solutions that meet our clients’ needs while ensuring the highest standards of quality and efficiency.
Job Responsibilities Technology Leadership – Lead and guide the team, independently or with little support, to design, implement, and deliver complex cloud-based data engineering / data warehousing project assignments. Manage projects in a fast-paced agile ecosystem, ensuring quality deliverables within stringent timelines. Responsible for risk management, maintaining the risk documentation and mitigation plans. Drive continuous improvement in a Lean/Agile environment, implementing DevOps delivery approaches encompassing CI/CD, build automation and deployments. Communication & Logical Thinking – Demonstrates strong analytical skills, employing a systematic and logical approach to data analysis, problem-solving, and situational assessment. Capable of effectively presenting and defending team viewpoints, while securing buy-in from both technical and client stakeholders. Handle Client Relationship – Manage client relationships and expectations independently, and deliver results back to the client independently. Should have excellent communication skills.
Work Experience Should have expertise and 8+ years of working experience in at least two ETL tools among Matillion, DBT, PySpark/Python, Informatica, and Talend. Should have expertise and working experience in at least two databases among Databricks, Redshift, Snowflake, SQL Server, Oracle. Should have strong data warehousing, data integration and data modeling fundamentals such as star schema, snowflake schema, dimension tables and fact tables. Strong experience with SQL building blocks, including creating complex SQL queries and procedures. Experience in AWS or Azure cloud and their service offerings. Aware of techniques such as data modelling, performance tuning and regression testing. Willingness to learn and take ownership of tasks. Excellent written/verbal communication and problem-solving skills. Understanding of and working experience with pharma commercial data sets (IQVIA, Veeva, Symphony, Liquid Hub, Cegedim, etc.) would be an advantage. Good experience working in pharma or life sciences domain projects.
Education BE/B.Tech, MCA, M.Sc., M.Tech with 60%+
Why Axtria: Axtria is a global provider of cloud software and data analytics to the Life Sciences industry. We help Life Sciences companies transform the product commercialization journey to drive sales growth and improve healthcare outcomes for patients. We are acutely aware that our work impacts millions of patients and lead passionately to improve their lives.
We will provide (Employee Value Proposition): an inclusive environment that encourages diverse perspectives and ideas; challenging and unique opportunities to contribute to the success of a transforming organization; the opportunity to work on technical challenges that may have impact across geographies; vast opportunities for self-development (the online Axtria Institute, global knowledge-sharing, and learning through external certifications); sponsored Tech Talks & Hackathons; and the possibility to relocate to any Axtria office for short- and long-term projects. Benefit package: health benefits, retirement benefits, paid time off, flexible benefits, hybrid/FT office. Axtria is an equal-opportunity employer that values diversity and inclusiveness in the workplace.
A few more links are included below if you would like to know more about Axtria’s journey as an organization, its culture, and its product and solution offerings. For white papers: Research Hub: https://www.axtria.com/axtria-research-hub-pharmaceutical-industry/ For Axtria product and capability related content: 5-step guides: https://www.axtria.com/axtria-5-step-guides-sales-marketing-data-management-best-practices/ For recent marketing videos, including Jassi’s public discussions: Video Wall: https://www.axtria.com/video-wall/ Infographic points of view on industry, therapy areas, etc.: https://www.axtria.com/video-wall/
Posted 8 hours ago
7.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Position Overview: As a Data Architect, you are responsible for designing and managing scalable, secure, and high-performance data architectures that support GEDU and customer needs. This role ensures that the GEDU’s data assets are structured and managed in a way that enables the business to generate insights, make data-driven decisions, and maintain data integrity across the GEDU and Customers. The Data Architect will work closely with business leaders, data engineers, data scientists, and IT teams to align the data architecture with the GEDU’s strategic goals. Key Responsibilities: Data Architecture Design: Design, develop, and maintain the enterprise data architecture, including data models, database schemas, and data flow diagrams. Develop a data strategy and roadmap that aligns with GEDU business objectives and ensures the scalability of data systems. Architect both transactional (OLTP) and analytical (OLAP) databases, ensuring optimal performance and data consistency. Data Integration & Management: Oversee the integration of disparate data sources into a unified data platform, leveraging ETL/ELT processes and data integration tools. Design and implement data warehousing solutions, data lakes, and/or data marts that enable efficient storage and retrieval of large datasets. Ensure proper data governance, including the definition of data ownership, security, and privacy controls in accordance with compliance standards (GDPR, HIPAA, etc.). Collaboration with Stakeholders: Work closely with business stakeholders, including analysts, developers, and executives, to understand data requirements and ensure that the architecture supports analytics and reporting needs. Collaborate with DevOps and engineering teams to optimize database performance and support large-scale data processing pipelines. Technology Leadership: Guide the selection of data technologies, including databases (SQL/NoSQL), data processing frameworks (Hadoop, Spark), cloud platforms (Azure is a must), and analytics tools. Stay updated on emerging data management technologies, trends, and best practices, and assess their potential application within the organization. Data Quality & Security: Define data quality standards and implement processes to ensure the accuracy, completeness, and consistency of data across all systems. Establish protocols for data security, encryption, and backup/recovery to protect data assets and ensure business continuity. Mentorship & Leadership: Lead and mentor data engineers, data modelers, and other technical staff in best practices for data architecture and management. Provide strategic guidance on data-related projects and initiatives, ensuring that all efforts are aligned with the enterprise data strategy. Extensive Data Architecture Expertise: Over 7 years of experience in data architecture, data modeling, and database management. Proficiency in designing and implementing relational (SQL) and non-relational (NoSQL) database solutions. Strong experience with data integration tools (Azure Tools are a must + any other third-party tools), ETL/ELT processes, and data pipelines. Advanced Knowledge of Data Platforms: Expertise in Azure cloud data platform is a must. Other platforms such as AWS (Redshift, S3), Azure (Data Lake, Synapse), and/or Google Cloud Platform (BigQuery, Dataproc) is a bonus. Experience with big data technologies (Hadoop, Spark) and distributed systems for large-scale data processing. 
Hands-on experience with data warehousing solutions and BI tools (e.g., Power BI, Tableau, Looker). Data Governance & Compliance: Strong understanding of data governance principles, data lineage, and data stewardship. Knowledge of industry standards and compliance requirements (e.g., GDPR, HIPAA, SOX) and the ability to architect solutions that meet these standards. Technical Leadership: Proven ability to lead data-driven projects, manage stakeholders, and drive data strategies across the enterprise. Strong programming skills in languages such as Python, SQL, R, or Scala. Pre-Sales Responsibilities: Stakeholder Engagement: Work with product stakeholders to analyze functional and non-functional requirements, ensuring alignment with business objectives. Solution Development: Develop end-to-end solutions involving multiple products, ensuring security and performance benchmarks are established, achieved, and maintained. Proof of Concepts (POCs): Develop POCs to demonstrate the feasibility and benefits of proposed solutions. Client Communication: Communicate system requirements and solution architecture to clients and stakeholders, providing technical assistance and guidance throughout the pre-sales process. Technical Presentations: Prepare and deliver technical presentations to prospective clients, demonstrating how proposed solutions meet their needs and requirements. To know our privacy policy, please click the link below: https://gedu.global/wp-content/uploads/2023/09/GEDU-Privacy-Policy-22092023-V2.0-1.pdf
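To illustrate the OLTP-to-OLAP, ETL/ELT flow this role architects, here is a minimal, hedged Python sketch using stdlib sqlite3 as a stand-in transactional source and Parquet as the analytical layer; table, column, and path names are hypothetical.

```python
# Illustrative ELT sketch of an OLTP-to-analytics flow; sqlite3 stands in for
# the transactional system, Parquet for the lake/warehouse layer.
import sqlite3
import pandas as pd

# Extract: pull raw orders from the transactional (OLTP) system.
with sqlite3.connect("oltp.db") as conn:
    orders = pd.read_sql_query(
        "SELECT order_id, customer_id, order_ts, amount FROM orders", conn
    )

# Transform: derive analytical (OLAP-friendly) attributes.
orders["order_ts"] = pd.to_datetime(orders["order_ts"])
orders["order_month"] = orders["order_ts"].dt.to_period("M").astype(str)

# Load: write a partitioned columnar layer for downstream analytics to consume.
orders.to_parquet("lake/orders", partition_cols=["order_month"])  # requires pyarrow
```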
Posted 8 hours ago
5.0 years
0 Lacs
Mumbai Metropolitan Region
On-site
Description At Amazon Ads, we sit at the intersection of Advertising, Media and eCommerce. With millions of customers visiting us every day to find, discover, and buy products, we believe that advertising, when done well, can enhance the value of the customer experience and generate a positive ROI for our advertising partners. We strive to make advertising relevant so that customers welcome it - across Amazon’s ecosystem of mobile and desktop websites, proprietary devices, and the Amazon Advertising business. If you’re interested in innovative advertising solutions with a relentless focus on the customer, you’ve come to the right place! As a Business Analyst in the Amazon Ads team, you will be responsible for analyzing advertising performance data, providing insights to drive business decisions, and supporting the growth of Amazon's advertising business in India. We are seeking an experienced and highly skilled Reporting & Automation Specialist to lead our data analytics and reporting efforts. This role will be responsible for overseeing complex data flows, developing advanced reporting solutions, and driving data-driven decision-making across Business, Finance, and Leadership teams. The ideal candidate will have a deep understanding of business intelligence tools, advanced SQL skills, and the ability to translate complex data into actionable insights.
Key job responsibilities Lead the development and implementation of sophisticated reporting solutions, integrating advertising data from MADS, Hercules, Spektr with retail platform datasets to provide comprehensive business intelligence. Design and deliver high-impact reports and dashboards for Business, Finance, and Leadership teams, ensuring data accuracy, relevance, and alignment with strategic objectives. Serve as the senior point of contact for complex reporting-related queries, providing expert guidance and insights to stakeholders across the organization. Drive continuous improvement initiatives to optimize reporting processes, including the implementation of advanced automation techniques and cutting-edge BI tools. Lead the development of complex SQL queries and data models to support in-depth analysis and insight generation for business teams. Architect and implement sophisticated reporting and analytics solutions using Amazon QuickSight, Excel macros, and other advanced BI tools. Collaborate with cross-functional teams to elevate the overall data analytics capabilities of the organization.
Basic Qualifications 5+ years of Excel (including VBA, pivot tables, array functions, power pivots, etc.) and data visualization tools such as Tableau experience Bachelor's degree or equivalent Experience defining requirements and using data and metrics to draw business insights Experience with Excel Experience with SQL Proven track record of implementing large-scale process improvements through automation and advanced analytics Expert-level proficiency in SQL, including experience with complex queries and data modeling Demonstrated ability to manage multiple high-priority reporting cycles and projects simultaneously Exceptional attention to detail and ability to maintain accuracy when working with large, complex datasets
Preferred Qualifications Advanced certifications in relevant BI tools (e.g., Amazon QuickSight, Tableau, Power BI) Experience with cloud-based data warehousing solutions (e.g., Amazon Redshift, Snowflake) Proficiency in programming languages such as Python or R for data analysis and automation Knowledge of machine learning and predictive analytics techniques Experience working in e-commerce or digital advertising industries Strong presentation skills with the ability to communicate complex data insights to both technical and non-technical audiences Track record of driving data-driven decision-making at senior leadership levels
Company - ASSPL - Maharashtra - C32 Job ID: A3049704
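For a taste of the "complex SQL queries joining multiple datasets" requirement, here is a hedged sketch using Amazon's redshift_connector Python driver; the cluster endpoint, credentials, and table/column names are hypothetical.

```python
# Hedged sketch: a weekly ROAS roll-up joining two hypothetical Redshift tables.
import redshift_connector

SQL = """
SELECT a.campaign_id,
       DATE_TRUNC('week', a.event_date) AS week,
       SUM(a.spend)                     AS ad_spend,
       SUM(s.attributed_revenue)        AS revenue,
       SUM(s.attributed_revenue) / NULLIF(SUM(a.spend), 0) AS roas
FROM   ads_spend a
JOIN   attributed_sales s
       ON s.campaign_id = a.campaign_id AND s.event_date = a.event_date
GROUP  BY 1, 2
ORDER  BY 2, 5 DESC;
"""

conn = redshift_connector.connect(
    host="example-cluster.abc123.ap-south-1.redshift.amazonaws.com",
    database="analytics", user="report_user", password="***",
)
cur = conn.cursor()
cur.execute(SQL)
weekly_roas = cur.fetch_dataframe()  # pandas DataFrame for QuickSight/Excel hand-off
print(weekly_roas.head())
```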
Posted 8 hours ago
4.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Overview We are seeking a Platform Architect with expertise in Informatica PowerCenter and Informatica Intelligent Cloud Services (IICS) to design, implement, and optimize enterprise-level data integration platforms. The ideal candidate will have a strong background in ETL/ELT architecture, cloud data integration, and platform modernization, ensuring scalability, security, and performance across on-prem and cloud environments.
Responsibilities Platform Engineering & Administration Oversee installation, configuration, and optimization of PowerCenter and IICS environments. Manage platform scalability, performance tuning, and troubleshooting. Implement data governance, security, and compliance (e.g., role-based access, encryption, data lineage tracking). Optimize connectivity and integrations with various sources (databases, APIs, cloud storage, SaaS apps). Cloud & Modernization Initiatives Architect and implement IICS-based data pipelines for real-time and batch processing. Migrate existing PowerCenter workflows to IICS, leveraging serverless and cloud-native features. Ensure seamless integration with cloud platforms (AWS, Azure, GCP) and modern data lakes/warehouses (Snowflake, Redshift, BigQuery).
Qualifications 4 years of experience in data integration and ETL/ELT architecture. Expert-level knowledge of Informatica PowerCenter and IICS (Cloud Data Integration, API & Application Integration, Data Quality). Hands-on experience with cloud platforms (AWS, Azure, GCP) and modern data platforms (Snowflake, Databricks, Redshift, BigQuery). Strong SQL, database tuning, and performance optimization skills. Deep understanding of data governance, security, and compliance best practices. Experience in automation, DevOps (CI/CD), and Infrastructure-as-Code (IaC) tools for data platforms. Excellent communication, leadership, and stakeholder management skills.
Preferred Qualifications Informatica certifications (IICS, PowerCenter, Data Governance). Proficiency with PowerCenter-to-IDMC conversions. Understanding of real-time streaming (Kafka, Spark Streaming). Knowledge of API-based integration and event-driven architectures. Familiarity with machine learning and AI-driven data processing.
Posted 9 hours ago
7.0 - 10.0 years
0 - 0 Lacs
Pune, Mumbai City
Remote
Position - AWS Data Engineer Job Description: We are seeking a skilled Data Engineer with 7+ years of experience in data processing, ETL pipelines, and cloud-based data solutions. The ideal candidate will have strong expertise in AWS Glue, Redshift, S3, EMR, and Lambda, with hands-on experience using Python and PySpark for large-scale data transformations. The candidate will be responsible for designing, building, and maintaining scalable data pipelines and systems to support analytics and data-driven decision-making. Additionally, the candidate needs strong expertise in Terraform and Git-based CI/CD pipelines to support infrastructure automation and configuration management.
Key Responsibilities: ETL Development & Automation: Design and implement ETL pipelines using AWS Glue and PySpark to transform raw data into consumable formats. Automate data processing workflows using AWS Lambda and Step Functions. Data Integration & Storage: Integrate and ingest data from various sources into Amazon S3 and Redshift. Optimize Redshift for query performance and cost efficiency. Data Processing & Analytics: Use AWS EMR and PySpark for large-scale data processing and complex transformations. Build and manage data lakes on Amazon S3 for analytics use cases. Monitoring & Optimization: Monitor and troubleshoot data pipelines to ensure high availability and performance. Implement best practices for cost optimization and performance tuning in Redshift, Glue, and EMR. Terraform & Git-based Workflows: Design and implement Terraform modules to provision cloud infrastructure across AWS/Azure/GCP. Manage and optimize CI/CD pipelines using Git-based workflows (e.g., GitHub Actions, GitLab CI, Jenkins, Azure DevOps). Collaborate with developers and cloud architects to automate infrastructure provisioning and deployments. Write reusable and scalable Terraform modules following best practices and code quality standards. Maintain version control, branching strategies, and code promotion processes in Git. Collaboration: Work closely with stakeholders to understand requirements and deliver solutions. Document data workflows, designs, and processes for future reference.
Must-Have Skills: Strong proficiency in Python and PySpark for data engineering tasks. Hands-on experience with AWS Glue, Redshift, S3, and EMR. Expertise in building, deploying, and optimizing data pipelines and workflows. Solid understanding of SQL and database optimization techniques. Strong hands-on experience with Terraform, including writing and managing modules, state files, and workspaces. Proficient in CI/CD pipeline design and maintenance using tools like GitHub Actions / GitLab CI / Jenkins / Azure DevOps Pipelines. Deep understanding of Git workflows (e.g., GitFlow, trunk-based development). Experience in serverless architecture using AWS Lambda for automation and orchestration. Knowledge of data modeling, partitioning, and schema design for data lakes and warehouses.
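To ground the Glue/PySpark responsibilities above, here is a minimal, hedged sketch of a Glue job script; the bucket paths and column names are hypothetical.

```python
# Minimal AWS Glue job sketch in PySpark: CSV landing zone -> typed,
# partitioned Parquet for Athena/Redshift Spectrum.
import sys
from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext
from pyspark.sql import functions as F

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
spark = glue_context.spark_session
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Extract: raw CSV landing zone on S3 (path is hypothetical).
raw = spark.read.option("header", "true").csv("s3://example-raw/orders/")

# Transform: type the columns and keep only valid rows.
clean = (
    raw.withColumn("amount", F.col("amount").cast("double"))
       .withColumn("order_date", F.to_date("order_date"))
       .filter(F.col("amount") > 0)
)

# Load: consumable Parquet, partitioned for downstream query engines.
clean.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://example-curated/orders/"
)
job.commit()
```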
Posted 10 hours ago
10.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Title: Lead Platform Engineer – AWS Data Platform Location: Hybrid – Hyderabad, Telangana Experience: 10+ years Employment Type: Full-Time --- About the Role Infoslab is hiring on behalf of our client, a leading healthcare technology company committed to transforming healthcare through data. We are seeking a Lead Platform Engineer to architect, implement, and lead the development of a secure, scalable, and cloud-native data platform on AWS. This role combines deep technical expertise with leadership responsibilities. You will build the foundation that supports critical business intelligence, analytics, and machine learning applications across the organization. --- Key Responsibilities Architect and build a highly available, cloud-native data platform using AWS services such as S3, Glue, Redshift, Lambda, and ECS. Design reusable platform components and frameworks to support data engineering, analytics, and ML pipelines. Build and maintain CI/CD pipelines, GitOps workflows, and infrastructure-as-code using Terraform. Drive observability, operational monitoring, and incident response processes across environments. Ensure platform security, compliance (HIPAA, SOC2), and audit-readiness in partnership with InfoSec. Lead and mentor a team of platform engineers, promoting best practices in DevOps and cloud infrastructure. Collaborate with cross-functional teams to deliver reliable and scalable data platform capabilities. --- Required Skills and Experience 10+ years of experience in platform engineering, DevOps, or infrastructure roles with a data focus. 3+ years in technical leadership or platform engineering management. Deep experience with AWS services, including S3, Glue, Redshift, Lambda, ECS, and Athena. Strong hands-on experience with Python or Scala, and automation tooling. Proficient in Terraform and CI/CD tools (GitHub Actions, Jenkins, etc.). Advanced knowledge of Apache Spark for both batch and streaming workloads. Proven track record of building secure, scalable, and compliant infrastructure. Strong understanding of observability, reliability engineering, and infrastructure automation. --- Preferred Qualifications Experience with containerization and orchestration (Docker, Kubernetes). Familiarity with Data Mesh principles or domain-driven data platform design. Background in healthcare or other regulated industries. Experience integrating data platforms with BI tools like Tableau or Looker. --- Why Join Contribute to a mission-driven client transforming healthcare through intelligent data platforms. Lead high-impact platform initiatives that support diagnostics, research, and machine learning. Work with modern engineering practices including IaC, GitOps, and serverless architectures. Be part of a collaborative, hybrid work culture focused on innovation and technical excellence.
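As one concrete slice of the observability work described above, here is a hedged boto3 sketch that puts a CloudWatch alarm on a Glue job's failed-task metric; the job name, alarm name, and SNS topic ARN are hypothetical.

```python
# Hedged observability sketch: alarm on Glue failed tasks, notifying an SNS topic.
import boto3

cloudwatch = boto3.client("cloudwatch")

cloudwatch.put_metric_alarm(
    AlarmName="glue-etl-failed-tasks",          # hypothetical alarm name
    Namespace="Glue",
    MetricName="glue.driver.aggregate.numFailedTasks",
    Dimensions=[{"Name": "JobName", "Value": "nightly-etl"},   # hypothetical job
                {"Name": "JobRunId", "Value": "ALL"},
                {"Name": "Type", "Value": "count"}],
    Statistic="Sum",
    Period=300,                                  # evaluate in 5-minute windows
    EvaluationPeriods=1,
    Threshold=1,
    ComparisonOperator="GreaterThanOrEqualToThreshold",
    AlarmActions=["arn:aws:sns:ap-south-1:123456789012:platform-alerts"],
)
```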
Posted 14 hours ago
6.0 - 10.0 years
0 Lacs
Pune, Maharashtra, India
On-site
About Client: Our client is a French multinational information technology (IT) services and consulting company, headquartered in Paris, France. Founded in 1967, it has been a leader in business transformation for over 50 years, leveraging technology to address a wide range of business needs, from strategy and design to managing operations. The company is committed to unleashing human energy through technology for an inclusive and sustainable future, helping organizations accelerate their transition to a digital and sustainable world. They provide a variety of services, including consulting, technology, professional, and outsourcing services.
Job Details: Position: Data Analyst - AI & Bedrock Experience Required: 6-10 years Notice: immediate Work Location: Pune Mode Of Work: Hybrid Type of Hiring: Contract to Hire
Job Description: FAS - Data Analyst - AI & Bedrock Specialization About Us: We are seeking a highly experienced and visionary Data Analyst with a deep understanding of artificial intelligence (AI) principles and hands-on expertise with cutting-edge tools like Amazon Bedrock. This role is pivotal in transforming complex datasets into actionable insights, enabling data-driven innovation across our organization. Role Summary: The Lead Data Analyst, AI & Bedrock Specialization, will be responsible for spearheading advanced data analytics initiatives, leveraging AI and generative AI capabilities, particularly with Amazon Bedrock. With 5+ years of experience, you will lead the design, development, and implementation of sophisticated analytical models, provide strategic insights to stakeholders, and mentor a team of data professionals. This role requires a blend of strong technical skills, business acumen, and a passion for pushing the boundaries of data analysis with AI.
Key Responsibilities: • Strategic Data Analysis & Insight Generation: o Lead end-to-end data analysis projects, from defining business problems to delivering actionable insights that influence strategic decisions. o Utilize advanced statistical methods, machine learning techniques, and AI-driven approaches to uncover complex patterns and trends in large, diverse datasets. o Develop and maintain comprehensive dashboards and reports, translating complex data into clear, compelling visualizations and narratives for executive and functional teams. • AI/ML & Generative AI Implementation (Bedrock Focus): o Implement data analytical solutions leveraging Amazon Bedrock, including selecting appropriate foundation models (e.g., Amazon Titan, Anthropic Claude) for specific use cases (text generation, summarization, complex data analysis). o Design and optimize prompts for Large Language Models (LLMs) to extract meaningful insights from unstructured and semi-structured data within Bedrock. o Explore and integrate other AI/ML services (e.g., Amazon SageMaker, Amazon Q) to enhance data processing, analysis, and automation workflows. o Contribute to the development of AI-powered agents and intelligent systems for automated data analysis and anomaly detection. • Data Governance & Quality Assurance: o Ensure the accuracy, integrity, and reliability of data used for analysis. o Develop and implement robust data cleaning, validation, and transformation processes. o Establish best practices for data management, security, and governance in collaboration with data engineering teams. • Technical Leadership & Mentorship: o Evaluate and recommend new data tools, technologies, and methodologies to enhance analytical capabilities.
o Collaborate with cross-functional teams, including product, engineering, and business units, to understand requirements and deliver data-driven solutions. • Research & Innovation: o Stay abreast of the latest advancements in AI, machine learning, and data analytics trends, particularly concerning generative AI and cloud-based AI services. o Proactively identify opportunities to apply emerging technologies to solve complex business challenges. Required Skills & Qualifications: • Bachelor's or Master's degree in Computer Science, Data Science, Statistics, Mathematics, Engineering, or a related quantitative field. • 5+ years of progressive experience as a Data Analyst, Business Intelligence Analyst, or similar role, with a strong portfolio of successful data-driven projects. • Proven hands-on experience with AI/ML concepts and tools, with a specific focus on Generative AI and Large Language Models (LLMs). • Demonstrable experience with Amazon Bedrock is essential, including knowledge of its foundation models, prompt engineering, and ability to build AI-powered applications. • Expert-level proficiency in SQL for data extraction and manipulation from various databases (relational, NoSQL). • Advanced proficiency in Python (Pandas, NumPy, Scikit-learn, etc.) or R for data analysis, statistical modeling, and scripting. • Strong experience with data visualization tools such as Tableau, Power BI, Qlik Sense, or similar, with a focus on creating insightful and interactive dashboards. • Experience with cloud platforms (AWS preferred) and related data services (e.g., S3, Redshift, Glue, Athena). • Excellent analytical, problem-solving, and critical thinking skills. • Strong communication and presentation skills, with the ability to convey complex technical findings to non-technical stakeholders. • Ability to work independently and collaboratively in a fast-paced, evolving environment. Preferred Qualifications: • Experience with other generative AI frameworks or platforms (e.g., OpenAI, Google Cloud AI). • Familiarity with data warehousing concepts and ETL/ELT processes. • Knowledge of big data technologies (e.g., Spark, Hadoop). • Experience with MLOps practices for deploying and managing AI/ML models. Learn about building AI agents with Bedrock and Knowledge Bases to understand how these tools revolutionize data analysis and customer service.
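To make the Bedrock prompt-engineering work above concrete, here is a hedged, minimal sketch using boto3's bedrock-runtime client with an Anthropic Claude model; the model ID, region, and sample records are assumptions for illustration only, not a prescribed configuration.

```python
# Minimal Bedrock sketch: ask a Claude foundation model to summarize a record set.
import json
import boto3

client = boto3.client("bedrock-runtime", region_name="us-east-1")  # region assumed

records = "claim_id,status,days_open\nC-101,denied,42\nC-102,paid,7\nC-103,denied,55"
prompt = (
    "You are a data analyst. Given this CSV, summarize the key pattern "
    f"in two sentences:\n\n{records}"
)

response = client.invoke_model(
    modelId="anthropic.claude-3-haiku-20240307-v1:0",  # example model ID
    body=json.dumps({
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": 200,
        "messages": [{"role": "user", "content": prompt}],
    }),
)
result = json.loads(response["body"].read())
print(result["content"][0]["text"])
```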
Posted 15 hours ago
6.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
We are seeking a highly experienced Senior Business Systems Analyst to guide us in our quest with our global, regional, and functional commercial policy implementation, reporting & governance projects. This successful candidate will contribute by building metrics, analyzing processes, workflows, and systems with the objective of identifying opportunities for either improvement or automation. This role requires close collaboration with product, segment partners, product marketing, customer to cash, sales, marketing, technology, and finance areas. The successful candidate is required to react with speed and agility in our ever-evolving world and manage changing timelines, multiple priorities, deliverables, and uncertainty. Business partnership skills, the capability to influence, and the ability to build effective relationships across geographically dispersed teams are critical. This position resides in the Commercial Excellence organization and reports to the Manager of Commercial Policy Reporting & Governance.
About The Role In this role as a Senior Business Systems Analyst, you will: Improve, execute, and effectively communicate significant analyses that identify meaningful trends and opportunities across the business. Participate in regular meetings with stakeholders & management, assessing and addressing issues to identify and implement improvements toward efficient operations. Provide strong and timely business analytic support to business partners and various organizational stakeholders. Develop actionable road maps for improving workflows and processes. Collaborate with Project Leads, Managers, and Business partners to determine schedules and project timelines, ensuring alignment across all areas of the business. Drive commercial strategy and policy alignment with fast-changing attributes, while managing reporting, tracking and governance best practices. Identify, assess, manage, and communicate risks while laying out mitigation plans and course corrections where appropriate. Provide insightful diagnostics and actionable insights to the leadership team in a proactive manner by spotting trends, questioning data and asking questions to understand underlying drivers. Proactively identify trends for future governance & reporting needs while presenting ideas to CE Leadership for new areas of opportunity to drive value. Prepare, analyze, and summarize various weekly, monthly, and periodic operational results for use by various key stakeholders, creating reports, specifications, instructions, and flowcharts. Conduct the full lifecycle of analytics projects, including pulling, manipulating, and exporting data, from project requirements documentation to design and execution. Shift Timings: 2:00 PM to 11:00 PM (IST) Work from office for 2 days in a week (Mandatory)
About You You’re a fit for the role of Senior Business Systems Analyst, if your background includes: Bachelor’s degree required, preferably in Computer Science, Mathematics, Business Management, or Economics. 6+ years of professional experience in a similar role. Proven project management skills related to planning and overseeing projects from initial ideation through to completion. Proven ability to take complex and disparate data sets and create streamlined and efficient data lakes with connected and routinized cadence. Advanced-level skills in the following systems: Alteryx, Power BI, Snowflake, Redshift, Salesforce.com, EDW, Excel and MS PowerPoint. Familiarity with contract lifecycle management tools like Conga CLM, HighQ CLM etc.
Ability to quickly draw insights into trends in data and make recommendations to drive productivity and efficiency. Exceptional verbal, written, and visual communication skills. Experience managing multiple projects simultaneously within a matrix organization, adhering to deadlines in a fast-paced environment. Ability to deploy influencing techniques to drive cross-functional alignment and change across broad audience. Ability to be flexible with working hours to support ever-changing demands of the business. What’s in it For You? Hybrid Work Model: We’ve adopted a flexible hybrid working environment (2-3 days a week in the office depending on the role) for our office-based roles while delivering a seamless experience that is digitally and physically connected. Flexibility & Work-Life Balance: Flex My Way is a set of supportive workplace policies designed to help manage personal and professional responsibilities, whether caring for family, giving back to the community, or finding time to refresh and reset. This builds upon our flexible work arrangements, including work from anywhere for up to 8 weeks per year, empowering employees to achieve a better work-life balance. Career Development and Growth: By fostering a culture of continuous learning and skill development, we prepare our talent to tackle tomorrow’s challenges and deliver real-world solutions. Our Grow My Way programming and skills-first approach ensures you have the tools and knowledge to grow, lead, and thrive in an AI-enabled future. Industry Competitive Benefits: We offer comprehensive benefit plans to include flexible vacation, two company-wide Mental Health Days off, access to the Headspace app, retirement savings, tuition reimbursement, employee incentive programs, and resources for mental, physical, and financial wellbeing. Culture: Globally recognized, award-winning reputation for inclusion and belonging, flexibility, work-life balance, and more. We live by our values: Obsess over our Customers, Compete to Win, Challenge (Y)our Thinking, Act Fast / Learn Fast, and Stronger Together. Social Impact: Make an impact in your community with our Social Impact Institute. We offer employees two paid volunteer days off annually and opportunities to get involved with pro-bono consulting projects and Environmental, Social, and Governance (ESG) initiatives. Making a Real-World Impact: We are one of the few companies globally that helps its customers pursue justice, truth, and transparency. Together, with the professionals and institutions we serve, we help uphold the rule of law, turn the wheels of commerce, catch bad actors, report the facts, and provide trusted, unbiased information to people all over the world. About Us Thomson Reuters informs the way forward by bringing together the trusted content and technology that people and organizations need to make the right decisions. We serve professionals across legal, tax, accounting, compliance, government, and media. Our products combine highly specialized software and insights to empower professionals with the data, intelligence, and solutions needed to make informed decisions, and to help institutions in their pursuit of justice, truth, and transparency. Reuters, part of Thomson Reuters, is a world leading provider of trusted journalism and news. We are powered by the talents of 26,000 employees across more than 70 countries, where everyone has a chance to contribute and grow professionally in flexible work environments. 
At a time when objectivity, accuracy, fairness, and transparency are under attack, we consider it our duty to pursue them. Sound exciting? Join us and help shape the industries that move society forward. As a global business, we rely on the unique backgrounds, perspectives, and experiences of all employees to deliver on our business goals. To ensure we can do that, we seek talented, qualified employees in all our operations around the world regardless of race, color, sex/gender, including pregnancy, gender identity and expression, national origin, religion, sexual orientation, disability, age, marital status, citizen status, veteran status, or any other protected classification under applicable law. Thomson Reuters is proud to be an Equal Employment Opportunity Employer providing a drug-free workplace. We also make reasonable accommodations for qualified individuals with disabilities and for sincerely held religious beliefs in accordance with applicable law. More information on requesting an accommodation here. Learn more on how to protect yourself from fraudulent job postings here. More information about Thomson Reuters can be found on thomsonreuters.com.
Posted 15 hours ago
5.0 - 9.0 years
0 Lacs
Pune, Maharashtra
On-site
As a Solution Designer (Cloud Data Integration) at Barclays within the Customer Digital and Data Business Area, you will play a vital role in supporting the successful delivery of location strategy projects. Your responsibilities will include ensuring projects are delivered according to plan, budget, quality standards, and governance protocols. By spearheading the evolution of the digital landscape, you will drive innovation and excellence, utilizing cutting-edge technology to enhance our digital offerings and deliver unparalleled customer experiences.
To excel in this role, you should possess hands-on experience working with large-scale data platforms and developing cloud solutions within the AWS data platform. Your track record should demonstrate a history of driving business success through your expertise in AWS, distributed computing paradigms, and designing data ingestion programs using technologies like Glue, Lambda, S3, Redshift, Snowflake, Apache Kafka, and Spark Streaming. Proficiency in Python, PySpark, SQL, and database management systems is essential, along with a strong understanding of data governance principles and tools. Additionally, valued skills for this role may include experience in multi-cloud solution design, data modeling, data governance frameworks, agile methodologies, project management tools, business analysis, and product ownership within a data analytics context. A basic understanding of the banking domain, along with excellent analytical, communication, and interpersonal skills, will be crucial for success in this position.
Your main purpose as a Solution Designer will involve designing, developing, and implementing solutions to complex business problems by collaborating with stakeholders to understand their needs and requirements. You will be accountable for designing solutions that balance technology risks against business delivery, driving consistency and aligning with modern software engineering practices and automated delivery tooling. Furthermore, you will be expected to provide impact assessments, fault finding support, and architecture inputs required to comply with the bank’s governance processes.
As an Assistant Vice President in this role, you will be responsible for advising on decision-making processes, contributing to policy development, and ensuring operational effectiveness. If the position involves leadership responsibilities, you will lead a team to deliver impactful work and set objectives for employees while demonstrating leadership behaviours focused on listening, inspiring, aligning, and developing others. Alternatively, as an individual contributor, you will lead collaborative assignments, guide team members, identify new directions for projects, consult on complex issues, and collaborate with other areas to support business activities.
All colleagues at Barclays are expected to embody the Barclays Values of Respect, Integrity, Service, Excellence, and Stewardship, as well as the Barclays Mindset to Empower, Challenge, and Drive. By demonstrating these values and mindset, you will contribute to creating an environment where colleagues can thrive and deliver consistently excellent results.
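As an illustration of the Kafka plus Spark Streaming ingestion this role mentions, here is a minimal, hedged PySpark Structured Streaming sketch; the broker address, topic, schema, and S3 paths are hypothetical, and the spark-sql-kafka connector package is assumed to be on the classpath.

```python
# Hedged Structured Streaming sketch: Kafka JSON events -> Parquet on S3.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import DoubleType, StringType, StructField, StructType

spark = SparkSession.builder.appName("kafka-ingest").getOrCreate()

schema = StructType([
    StructField("txn_id", StringType()),
    StructField("account", StringType()),
    StructField("amount", DoubleType()),
])

# Read the raw Kafka stream and parse the JSON payload.
events = (
    spark.readStream.format("kafka")
         .option("kafka.bootstrap.servers", "broker1:9092")  # hypothetical broker
         .option("subscribe", "transactions")                # hypothetical topic
         .load()
         .select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
         .select("e.*")
)

# Land parsed events as Parquet with checkpointing for fault tolerance.
query = (
    events.writeStream.format("parquet")
          .option("path", "s3://example-landing/transactions/")
          .option("checkpointLocation", "s3://example-landing/_checkpoints/txn/")
          .start()
)
query.awaitTermination()
```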
Posted 16 hours ago
7.0 years
0 Lacs
Itanagar, Arunachal Pradesh, India
On-site
Job Overview We are seeking a highly skilled and experienced Lead Data Engineer (AWS) to spearhead the design, development, and optimization of our cloud-based data infrastructure. As a technical leader, you will drive scalable data solutions using AWS services and modern data engineering tools, ensuring robust data pipelines and architectures for real-time and batch data processing. The ideal candidate is a hands-on technologist with a deep understanding of distributed data systems, cloud-native data services, and team leadership in Agile environments.
Responsibilities: Design, build, and maintain scalable, fault-tolerant, and secure data pipelines using AWS-native services (e.g., Glue, EMR, Lambda, S3, Redshift, Athena, Kinesis). Lead end-to-end implementation of data architecture strategies including ingestion, storage, transformation, and data governance. Collaborate with data scientists, analysts, and application developers to understand data requirements and deliver optimal solutions. Ensure best practices for data quality, data cataloging, lineage tracking, and metadata management using tools like AWS Glue Data Catalog or Apache Atlas. Optimize data pipelines for performance, scalability, and cost-efficiency across structured and unstructured data sources. Mentor and lead a team of data engineers, providing technical guidance, code reviews, and architecture recommendations. Implement data modeling techniques (OLTP/OLAP), partitioning strategies, and data warehousing best practices. Maintain CI/CD pipelines for data infrastructure using tools such as AWS CodePipeline and Git. Monitor production systems and lead incident response and root cause analysis for data infrastructure issues. Drive innovation by evaluating emerging technologies and proposing improvements to the existing data platform.
Skills & Qualifications: Minimum 7 years of experience in data engineering with at least 3+ years in a lead or senior engineering role. Strong hands-on experience with AWS data services: S3, Redshift, Glue, Lambda, EMR, Athena, Kinesis, RDS, DynamoDB. Advanced proficiency in Python/Scala/Java for ETL development and data transformation logic. Deep understanding of distributed data processing frameworks (e.g., Apache Spark, Hadoop). Solid grasp of SQL and experience with performance tuning in large-scale environments. Experience implementing data lakes, lakehouse architecture, and data warehousing solutions on cloud. Knowledge of streaming data pipelines using Kafka, Kinesis, or AWS MSK. Proficiency with infrastructure-as-code (IaC) using Terraform or AWS CloudFormation. Experience with DevOps practices and tools such as Docker, Git, Jenkins, and monitoring tools (CloudWatch, Prometheus, Grafana). Expertise in data governance, security, and compliance in cloud environments. (ref:hirist.tech)
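As a small illustration of the serverless orchestration pattern the posting describes (Lambda triggering downstream processing), here is a hedged sketch of a Lambda handler that starts a Glue job when an object lands in S3; the job name and argument keys are hypothetical.

```python
# Minimal event-driven orchestration sketch: S3 put event -> Glue job run.
import boto3

glue = boto3.client("glue")

def handler(event, context):
    # S3 notification events carry bucket/key under Records; forward them
    # to the Glue job as job arguments.
    record = event["Records"][0]["s3"]
    run = glue.start_job_run(
        JobName="curate-orders",  # hypothetical Glue job
        Arguments={
            "--source_bucket": record["bucket"]["name"],
            "--source_key": record["object"]["key"],
        },
    )
    return {"JobRunId": run["JobRunId"]}
```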
Posted 17 hours ago
5.0 - 9.0 years
0 Lacs
Karnataka
On-site
As an ETL Testing professional at CGI, you will be part of a dynamic team that is committed to helping clients succeed in their IT and business process services. With 5-8 years of experience, you will be based in Chennai or Bangalore, working full-time from the office on a Monday to Friday schedule from 12:30 PM to 9:30 PM. Your role will involve utilizing your analytics skills to understand requirements, develop test cases, and manage data effectively. You will need strong SQL skills and hands-on experience testing data pipelines built using Glue, S3, Redshift, and Lambda. Collaboration with developers to build automated testing and a solid understanding of data concepts like data lineage, data integrity, and quality are essential for success in this role. Previous experience in testing financial data is considered a plus. You will be expected to demonstrate expert-level analytical and problem-solving skills, flexibility in testing approaches, and awareness of Quality Management tools and techniques. Ensuring best practice quality assurance of deliverables, working within agreed architectural processes, data, and organizational frameworks will be crucial. Effective communication skills, proficiency in English (written/verbal), and local language as necessary are required. An open-minded approach to sharing information, transferring knowledge, and supporting team members will be key to your success. Must-have skills for this role include ETL and SQL proficiency, hands-on testing of data pipelines, experience with Glue, S3, Redshift, data lineage, and data integrity. Additionally, experience testing financial data will be advantageous. At CGI, we value ownership, teamwork, respect, and belonging. As a CGI Partner, you will have the opportunity to contribute meaningfully from day one, shaping the company's strategy and direction. Your work will create value through innovative solutions, collaboration with colleagues and clients, and access to global capabilities. You will have the chance to grow and develop your skills within a supportive environment that prioritizes your well-being and professional growth. Join CGI, one of the largest IT and business consulting services firms globally, and together, let's turn meaningful insights into action.,
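For flavor, here is a hedged sketch of the kind of automated pipeline check this role describes: pytest reconciliation tests between a staging layer and the Redshift target. The source_conn/target_conn fixtures and table names are hypothetical stand-ins for DB-API connections you would define in your own conftest.py.

```python
# Hedged data-quality sketch: row-count and null-key reconciliation with pytest.
import pytest

def fetch_scalar(conn, sql: str):
    """Run a single-value query on a DB-API connection."""
    cur = conn.cursor()
    cur.execute(sql)
    return cur.fetchone()[0]

@pytest.mark.integration
def test_row_counts_match(source_conn, target_conn):
    src = fetch_scalar(source_conn, "SELECT COUNT(*) FROM staging.orders")
    tgt = fetch_scalar(target_conn, "SELECT COUNT(*) FROM mart.orders")
    assert src == tgt, f"row count drift: source={src} target={tgt}"

@pytest.mark.integration
def test_no_null_business_keys(target_conn):
    nulls = fetch_scalar(
        target_conn, "SELECT COUNT(*) FROM mart.orders WHERE order_id IS NULL"
    )
    assert nulls == 0, f"{nulls} rows are missing order_id"
```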
Posted 17 hours ago
5.0 - 9.0 years
0 Lacs
Salem, Tamil Nadu
On-site
This is a key position that will play a pivotal role in creating data-driven technology solutions to establish our client as a leader in healthcare, financial, and clinical administration. As the Lead Data Scientist, you will be instrumental in building and implementing machine learning models and predictive analytics solutions that will spearhead the new era of AI-driven innovation in the healthcare industry. Your responsibilities will involve developing and implementing a variety of ML/AI products, from conceptualization to production, to help the organization gain a competitive edge in the market. Working closely with the Director of Data Science, you will operate at the crossroads of healthcare, finance, and cutting-edge data science to tackle some of the most intricate challenges faced by the industry. This role presents a unique opportunity within VHT's Product Transformation division to create pioneering machine learning capabilities from scratch. You will have the chance to shape the future of VHT's data science & analytics foundation, utilizing state-of-the-art tools and methodologies within a collaborative and innovation-focused environment. Key Responsibilities: - Lead the development of predictive machine learning models for Revenue Cycle Management analytics, focusing on areas such as: - Claim Denials Prediction: identifying high-risk claims before submission - Cash Flow Forecasting: predicting revenue timing and patterns - Patient-Related Models: enhancing patient financial experience and outcomes - Claim Processing Time Prediction: optimizing workflow and resource allocation - Explore emerging areas and integration opportunities, e.g., denial prediction + appeal success probability or prior authorization prediction + approval likelihood models. VHT Technical Environment: - Cloud Platform: AWS (SageMaker, S3, Redshift, EC2) - Development Tools: Jupyter Notebooks, Git, Docker - Programming: Python, SQL, R (optional) - ML/AI Stack: Scikit-learn, TensorFlow/PyTorch, MLflow, Airflow - Data Processing: Spark, Pandas, NumPy - Visualization: Matplotlib, Seaborn, Plotly, Tableau Required Qualifications: - Advanced degree in Data Science, Statistics, Computer Science, Mathematics, or a related quantitative field - 5+ years of hands-on data science experience with a proven track record of deploying ML models to production - Expert-level proficiency in SQL and Python, with extensive experience using standard Python machine learning libraries (scikit-learn, pandas, numpy, matplotlib, seaborn, etc.) - Cloud platform experience, preferably AWS, with hands-on knowledge of SageMaker, S3, Redshift, and Jupyter Notebook workbenches (other cloud environments acceptable) - Strong statistical modeling and machine learning expertise across supervised and unsupervised learning techniques - Experience with model deployment, monitoring, and MLOps practices - Excellent communication skills with the ability to translate complex technical concepts to non-technical stakeholders Preferred Qualifications: - US Healthcare industry experience, particularly in Health Insurance and/or Medical Revenue Cycle Management - Experience with healthcare data standards (HL7, FHIR, X12 EDI) - Knowledge of healthcare regulations (HIPAA, compliance requirements) - Experience with deep learning frameworks (TensorFlow, PyTorch) - Familiarity with real-time streaming data processing - Previous leadership or mentoring experience,
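To make the claim-denials use case above concrete, here is an illustrative scikit-learn sketch trained on synthetic data; the features, labels, and threshold are invented for demonstration and are not VHT's actual model.

```python
# Illustrative denial-prediction sketch on synthetic data (not a real model).
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 5_000
# Hypothetical features: payer id, procedure risk score, days to submission, prior denials.
X = np.column_stack([
    rng.integers(0, 20, n),   # payer_id
    rng.random(n),            # procedure_risk_score
    rng.integers(0, 60, n),   # days_to_submission
    rng.poisson(0.5, n),      # prior_denials
])
# Synthetic label: denial likelihood driven by the risk features plus noise.
y = (0.3 * X[:, 1] + 0.01 * X[:, 2] + 0.2 * X[:, 3] + rng.normal(0, 0.2, n)) > 0.45

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
model = GradientBoostingClassifier().fit(X_tr, y_tr)
print("AUC:", roc_auc_score(y_te, model.predict_proba(X_te)[:, 1]))
```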
Posted 18 hours ago
7.0 - 12.0 years
0 Lacs
Hyderabad, Telangana
On-site
As a Lead Data Engineer with 7-12 years of experience, you will be an integral part of our team, contributing significantly to the design, development, and maintenance of our data infrastructure. Your primary responsibilities will revolve around creating and managing robust data architectures, ETL processes, and data warehouses, and utilizing big data and cloud technologies to support our business intelligence and analytics needs. You will lead the design and implementation of data architectures that facilitate data warehousing, integration, and analytics platforms. Developing and optimizing ETL pipelines will be a key aspect of your role, ensuring efficient processing of large datasets and implementing data transformation and cleansing processes to maintain data quality. Your expertise will be crucial in building and maintaining scalable data warehouse solutions using technologies such as Snowflake, Databricks, or Redshift. Additionally, you will leverage AWS Glue and PySpark for large-scale data processing, manage data pipelines with Apache Airflow, and utilize cloud platforms like AWS, Azure, and GCP for data storage, processing, and analytics. Establishing data governance and security best practices, ensuring data integrity, accuracy, and availability, and implementing monitoring and alerting systems are vital components of your responsibilities. Collaborating closely with stakeholders, mentoring junior engineers, and leading data-related projects will also be part of your role. Furthermore, your technical skills should include proficiency in ETL tools like Informatica PowerCenter, Python, PySpark, SQL, RDBMS platforms, and data warehousing concepts. Soft skills such as excellent communication, leadership, problem-solving, and the ability to manage multiple projects effectively will be essential for success in this role. Preferred qualifications include experience with machine learning workflows, certification in relevant data engineering technologies, and familiarity with Agile methodologies and DevOps practices. Location: Hyderabad Employment Type: Full-time
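Since the role calls for managing pipelines with Apache Airflow, here is a minimal Airflow 2.x DAG sketch; the DAG id, schedule, and task callables are placeholders.

```python
# Minimal Airflow 2.x DAG sketch (the `schedule` argument needs Airflow >= 2.4;
# use `schedule_interval` on older versions). Task bodies are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull raw data from source systems")

def transform():
    print("clean and reshape the extracted data")

def load():
    print("write curated data to the warehouse")

with DAG(
    dag_id="nightly_etl",
    start_date=datetime(2024, 1, 1),
    schedule="0 2 * * *",  # 02:00 daily
    catchup=False,
) as dag:
    t1 = PythonOperator(task_id="extract", python_callable=extract)
    t2 = PythonOperator(task_id="transform", python_callable=transform)
    t3 = PythonOperator(task_id="load", python_callable=load)
    t1 >> t2 >> t3  # linear extract -> transform -> load dependency
```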
Posted 20 hours ago
The job market for Redshift professionals in India is growing rapidly as more companies adopt cloud data warehousing solutions. Redshift, a powerful data warehouse service provided by Amazon Web Services, is in high demand due to its scalability, performance, and cost-effectiveness. Job seekers with expertise in Redshift can find a plethora of opportunities in various industries across the country.
The average salary range for Redshift professionals in India varies based on experience and location. Entry-level positions can expect a salary in the range of INR 6-10 lakhs per annum, while experienced professionals can earn upwards of INR 20 lakhs per annum.
In the field of Redshift, a typical career path may include roles such as:
- Junior Developer
- Data Engineer
- Senior Data Engineer
- Tech Lead
- Data Architect
Apart from expertise in Redshift, proficiency in the following skills can be beneficial:
- SQL
- ETL tools
- Data modeling
- Cloud computing (AWS)
- Python/R programming
As the demand for Redshift professionals continues to rise in India, job seekers should focus on honing their skills and knowledge in this area to stay competitive in the job market. By preparing thoroughly and showcasing their expertise, candidates can secure rewarding opportunities in this fast-growing field. Good luck with your job search!