5.0 - 10.0 years
20 - 30 Lacs
Hyderabad
Remote
Hiring for a TOP MNC for a Data Modeler position (long-term contract, 2+ years).

The Data Modeler designs and implements data models for Microsoft Fabric and Power BI, supporting the migration from Oracle/Informatica. This offshore role ensures optimized data structures for performance and reporting needs. The successful candidate will bring expertise in data modeling and a collaborative approach.

Responsibilities:
• Develop conceptual, logical, and physical data models for Microsoft Fabric and Power BI solutions.
• Implement data models for relational, dimensional, and data lake environments on target platforms.
• Collaborate with the Offshore Data Engineer and Onsite Data Modernization Architect to ensure model alignment.
• Define and govern data modeling standards, tools, and best practices.
• Optimize data structures for query performance and scalability.
• Provide updates on modeling progress and dependencies to the Offshore Project Manager.

Skills:
• Bachelor's or master's degree in computer science, data science, or a related field.
• 5+ years of data modeling experience with relational and NoSQL platforms.
• Proficiency with modeling tools (e.g., Erwin, ER/Studio) and SQL.
• Experience with Microsoft Fabric, data lakes, and BI data structures.
• Strong analytical and communication skills for team collaboration.
• Attention to detail with a focus on performance and consistency.
• Management, communication, and presentation skills.
Posted 3 weeks ago
3.0 - 5.0 years
15 - 27 Lacs
Bengaluru
Work from Office
Job Summary: The NetApp Keystone team is responsible for cutting-edge technologies that enable NetApp's pay-as-you-go offering. Keystone helps customers manage data on premises or in the cloud, with invoices charged on a subscription basis. As an engineer in NetApp's Keystone organization, you will execute our most challenging and complex projects. You will be responsible for decomposing complex product requirements into simple solutions, understanding system interdependencies and limitations, and applying engineering best practices.

Job Requirements:
• Strong knowledge of the Python programming language, its paradigms, constructs, and idioms
• Bachelor's/master's degree in computer science, information technology, or engineering
• Knowledge of various Python frameworks and tools
• 2+ years of experience working with the Python programming language
• Strong written and verbal communication skills with proven fluency in English
• Proficiency in writing code for both backend and frontend
• Familiarity with database technologies such as NoSQL, Prometheus, and data lakes
• Hands-on experience with version control tools such as Git
• Passion for learning new tools, languages, philosophies, and workflows
• Experience working with generated code and code generation techniques
• Knowledge of software development methodologies: SCRUM/Agile/Lean
• Knowledge of software deployment: Docker/Kubernetes
• Knowledge of software team tools: Git/JIRA/CI-CD

Education: Minimum of 2 to 4 years of experience required, with a B.Tech or M.Tech background.
Posted 3 weeks ago
5.0 - 8.0 years
25 - 35 Lacs
Gurugram, Bengaluru
Hybrid
Role & responsibilities:
• Work with data product managers, analysts, and data scientists to architect, build, and maintain data processing pipelines in SQL or Python (a minimal pipeline sketch follows this posting).
• Build and maintain a data warehouse / data lakehouse for analytics, reporting, and ML predictions.
• Implement DataOps and related DevOps practices focused on creating ETL pipelines for data analytics/reporting, and ELT pipelines for model training.
• Support, optimize, and transition our current processes to ensure well-architected implementations and best practices.
• Work in an agile environment within a collaborative agile product team using Kanban.
• Collaborate across departments, working closely with data science teams and with business (economists/data) analysts to refine their data requirements for various initiatives and data consumption needs.
• Educate and train colleagues such as data scientists, analysts, and stakeholders in data pipelining and preparation techniques, making it easier for them to integrate and consume the data they need for their own use cases.
• Help ensure compliance and governance during data use, so that data users and consumers use the data provisioned to them responsibly through data governance and compliance initiatives.
• Become a data and analytics evangelist: promote the available data and analytics capabilities and expertise to business unit leaders, and educate them in leveraging these.

Preferred candidate profile: What you'll need to be successful
• 8+ years of professional experience with data processing environments used in large-scale digital applications.
• Extensive experience programming in Python, Spark (Spark SQL), and SQL.
• Experience with warehouse technologies such as Snowflake, and with data modelling, lineage, and data governance tools such as Alation.
• Professional experience designing, building, and managing bespoke data pipelines (including ETL, ELT, and lambda architectures) using technologies such as Apache Airflow, Snowflake, Amazon Athena, AWS Glue, Amazon EMR, or equivalents.
• Strong fundamental technical expertise in cloud-native technologies, such as serverless functions, API gateways, relational and NoSQL databases, and caching.
• Experience leading/mentoring data engineering teams.
• Experience working in teams with data scientists and ML engineers, building automated pipelines for data pre-processing and feature extraction.
• An advanced degree in software/data engineering, computer/information science, or a related quantitative field, or equivalent work experience.
• Strong verbal and written communication skills and ability to work well with a wide range of stakeholders.
• Strong ownership; scrappy and biased for action.

Perks and benefits
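To make the pipeline work above concrete, here is a minimal sketch of a daily ETL DAG in Apache Airflow, one of the orchestration tools this posting names. The DAG id, schedule, and the three placeholder task functions are illustrative assumptions, not anything specified by the employer.

```python
# Minimal daily ETL DAG sketch (Airflow 2.x). All names are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Pull the day's raw records from a source system (placeholder logic).
    print("extracting rows for", context["ds"])


def transform(**context):
    # Clean and conform the extracted records (placeholder logic).
    print("transforming rows for", context["ds"])


def load(**context):
    # Write conformed records to the warehouse/lakehouse (placeholder logic).
    print("loading rows for", context["ds"])


with DAG(
    dag_id="daily_etl_example",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    t_extract >> t_transform >> t_load
```

The `>>` operator simply declares task ordering; in a real pipeline each callable would hold warehouse-specific logic against the tools the posting names (e.g., Snowflake, Athena, or Glue).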
Posted 3 weeks ago
5.0 - 7.0 years
15 - 25 Lacs
Chennai
Work from Office
Job Summary: We are seeking a skilled Big Data Tester & Developer to design, develop, and validate data pipelines and applications on large-scale data platforms. You will work on data ingestion, transformation, and testing workflows using tools from the Hadoop ecosystem and modern data engineering stacks.

Experience: 6-12 years

Key Responsibilities:
• Develop and test Big Data pipelines using Spark, Hive, Hadoop, and Kafka
• Write and optimize PySpark/Scala code for data processing
• Design test cases for data validation, quality, and integrity (a small validation sketch follows this posting)
• Automate testing using Python/Java and tools like Apache NiFi, Airflow, or dbt
• Collaborate with data engineers, analysts, and QA teams

Key Skills:
• Strong hands-on experience in Big Data tools: Spark, Hive, HDFS, Kafka
• Proficient in PySpark, Scala, or Java
• Experience in data testing, ETL validation, and data quality checks
• Familiarity with SQL, NoSQL, and data lakes
• Knowledge of CI/CD, Git, and automation frameworks

We are also looking for a skilled PostgreSQL Developer/DBA to design, implement, optimize, and maintain our PostgreSQL database systems. You will work closely with developers and data teams to ensure high performance, scalability, and data integrity.

Experience: 6 to 12 years

Key Responsibilities:
• Develop complex SQL queries, stored procedures, and functions
• Optimize query performance and database indexing
• Manage backups, replication, and security
• Monitor and tune database performance
• Support schema design and data migrations

Key Skills:
• Strong hands-on experience with PostgreSQL
• Proficient in SQL and PL/pgSQL scripting
• Experience in performance tuning, query optimization, and indexing
• Familiarity with logical replication, partitioning, and extensions
• Exposure to tools like pgAdmin, psql, or PgBouncer
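Since the first role centers on data validation and ETL testing, here is a small hedged PySpark sketch of the kinds of checks the responsibilities describe: row-count reconciliation, null checks on key columns, and duplicate detection. The bucket paths and column names are hypothetical.

```python
# Hedged PySpark data-validation sketch; paths and columns are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("etl-validation").getOrCreate()

source = spark.read.parquet("s3://example-bucket/raw/orders/")
target = spark.read.parquet("s3://example-bucket/curated/orders/")

# 1. Row-count reconciliation between source and target.
assert source.count() == target.count(), "row counts diverge after ETL"

# 2. Critical key columns must not contain nulls.
nulls = target.filter(F.col("order_id").isNull() | F.col("order_date").isNull())
assert nulls.count() == 0, "null keys found in target"

# 3. The primary key must be unique in the curated table.
dupes = target.groupBy("order_id").count().filter(F.col("count") > 1)
assert dupes.count() == 0, "duplicate order_id values in target"
```

In practice, checks like these would usually be wrapped in pytest cases or an automation framework (the posting mentions Python/Java automation with Airflow or dbt) rather than bare asserts.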
Posted 3 weeks ago
5.0 - 10.0 years
10 - 15 Lacs
Pune, Bengaluru, Mumbai (All Areas)
Hybrid
Designation: Azure Data Engineer
Experience: 5+ Years
Location: Chennai, Bangalore, Pune, Mumbai
Notice Period: Immediate joiners / serving notice period
Shift Timing: 3:30 PM IST to 12:30 AM IST

Job Description, Azure Data Engineer:
• Must have: Azure Databricks, Azure Data Factory, and Spark SQL, with analytical knowledge
• 6-7 years of development experience in data engineering
• Strong experience in Spark
• Understand complex data systems by working closely with engineering and product teams
• Develop scalable and maintainable applications to extract, transform, and load data in various formats to SQL Server, Hadoop Data Lake, or other data storage locations

Sincerely,
Sonia
HR Recruiter, Talent Sketchers
Posted 3 weeks ago
5.0 - 9.0 years
20 - 30 Lacs
Pune
Hybrid
Job Summary : We are looking for a highly skilled AWS Data Engineer with over 5 years of experience in designing, developing, and maintaining scalable data pipelines on AWS. The ideal candidate will be proficient in data engineering best practices and cloud-native technologies, with hands-on experience in building ETL/ELT pipelines, working with large datasets, and optimizing data architecture for analytics and business intelligence. Key Responsibilities : Design, build, and maintain scalable and robust data pipelines and ETL processes using AWS services (e.g., Glue, Lambda, EMR, Redshift, S3, Athena). Collaborate with data analysts, data scientists, and stakeholders to understand data requirements and deliver high-quality solutions. Implement data lake and data warehouse architectures, ensuring data governance, data quality, and compliance. Optimize data pipelines for performance, reliability, scalability, and cost. Automate data ingestion and transformation workflows using Python, PySpark, or Scala. Manage and monitor data infrastructure including logging, error handling, alerting, and performance metrics. Leverage infrastructure-as-code tools like Terraform or AWS CloudFormation for infrastructure deployment. Ensure security best practices are implemented for data access and storage (IAM, KMS, encryption, etc.). Document data processes, architectures, and standards. Required Qualifications : Bachelors or Master’s degree in Computer Science, Information Systems, or a related field. Minimum 5 years of experience as a Data Engineer with a focus on AWS cloud services. Strong experience in building ETL/ELT pipelines using AWS Glue, EMR, Lambda , and Step Functions . Proficiency in SQL , Python , PySpark , and data modeling techniques. Experience working with data lakes (S3) and data warehouses (Redshift, Snowflake, etc.) . Experience with Athena , Kinesis , Kafka , or similar streaming data tools is a plus. Familiarity with DevOps and CI/CD processes, using tools like Git , Jenkins , or GitHub Actions . Understanding of data privacy, governance, and compliance standards such as GDPR, HIPAA, etc. Strong problem-solving and analytical skills, with the ability to work in a fast-paced environment.
Posted 4 weeks ago
9.0 - 12.0 years
25 - 40 Lacs
Hyderabad
Work from Office
Job Description: GCP Cloud Architect

Opportunity: We are seeking a highly skilled and experienced GCP Cloud Architect to join our dynamic technology team. You will play a crucial role in designing, implementing, and managing our Google Cloud Platform (GCP) infrastructure, with a primary focus on building a robust and scalable Data Lake in BigQuery. You will be instrumental in ensuring the reliability, security, and performance of our cloud environment, supporting critical healthcare data initiatives. This role requires strong technical expertise in GCP, excellent problem-solving abilities, and a passion for leveraging cloud technologies to drive impactful solutions within the healthcare domain.

Responsibilities:

Cloud Architecture & Design:
• Design and architect scalable, secure, and cost-effective GCP solutions, with a strong emphasis on BigQuery for our Data Lake (a hedged load sketch follows this posting).
• Define and implement best practices for GCP infrastructure management, security, networking, and data governance.
• Develop and maintain comprehensive architectural diagrams, documentation, and standards.
• Collaborate with data engineers, data scientists, and application development teams to understand their requirements and translate them into robust cloud solutions.
• Evaluate and recommend new GCP services and technologies to optimize our cloud environment.
• Understand and implement the fundamentals of GCP, including resource hierarchy, projects, organizations, and billing.

GCP Infrastructure Management:
• Manage and maintain our existing GCP infrastructure, ensuring high availability, performance, and security.
• Implement and manage infrastructure-as-code (IaC) using tools like Terraform or Cloud Deployment Manager.
• Monitor and troubleshoot infrastructure issues, proactively identifying and resolving potential problems.
• Implement and manage backup and disaster recovery strategies for our GCP environment.
• Optimize cloud costs and resource utilization, including BigQuery slot management.

Collaboration & Communication:
• Work closely with cross-functional teams, including data engineering, data science, application development, security, and compliance.
• Communicate technical concepts and solutions effectively to both technical and non-technical stakeholders.
• Provide guidance and mentorship to junior team members.
• Participate in the on-call rotation as needed.
• Develop and maintain thorough and reliable documentation of all cloud infrastructure processes, configurations, and security protocols.

Qualifications:
• Bachelor's degree in Computer Science, Engineering, or a related field.
• Minimum of 5-8 years of experience in designing, implementing, and managing cloud infrastructure, with a strong focus on Google Cloud Platform (GCP).
• Proven experience architecting and implementing Data Lakes on GCP, specifically using BigQuery.
• Hands-on experience with ETL/ELT processes and tools, with strong proficiency in Google Cloud Composer (Apache Airflow).
• Solid understanding of GCP services such as Compute Engine, Cloud Storage, Networking (VPC, firewall rules, Cloud DNS), IAM, Cloud Monitoring, and Cloud Logging.
• Experience with infrastructure-as-code (IaC) tools like Terraform or Cloud Deployment Manager.
• Strong understanding of security best practices for cloud environments, including identity and access management, data encryption, and network security.
• Excellent problem-solving, analytical, and troubleshooting skills.
• Strong communication, collaboration, and interpersonal skills.

Bonus Points:
• Experience with Apigee for API management.
• Experience with containerization technologies like Docker and orchestration platforms like Cloud Run.
• Experience with Vertex AI for machine learning workflows on GCP.
• Familiarity with GCP Healthcare products and solutions (e.g., Cloud Healthcare API).
• Knowledge of healthcare data standards and regulations (e.g., HIPAA, HL7, FHIR).
• GCP Professional Cloud Architect certification.
• Experience with scripting languages (e.g., Python, Bash).
• Experience with Looker.
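Since the role centers on a BigQuery-based Data Lake, here is a hedged sketch of one routine building block: loading Parquet files from Cloud Storage into a BigQuery table with the Python client. The project, dataset, and bucket names are invented for the example.

```python
# Hedged BigQuery load sketch; project, dataset, and URI are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")

table_id = "example-project.raw_lake.orders"
job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.PARQUET,
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
)

# Kick off the load from Cloud Storage and block until it finishes.
load_job = client.load_table_from_uri(
    "gs://example-landing-bucket/orders/*.parquet",
    table_id,
    job_config=job_config,
)
load_job.result()

table = client.get_table(table_id)
print(f"{table.num_rows} rows now in {table_id}")
```

In production, the posting implies a step like this would be orchestrated by Cloud Composer (Apache Airflow) rather than run ad hoc.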
Posted 4 weeks ago
5.0 - 8.0 years
9 - 13 Lacs
Hyderabad
Work from Office
Join Amgen’s Mission of Serving Patients

At Amgen, if you feel like you’re part of something bigger, it’s because you are. Our shared mission—to serve patients living with serious illnesses—drives all that we do. Since 1980, we’ve helped pioneer the world of biotech in our fight against the world’s toughest diseases. With our focus on four therapeutic areas—Oncology, Inflammation, General Medicine, and Rare Disease—we reach millions of patients each year. As a member of the Amgen team, you’ll help make a lasting impact on the lives of patients as we research, manufacture, and deliver innovative medicines to help people live longer, fuller, happier lives. Our award-winning culture is collaborative, innovative, and science-based. If you have a passion for challenges and the opportunities that lie within them, you’ll thrive as part of the Amgen team. Join us and transform the lives of patients while transforming your career.

What you will do: Let’s do this. Let’s change the world. In this vital role, as an expert IS Architect you will lead the design and implementation of integration frameworks for pharmacovigilance (PV) systems spanning both SaaS and internally hosted platforms. This role focuses on building secure, compliant, and scalable architectures to ensure seamless data flow between safety databases, external systems, and analytics platforms, without direct access to backend databases. The ideal candidate will work closely with PV system collaborators, SaaS vendors, and internal IT teams to deliver robust and efficient solutions.

Roles & Responsibilities:
• Design hybrid integration architectures to manage data flows between SaaS-based PV systems, internally hosted systems, and platforms.
• Implement middleware solutions to bridge on-premise and cloud environments, applying an API-first integration design pattern and establishing secure data exchange mechanisms to ensure data consistency and compliance.
• Work with SaaS providers and internal IT teams to define the integration approach for Extract-Transform-Load (ETL), event-driven architecture, and batch processing.
• Design and maintain end-to-end data flow diagrams and blueprints that consider the unique challenges of hybrid environments.
• Define and enforce data governance frameworks to maintain data quality, integrity, and traceability across integrated systems.
• Lead all aspects of data lifecycle management for both cloud and internally hosted systems to ensure consistency and compliance.
• Act as the main point of contact between pharmacovigilance teams, SaaS vendors, internal IT staff, and other parties to align technical solutions with business goals.
• Ensure alignment with the delivery and platform teams so that applications follow approved Amgen architectural and development guidelines as well as data/software standards.
• Collaborate with analytics teams to ensure timely access to PV data for signal detection, trending, and regulatory reporting.
• Continuously evaluate and improve integration frameworks to adapt to evolving PV requirements, data volumes, and business needs.
• Provide technical guidance and mentorship to junior developers.
Basic Qualifications:
• Master’s degree with 4 to 6 years of experience in Computer Science, software development, or a related field; OR
• Bachelor’s degree with 6 to 8 years of experience in Computer Science, software development, or a related field; OR
• Diploma with 10 to 12 years of experience in Computer Science, software development, or a related field

Must-Have Skills:
• Demonstrable experience architecting data pipelines and/or integrations across a technology landscape (SaaS, data lake, internally hosted systems).
• Experience with API integrations (such as MuleSoft) and ETL tools such as the Informatica platform, Snowflake, or Databricks.
• Strong problem-solving skills, particularly in hybrid system integrations.
• Superb communication and stakeholder leadership skills; ability to explain technical concepts to non-technical clients.
• Ability to balance technical solutions with business priorities and compliance needs.
• Passion for using technology to improve pharmacovigilance and patient safety.
• Experience with data transfer processes and troubleshooting stuck or delayed data files.
• Knowledge of testing methodologies and quality assurance standard processes.
• Proficiency in working with data analysis and QA tools.
• Understanding of data flows related to regulations such as GDPR and HIPAA.
• Experience with SQL/NoSQL databases, database programming languages, and data modelling concepts.

Good-to-Have Skills:
• Knowledge of the SDLC, including requirements, design, testing, data analysis, and change control
• Knowledge of reporting tools (e.g., Tableau, Power BI)

Professional Certifications:
• SAFe for Architects certification (preferred)

Soft Skills:
• Excellent analytical skills for weighing options in ambiguous scenarios
• Excellent leadership and progressive-thinking abilities
• Strong verbal and written communication skills
• Ability to work effectively with global, virtual teams
• High degree of initiative and self-motivation
• Ability to balance multiple priorities
• Team-oriented, with a focus on achieving team goals
• Strong presentation and public-speaking skills
• Ability to influence and drive toward an intended outcome
• Ability to hold team members accountable to commitments

Shift Information: This position requires you to work a later shift and may be assigned a second- or third-shift schedule. Candidates must be willing and able to work during evening or night shifts, as required by business needs.

EQUAL OPPORTUNITY STATEMENT: Amgen is an Equal Opportunity employer and will consider you without regard to your race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or disability status. We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request an accommodation.
Posted 4 weeks ago
2.0 - 5.0 years
7 - 11 Lacs
Hyderabad
Work from Office
Data Platform Engineer

About Amgen: Amgen harnesses the best of biology and technology to fight the world’s toughest diseases, and make people’s lives easier, fuller, and longer. We discover, develop, manufacture, and deliver innovative medicines to help millions of patients. Amgen helped establish the biotechnology industry more than 40 years ago and remains on the cutting edge of innovation, using technology and human genetic data to push beyond what’s known today.

What you will do - Roles & Responsibilities:
• Work as a member of a Data Platform Engineering team that uses Cloud and Big Data technologies to design, develop, implement, and maintain solutions supporting functional areas such as Manufacturing, Commercial, Research, and Development.
• Work closely with the Enterprise Data Lake delivery and platform teams to ensure that applications are aligned with the overall architectural and development guidelines.
• Research and evaluate technical solutions, including Databricks and AWS services, NoSQL databases, and Data Science packages, platforms, and tools, with a focus on enterprise deployment capabilities such as security, scalability, reliability, maintainability, and cost management.
• Assist in building and managing relationships with internal and external business stakeholders.
• Develop a basic understanding of core business problems and identify opportunities to use advanced analytics.
• Assist in reviewing 3rd-party providers for new feature/function/technical fit with EEA's data management needs.
• Work closely with the Enterprise Data Lake ecosystem leads to identify and evaluate emerging providers of data management and processing components that could be incorporated into the data platform.
• Work with platform stakeholders to ensure effective cost observability and control mechanisms are in place for all aspects of data platform management.
• Work comfortably in an Agile development environment, including Agile terminology and ceremonies.
• Be keen on adopting new responsibilities, facing challenges, and mastering new technologies.

What we expect of you - Basic Qualifications and Experience:
• Master’s degree in a computer science or engineering field and 1 to 3 years of relevant experience; OR
• Bachelor’s degree in a computer science or engineering field and 3 to 5 years of relevant experience; OR
• Diploma and a minimum of 8+ years of relevant work experience

Must-Have Skills:
• Experience with Databricks (or Snowflake), including cluster setup, execution, and tuning
• Experience with common data processing libraries: Pandas, PySpark, SQLAlchemy
• Experience with UI frameworks (Angular.js or React.js)
• Experience with data lake, data fabric, and data mesh concepts
• Experience with data modeling and performance tuning on relational databases
• Experience building ETL or ELT pipelines; hands-on experience with SQL/NoSQL
• Programming skills in one or more languages: SQL, Python, Java
• Experience with software engineering best practices, including version control (Git, GitLab), CI/CD (GitLab, Jenkins, etc.), automated unit testing, and DevOps
• Exposure to Jira or Jira Align

Good-to-Have Skills:
• Knowledge of the R language
• Experience with cloud technologies, AWS preferred
• Cloud certifications: AWS, Databricks, Microsoft
• Familiarity with the use of AI for development productivity, such as GitHub Copilot, Databricks Assistant, Amazon Q Developer, or equivalent
• Knowledge of Agile and DevOps practices
• Skills in disaster recovery planning
• Familiarity with load testing tools (JMeter, Gatling)
• Basic understanding of AI/ML for monitoring
• Knowledge of distributed systems and microservices
• Data visualization skills (Tableau, Power BI)
• Strong communication and leadership skills
• Understanding of compliance and auditing requirements

Soft Skills:
• Excellent analytical and problem-solving skills
• Excellent written and verbal communication skills (English), translating technology content into business language at various levels
• Ability to work effectively with global, virtual teams
• High degree of initiative and self-motivation
• Ability to manage multiple priorities successfully
• Team-oriented, with a focus on achieving team goals
• Strong time and task management skills to estimate and successfully meet project timelines, with the ability to bring consistency and quality assurance across various projects

What you can expect of us: As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we’ll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards.

Apply now for a career that defies imagination. Objects in your future are closer than they appear. Join us. careers.amgen.com

As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed, and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease. Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
Posted 4 weeks ago
15.0 - 20.0 years
4 - 8 Lacs
Chennai
Work from Office
Project Role: Software Development Engineer
Project Role Description: Analyze, design, code, and test multiple components of application code across one or more clients. Perform maintenance, enhancements, and/or development work.
Must have skills: BlueYonder Order Management
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary - Role Overview: We are looking for an experienced Integration Architect to lead the design and execution of integration strategies for Blue Yonder (BY) implementations across cloud-native environments. The ideal candidate will possess strong expertise in integrating supply chain platforms with enterprise cloud systems, data lakes, and Snowflake, along with working knowledge of Generative AI (Gen AI) to enhance automation and intelligence in integration and data workflows.

Key Responsibilities:

Functional Expertise
- Must-have skill: Blue Yonder (BY) Order Promising modules (formerly JDA)
- Knowledge of ATP (Available to Promise), CTP (Capable to Promise), and Order Fulfillment logic
- Experience with S&OP, Demand Planning, and Inventory Availability functions
- Ability to design and interpret supply-demand match rules, sourcing policies, and allocation strategies

Technical Acumen
- Strong grasp of BY architecture, workflows, and configuration capabilities
- Proficiency in tools like BY Platform Manager, BY Studio, and BY Workbench
- Understanding of data modeling, integration frameworks (REST, SOAP APIs, flat file interfaces), and middleware platforms
- Familiarity with PL/SQL, Java, and batch job orchestration for customizations and enhancements

Integration & Ecosystem Knowledge
- Integration experience with OMS, ERP (e.g., SAP, Oracle), WMS, and TMS
- Experience in real-time inventory visibility, order brokering, and global ATP engines
- Exposure to microservices architecture and cloud deployments (BY Luminate Platform)

Implementation & Support Experience
- Proven experience in end-to-end BY Order Promising implementations
- Ability to conduct solution design workshops, fit-gap analysis, and UAT management
- Experience in post-go-live support, performance tuning, and issue triage/resolution

Soft Skills & Project Leadership
- Ability to act as a bridge between business and technical teams
- Strong stakeholder communication, requirement gathering, and documentation skills
- Excellent problem-solving and troubleshooting capabilities
- Familiarity with Agile and Waterfall project methodologies

Preferred Certifications
- Blue Yonder Functional/Technical Certification in Order Promising or Fulfillment
- Supply chain certifications such as APICS CSCP (desirable)

Additional Information:
- The candidate should have a minimum of 5 years of experience in BlueYonder Demand Planning.
- This position is based at our Chennai office.
- A 15 years full time education is required.

Qualification: 15 years full time education
Posted 4 weeks ago
15.0 - 20.0 years
5 - 9 Lacs
Chennai
Work from Office
Project Role: Advanced Application Engineer
Project Role Description: Utilize modular architectures, next-generation integration techniques, and a cloud-first, mobile-first mindset to provide vision to Application Development Teams. Work with an Agile mindset to create value across projects of multiple scopes and scale.
Must have skills: BlueYonder Enterprise Supply Planning
Good to have skills: NA
Minimum 12 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: We are looking for an experienced Integration Architect to lead the design and execution of integration strategies for Blue Yonder (BY) implementations across cloud-native environments. The ideal candidate will possess strong expertise in integrating supply chain platforms with enterprise cloud systems, data lakes, and Snowflake, along with working knowledge of Generative AI (Gen AI) to enhance automation and intelligence in integration and data workflows.

Roles & Responsibilities:
- Architect and implement end-to-end integration solutions for Blue Yonder (WMS, TMS, ESP, etc.) with enterprise systems (ERP, CRM, legacy).
- Design integration flows using cloud-native middleware platforms (Azure Integration Services, AWS Glue, GCP Dataflow, etc.).
- Enable real-time and batch data ingestion into cloud-based data lakes (e.g., AWS S3, Azure Data Lake, Google Cloud Storage) and downstream to Snowflake.
- Develop scalable data pipelines to support analytics, reporting, and operational insights from Blue Yonder and other systems.
- Integrate Snowflake as an enterprise data platform for unified reporting and machine learning use cases.

Professional & Technical Skills:
- Leverage Generative AI (e.g., OpenAI, Azure OpenAI) for: auto-generating integration mapping specs and documentation; enhancing data quality and reconciliation with intelligent agents; and developing copilots for integration teams to speed up development and troubleshooting.
- Ensure integration architecture adheres to security, performance, and compliance standards.
- Collaborate with enterprise architects, functional consultants, data engineers, and business stakeholders.
- Lead troubleshooting, performance tuning, and hypercare support post-deployment.

Additional Information:
- The candidate should have a minimum of 5 years of experience in BlueYonder Enterprise Supply Planning.
- This position is based at our Chennai office.
- A 15 years full time education is required.

Qualification: 15 years full time education
Posted 4 weeks ago
15.0 - 20.0 years
10 - 14 Lacs
Bengaluru
Work from Office
Project Role: Application Lead
Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact.
Must have skills: AWS BigData
Good to have skills: NA
Minimum 12 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your day will involve overseeing the application development process and ensuring project success.

Roles & Responsibilities:
- Expected to be an SME
- Collaborate with and manage the team to perform
- Responsible for team decisions
- Engage with multiple teams and contribute on key decisions
- Expected to provide solutions to problems that apply across multiple teams
- Lead the application development process effectively
- Ensure timely delivery of projects
- Provide guidance and mentorship to team members

Professional & Technical Skills:
- Must-Have Skills: Proficiency in AWS BigData
- Strong understanding of cloud computing and AWS services
- Experience in designing and implementing Big Data solutions
- Knowledge of data warehousing and data lake concepts
- Hands-on experience with big data technologies such as Hadoop and Spark

Additional Information:
- The candidate should have a minimum of 12 years of experience in AWS BigData
- This position is based at our Gurugram office
- A 15 years full-time education is required

Qualification: 15 years full time education
Posted 4 weeks ago
6.0 - 8.0 years
6 - 10 Lacs
Pune
Work from Office
Job Title: Production Specialist, Associate
Location: Pune, India

Role Description: Our organization within Deutsche Bank is AFC Production Services. We are responsible for providing technical L2 application support for business applications. The AFC (Anti-Financial Crime) line of business has a current portfolio of 25+ applications. The organization is in the process of transforming itself using Google Cloud and many new technology offerings. Your role will include hands-on production support, and you will be actively involved in technical issue resolution across multiple applications.

Deutsche Bank's Corporate Bank division is a leading provider of cash management, trade finance, and securities finance. We complete green-field projects that deliver the best Corporate Bank - Securities Services products in the world. Our team is diverse, international, and driven by a shared focus on clean code and valued delivery. At every level, agile minds are rewarded with competitive pay, support, and opportunities to excel. You will work as part of a cross-functional agile delivery team. You will bring an innovative approach to software development, focusing on using the latest technologies and practices, as part of a relentless focus on business value. You will be someone who sees engineering as a team activity, with a predisposition to open code, open discussion, and creating a supportive, collaborative environment. You will be ready to contribute to all stages of software delivery, from initial analysis right through to production support.

What we'll offer you: As part of our flexible scheme, here are just some of the benefits that you'll enjoy:
• Best-in-class leave policy
• Gender-neutral parental leave
• 100% reimbursement under the childcare assistance benefit (gender neutral)
• Sponsorship for industry-relevant certifications and education
• Employee Assistance Program for you and your family members
• Comprehensive hospitalization insurance for you and your dependents
• Accident and term life insurance
• Complementary health screening for ages 35 and above

Your key responsibilities:
• Provide technical support by handling and consulting on BAU, incidents/emails/alerts for the respective applications.
• Perform post-mortem and root cause analysis using ITIL standards of Incident Management, Service Request fulfillment, Change Management, Knowledge Management, and Problem Management.
• Analyze errors arising from batch processing and the interfaces of related systems.
• Determine and implement resolutions or workarounds.
• Support the resolution of high-impact incidents on our applications, including attendance at incident bridge calls.
• Escalate incident tickets in a timely manner and communicate effectively with business users, development teams, and stakeholders.
• Provide resolution for open problems, or ensure that the appropriate parties have been tasked with doing so.
• Support the handover of new projects/applications into Production Services with Service Transition before the go-live phase.
• Assist in the process to approve application code releases, as well as tasks assigned to support.
• Keep key stakeholders informed using communication templates.
• Automate routine tasks and enhance operational efficiency through scripts and tools.
• Support the transition of applications to Google Cloud and new technology offerings.
• Proactively identify performance bottlenecks and suggest optimization strategies.
• Support audit, compliance, and regulatory requirements related to AFC applications.
The candidate will have to work in shifts as part of a rota covering APAC and EMEA hours, between 07:00 IST and 09:00 PM IST (2 shifts). In the event of major outages or issues we may ask for flexibility to help provide appropriate cover. Supporting on-call activities is also expected.

Your skills and experience:
• 4-8 years of experience in providing hands-on IT application support.
• Bachelor's degree from an accredited college or university with a concentration in Computer Science or an IT-related discipline (or equivalent work experience/diploma/certification).
• Preferred: ITIL v3 Foundation certification or higher.
• Clear and concise documentation in general, and especially proper documentation of the status of incidents, problems, and service requests in the Service Management tool.
• Monitoring tools: knowledge of Elasticsearch, Control-M, Grafana, Geneos, OpenShift, Prometheus, Google Cloud Monitoring, Airflow, Splunk.
• Red Hat Enterprise Linux (RHEL): professional skill in searching logs, process commands, starting/stopping processes, and using OS commands to aid in tasks needed to resolve or investigate issues. Shell scripting knowledge is a plus.
• Understanding of database concepts and exposure to working with Oracle, MS SQL, BigQuery, and similar databases.
• Ability to work across countries, regions, and time zones with a broad range of cultures and technical capabilities.

Skills that will help you excel:
• Strong written and oral communication skills, including the ability to communicate technical information to a non-technical audience.
• Analytical and problem-solving skills, with a structured approach to troubleshooting, issue resolution, and its documentation.
• Able to train, coach, and mentor, and know where each technique is best applied.
• Experience with GCP or another public cloud provider to build applications.
• Experience in an investment bank, financial institution, or large corporation using enterprise hardware and software.
• Knowledge of Actimize, Mantas, and case management software is good to have.
• Working knowledge of Big Data Hadoop/Secure Data Lake is a plus.
• Prior experience in automation projects is great to have.
• Exposure to Python, shell, Ansible, or another scripting language for automation and process improvement.
• Strong stakeholder management skills, ensuring seamless coordination between business, development, and infrastructure teams.

How we'll support you:
• Training and development to help you excel in your career
• Coaching and support from experts in your team
• A culture of continuous learning to aid progression
• A range of flexible benefits that you can tailor to suit your needs
Posted 4 weeks ago
4.0 - 8.0 years
6 - 10 Lacs
Bengaluru
Work from Office
At BCE Global Tech, immerse yourself in exciting projects that are shaping the future of both consumer and enterprise telecommunications. This involves building innovative mobile apps to enhance user experiences and enable seamless connectivity on the go. Thrive in diverse roles like Full Stack Developer, Backend Developer, UI/UX Designer, DevOps Engineer, Cloud Engineer, Data Science Engineer, and Scrum Master, at a workplace that encourages you to freely share your bold and different ideas. If you are passionate about technology and eager to make a difference, we want to hear from you! Apply now to join our dynamic team in Bengaluru.

We are seeking a talented Site Reliability Engineer (SRE) to join our team. The ideal candidate will have a strong background in software engineering and systems administration, with a passion for building scalable and reliable systems. As an SRE, you will collaborate with development and operations teams to ensure our services are reliable, performant, and highly available.

Key Responsibilities:
• Ensure the 24/7 operations and reliability of data services in our production GCP and on-premise Hadoop environments
• Collaborate with the data engineering development team to design, build, and maintain scalable, reliable, and secure data pipelines and systems
• Develop and implement monitoring, alerting, and incident response strategies to proactively identify and resolve issues before they impact production
• Drive the implementation of security and reliability best practices across the software development life cycle
• Contribute to the development of tools and automation to streamline the management and operation of data services
• Participate in the on-call rotation and respond to incidents in a timely and effective manner
• Continuously evaluate and improve the reliability, scalability, and performance of data services

Technology Skills:
• 4+ years of experience in site reliability engineering or a similar role
• Strong experience with Google Cloud Platform (GCP) services, including BigQuery, Dataflow, Pub/Sub, and Cloud Storage
• Experience with on-premise Hadoop environments and related technologies (HDFS, Hive, Spark, etc.)
• Proficiency in at least one programming language (Python, Scala, Java, Go, etc.)

Required qualifications to be successful in this role:
• Bachelor's degree in computer science, engineering, or a related field
• 8-10 years of experience as an SRE
• Proven experience as an SRE, DevOps engineer, or in a similar role
• Strong problem-solving skills and ability to work under pressure
• Excellent communication and collaboration skills
• Flexible to work in EST time zones (9-5 EST)

Additional Information:
• Job Type: Full Time
• Work Profile: Hybrid (Work from Office/Remote)
• Years of Experience: 8-10 years
• Location: Bangalore

What We Offer:
• Competitive salaries and comprehensive health benefits
• Flexible work hours and remote work options
• Professional development and training opportunities
• A supportive and inclusive work environment
Posted 4 weeks ago
5.0 - 10.0 years
7 - 12 Lacs
Bengaluru
Work from Office
Are you ready to play a key role in transforming Thomson Reuters into a truly data-driven company? Join our Data & Analytics (D&A) function and be part of the strategic ambition to build, embed, and mature a data-driven culture across the entire organization. The Data Architecture organization within the Data and Analytics division is responsible for designing and implementing a unified data strategy that enables the efficient, secure, and governed use of data across the organization. We aim to create a trusted and customer-centric data ecosystem, built on a foundation of data quality, security, and openness, and guided by the Thomson Reuters Trust Principles. Our team is dedicated to developing innovative data solutions that drive business value while upholding the highest standards of data management and ethics.

About the Role: In this opportunity as a Data Architect, you will:
• Architect and Lead Data Platform Evolution: Spearhead the conceptual, logical, and physical architecture design for our enterprise Data Platform (encompassing areas like our data lake, data warehouse, streaming services, and master data management systems). You will define and enforce data modeling standards, data flow patterns, and integration strategies to serve a diverse audience, from data engineers to AI/ML practitioners and BI analysts.
• Technical Standards and Best Practices: Research and recommend technical standards, ensuring the architecture aligns with the overall technology and product strategy. Be hands-on in implementing core components reusable across applications.
• Hands-on Prototyping and Framework Development: While a strategic role, maintain a hands-on approach by designing and implementing proofs-of-concept and core reusable components/frameworks for the data platform. This includes developing best practices and templates for data pipelines, particularly leveraging dbt for transformations, and ensuring efficient data processing and quality.
• Champion Data Ingestion Strategies: Design and oversee the implementation of robust, scalable, and automated cloud data ingestion pipelines from a variety of sources (e.g., APIs, databases, streaming feeds) into our AWS-based data platform, utilizing services such as AWS Glue, Kinesis, Lambda, and S3, and potentially third-party ETL/ELT tools. (A hedged ingestion sketch follows this posting.)
• Design and optimize solutions utilizing our core cloud data stack, including deep expertise in Snowflake (e.g., architecture, performance tuning, security, data sharing, Snowpipe, Streams, Tasks) and a broad range of AWS data services (e.g., S3, Glue, EMR, Kinesis, Lambda, Redshift, DynamoDB, Athena, Step Functions, MWAA/Managed Airflow) to build and automate end-to-end analytics and data science workflows.
• Data-Driven Decision-Making: Make quick and effective data-driven decisions, demonstrating strong problem-solving and analytical skills. Align strategies with company goals.
• Stakeholder Collaboration: Collaborate closely with external and internal stakeholders, including business teams and product managers. Define roadmaps, understand functional requirements, and lead the team through the end-to-end development process.
• Team Collaboration: Work in a collaborative, team-oriented environment, sharing information and diverse ideas, and partnering with cross-functional and remote teams.
• Quality and Continuous Improvement: Focus on quality, continuous improvement, and technical standards. Keep the service focus on reliability, performance, and scalability while adhering to industry best practices.
• Technology Advancement: Continuously update yourself with next-generation technology and development tools. Contribute to process development practices.
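Given the posting's emphasis on Snowflake-based ingestion, here is a hedged sketch of one automated ingestion step: loading staged S3 files into Snowflake with COPY INTO via the Python connector. The account, warehouse, stage, and table names are invented, and credentials would come from a secrets manager rather than code.

```python
# Hedged Snowflake batch-ingestion sketch; all identifiers are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    user="etl_user",
    password="...",  # in practice, fetch from a secrets manager
    account="example-account",
    warehouse="LOAD_WH",
    database="ANALYTICS",
    schema="RAW",
)

try:
    cur = conn.cursor()
    # Load any new files from the external S3 stage; Snowflake skips files
    # it has already loaded into this table, so reruns are idempotent.
    cur.execute(
        """
        COPY INTO RAW.ORDERS
        FROM @RAW.ORDERS_STAGE
        FILE_FORMAT = (TYPE = PARQUET)
        MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE
        """
    )
    print(cur.fetchall())  # per-file load results
finally:
    conn.close()
```

For continuous feeds, the Snowpipe, Streams, and Tasks features named in the posting would replace this batch COPY with event-driven loading.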
About You: You're a fit for the role of Data Architect, Data Platform if your background includes:
• Educational Background: Bachelor's degree in information technology.
• Experience: 10+ years of IT experience, with at least 5 years in a lead design or architectural capacity.
• Technical Expertise: Broad knowledge and experience with cloud-native software design, microservices architecture, and data warehousing, and proficiency in Snowflake.
• Cloud and Data Skills: Experience building and automating end-to-end analytics pipelines on AWS; familiarity with NoSQL databases.
• Data Pipeline and Ingestion Mastery: Extensive experience in designing, building, and automating robust and scalable cloud data ingestion frameworks and end-to-end data pipelines on AWS, including various ingestion patterns (batch, streaming, CDC) and tools.
• Advanced Data Modeling: Demonstrable expertise in designing and implementing various data models (e.g., relational, dimensional, Data Vault, NoSQL schemas) for transactional, analytical, and operational workloads, with a strong understanding of the data development lifecycle, from requirements gathering to deployment and maintenance.
• Leadership: Proven ability to lead architectural discussions, influence technical direction, and mentor data engineers, effectively balancing complex technical decisions with user needs and overarching business constraints.
• Programming Skills: Strong programming skills in languages such as Python or Java for data manipulation, automation, and API development.
• Data Governance and Security Acumen: Deep understanding and practical experience in designing and implementing solutions compliant with robust data governance principles, data security best practices (e.g., encryption, access controls, masking), and relevant privacy regulations (e.g., GDPR, CCPA).
• Containerization and Orchestration: Experience with containerization technologies like Docker and orchestration tools like Kubernetes.

#LI-VN1

What's in it For You:
• Hybrid Work Model: We've adopted a flexible hybrid working environment (2-3 days a week in the office depending on the role) for our office-based roles while delivering a seamless experience that is digitally and physically connected.
• Flexibility & Work-Life Balance: Flex My Way is a set of supportive workplace policies designed to help manage personal and professional responsibilities, whether caring for family, giving back to the community, or finding time to refresh and reset. This builds upon our flexible work arrangements, including work from anywhere for up to 8 weeks per year, empowering employees to achieve a better work-life balance.
• Career Development and Growth: By fostering a culture of continuous learning and skill development, we prepare our talent to tackle tomorrow's challenges and deliver real-world solutions. Our Grow My Way programming and skills-first approach ensure you have the tools and knowledge to grow, lead, and thrive in an AI-enabled future.
• Industry-Competitive Benefits: We offer comprehensive benefit plans that include flexible vacation, two company-wide Mental Health Days off, access to the Headspace app, retirement savings, tuition reimbursement, employee incentive programs, and resources for mental, physical, and financial wellbeing.
• Culture: Globally recognized, award-winning reputation for inclusion and belonging, flexibility, work-life balance, and more. We live by our values: Obsess over our Customers, Compete to Win, Challenge (Y)our Thinking, Act Fast / Learn Fast, and Stronger Together.
• Social Impact: Make an impact in your community with our Social Impact Institute. We offer employees two paid volunteer days off annually and opportunities to get involved with pro-bono consulting projects and Environmental, Social, and Governance (ESG) initiatives.
• Making a Real-World Impact: We are one of the few companies globally that helps its customers pursue justice, truth, and transparency. Together, with the professionals and institutions we serve, we help uphold the rule of law, turn the wheels of commerce, catch bad actors, report the facts, and provide trusted, unbiased information to people all over the world.

About Us: Thomson Reuters informs the way forward by bringing together the trusted content and technology that people and organizations need to make the right decisions. We serve professionals across legal, tax, accounting, compliance, government, and media. Our products combine highly specialized software and insights to empower professionals with the data, intelligence, and solutions needed to make informed decisions, and to help institutions in their pursuit of justice, truth, and transparency. Reuters, part of Thomson Reuters, is a world-leading provider of trusted journalism and news. We are powered by the talents of 26,000 employees across more than 70 countries, where everyone has a chance to contribute and grow professionally in flexible work environments. At a time when objectivity, accuracy, fairness, and transparency are under attack, we consider it our duty to pursue them. Sound exciting? Join us and help shape the industries that move society forward.

As a global business, we rely on the unique backgrounds, perspectives, and experiences of all employees to deliver on our business goals. To ensure we can do that, we seek talented, qualified employees in all our operations around the world regardless of race, color, sex/gender, including pregnancy, gender identity and expression, national origin, religion, sexual orientation, disability, age, marital status, citizen status, veteran status, or any other protected classification under applicable law. Thomson Reuters is proud to be an Equal Employment Opportunity Employer providing a drug-free workplace. We also make reasonable accommodations for qualified individuals with disabilities and for sincerely held religious beliefs in accordance with applicable law. More information on requesting an accommodation here. Learn more on how to protect yourself from fraudulent job postings here. More information about Thomson Reuters can be found on thomsonreuters.com.
Posted 4 weeks ago
2.0 - 6.0 years
10 - 15 Lacs
Ahmedabad
Work from Office
We are seeking a highly skilled and innovative AI & ML Technology Specialist to drive AI initiatives within our HR function. The ideal candidate will explore and implement advanced AI and machine learning solutions to enhance HR processes, leveraging cutting-edge technologies such as data lakes, AI/ML bots, predictive analytics, and automation frameworks.

Key Responsibilities:
• Develop AI-driven HR Solutions: Identify and implement AI and ML applications to optimize recruitment, onboarding, employee engagement, performance management, and workforce planning.
• Data Management & Analytics: Design and manage high-end HR data lakes, ensuring data integrity, security, and accessibility for advanced analytics.
• AI/ML Bot Development: Work on intelligent HR chatbots for employee queries, HR service automation, and improving user experience.
• Predictive Workforce Analytics: Utilize machine learning models to analyze workforce trends, predict attrition, assess employee satisfaction, and optimize talent management strategies.
• Collaborate with HR & IT Teams: Partner with cross-functional teams to understand business needs, develop AI-driven HR solutions, and ensure seamless integration with existing HR systems.
• Research & Continuous Innovation: Stay up to date with emerging AI/ML trends, tools, and frameworks, recommending best practices for HR transformation.

Qualifications & Skills:
• Bachelor's/Master's degree in Computer Science, Data Science, Artificial Intelligence, or a related field.
• Proven experience in AI & ML technologies, with a focus on HR applications.
• Strong knowledge of data lakes, predictive analytics, NLP, chatbot development, and automation.
• Proficiency in programming languages such as Python or R and frameworks such as TensorFlow or PyTorch.
• Experience with HR tech platforms, cloud-based AI solutions, and big data analytics is a plus.
• Excellent problem-solving skills, an analytical mindset, and the ability to communicate technical concepts to non-technical stakeholders.
Posted 4 weeks ago
7.0 - 12.0 years
0 - 2 Lacs
Pune, Ahmedabad, Gurugram
Work from Office
Urgent Hiring: Azure Data Engineer (Strong PySpark + SCD II/III Expert)
Work Mode: Remote
Client-focused interview on PySpark + SCD II/III

Key Must-Haves:
• Very strong hands-on PySpark coding
• Practical experience implementing Slowly Changing Dimensions (SCD) Type II and Type III (a minimal Type II sketch follows this posting)
• Strong expertise in Azure data engineering (ADF, Databricks, Data Lake, Synapse)
• Proficiency in SQL and Python for scripting and transformation
• Strong understanding of data warehousing concepts and ETL pipelines

Good to Have:
• Experience with Microsoft Fabric
• Familiarity with Power BI
• Domain knowledge in Finance, Procurement, and Human Capital

Note: This role is highly technical. The client will focus interviews on PySpark coding and SCD Type II/III implementation. Only share profiles that are hands-on and experienced in these areas.

Share strong, relevant profiles to: b.simrana@ekloudservices.com
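Because the client interviews specifically on SCD Type II in PySpark, here is a minimal hedged sketch of the core pattern: expire the current rows whose tracked attribute changed, then append new current versions. Table and column names are hypothetical, the staging snapshot is assumed to carry exactly the dimension's business columns, and on Databricks this would more commonly be a Delta Lake MERGE than a full rewrite.

```python
# Hedged SCD Type II sketch in PySpark; all table/column names hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("scd2-sketch").getOrCreate()

dim = spark.table("dim_customer")       # has is_current, valid_from, valid_to
incoming = spark.table("stg_customer")  # today's source snapshot

current = dim.filter(F.col("is_current"))

# Keys whose tracked attribute changed versus the current dimension row.
changed_keys = (
    incoming.alias("s")
    .join(current.alias("d"), "customer_id")
    .filter(F.col("s.address") != F.col("d.address"))
    .select("customer_id")
    .distinct()
)

# 1. Expire the current versions of changed keys.
to_expire = current.join(changed_keys, "customer_id")
expired = to_expire.withColumn("is_current", F.lit(False)).withColumn(
    "valid_to", F.current_date()
)

# 2. Keep every dimension row except the versions just expired
#    (customer_id + valid_from identifies a version here).
kept = dim.join(
    to_expire.select("customer_id", "valid_from"),
    ["customer_id", "valid_from"],
    "left_anti",
)

# 3. Build new current versions for changed keys and brand-new keys
#    (assumes incoming has only business columns, matching the dimension).
new_rows = (
    incoming.join(changed_keys, "customer_id")
    .unionByName(incoming.join(current, "customer_id", "left_anti"))
    .withColumn("is_current", F.lit(True))
    .withColumn("valid_from", F.current_date())
    .withColumn("valid_to", F.lit(None).cast("date"))
)

# 4. Reassemble and write to a new table (Delta MERGE would update in place).
result = kept.unionByName(expired).unionByName(new_rows)
result.write.mode("overwrite").saveAsTable("dim_customer_scd2")
```

SCD Type III differs only in storing the prior value in an extra column (e.g., `previous_address`) on the same row instead of appending a new row per change.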
Posted 1 month ago
3.0 - 8.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Project Role: Application Designer
Project Role Description: Assist in defining requirements and designing applications to meet business process and application requirements.
Must have skills: SAP BW/4HANA Data Modeling & Development
Good to have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Designer, you will assist in defining requirements and designing applications to meet business process and application requirements. A typical day involves collaborating with cross-functional teams to gather insights, analyzing user needs, and translating them into functional specifications. You will engage in discussions to refine application designs and ensure alignment with business objectives, while also participating in testing and validation processes to guarantee that the applications meet the defined requirements effectively.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Collaborate with stakeholders to gather and analyze requirements for application design.
- Participate in the testing and validation of applications to ensure they meet business needs.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in SAP BW/4HANA Data Modeling & Development.
- Good to have: SAP ABAP, CDS views.
- Strong understanding of data modeling concepts and best practices.
- Experience with application design methodologies and tools.
- Ability to analyze and interpret complex business requirements.
- Familiarity with integration techniques and data flow management.

Additional Information:
- The candidate should have a minimum of 3 years of experience in SAP BW/4HANA Data Modeling & Development.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.

Qualification: 15 years full time education
Posted 1 month ago
10.0 - 16.0 years
60 - 75 Lacs
Pune
Hybrid
Position Summary: As a Software Architect, you will be responsible for providing technical leadership and architectural guidance to development teams, ensuring the design and implementation of scalable, robust, and maintainable software solutions. You will collaborate with stakeholders, including business leaders, project managers, and developers, to understand requirements, define architectural goals, and make informed decisions on technology selection, system design, and implementation strategies. Additionally, you will mentor and coach team members, promote best practices, and foster a culture of innovation and excellence within the organization. This role is based in Redaptive's Pune, India office.

Responsibilities and Duties (time spent performing duty):

System Design and Architecture: 40%
- Identify and propose technical solutions for complex problem statements.
- Provide an application-level perspective during design and implementation that accounts for cost constraints, testability, complexity, scalability, performance, migrations, etc.
- Provide technical leadership and guidance to development teams, mentoring engineers and fostering a culture of excellence and innovation.
- Review code and architectural designs to ensure adherence to coding standards, best practices, and architectural principles.
- Create and maintain architectural documentation, including architectural diagrams, design documents, and technical specifications, to ensure clarity and facilitate collaboration.

Software Design and Development: 50%
- Gather and analyze requirements from stakeholders, understand business needs, and translate them into technical specifications.
- Work alongside teams at all stages of design and development, augmenting and supporting teams as needed.
- Collaborate with product managers, stakeholders, and cross-functional teams to define project scope, requirements, and timelines, and ensure successful project execution.

Knowledge Sharing and Continuous Improvement: 10%
- Conduct presentations, workshops, and training sessions to educate stakeholders and development teams on architectural concepts, best practices, and technologies.
- Stay updated on emerging technologies, industry trends, and best practices in software architecture and development.
- Identify opportunities for process improvement, automation, and optimization in software development processes and methodologies.
- Share knowledge and expertise with team members through mentorship, training sessions, and community involvement.

Required Abilities and Skills:
- Strong analytical and troubleshooting skills.
- Excellent verbal and written communication skills; ability to communicate effectively with stakeholders, including business leaders and project managers, to understand requirements and constraints.
- Works effectively with cross-functional teams, including developers, QA, product managers, and operations.
- Ability to understand the bigger picture and design systems that align with business goals, scalability requirements, and future growth.
- Ability to make tough decisions and take ownership of architectural choices, considering both short-term and long-term implications.
- Mastery of one or more programming languages commonly used in software development, such as Java, Python, or JavaScript.
- Expertise in SQL and NoSQL databases, including database design and optimization.
- Ability to quickly learn new technologies and adapt to changing requirements.
- Knowledge of techniques for designing scalable and high-performance web services, including load balancing, caching, and horizontal scaling.
- Knowledge of software design principles (e.g., object-oriented principles, data structures, and algorithms).
- Possesses a security mindset; drives adoption of best practices to design systems that are secure and resilient to security threats.
- Continuously learning and staying up to date with emerging technologies and best practices.
- Domain knowledge in energy efficiency, solar/storage, or electric utilities is a plus.

Education and Experience:
- 10+ years of software development experience.
- Proven track record of delivering high-quality software solutions within deadlines.
- Demonstrated technical leadership experience.
- Experience with data-heavy systems like Databricks and DataOps.
- Experience with cloud (AWS) application development.
- Experience with Java and the Spring framework strongly preferred.
- Experience with distributed architectures, SOA, microservices, and containerization technologies (e.g., Docker, Kubernetes).
- Experience designing and developing web-based applications and backend services.

Travel: This role may require 1-2 annual international work visits to the US.

The Perks!
- Equity plan participation
- Medical and Personal Accident Insurance
- Support for hybrid working and relocation
- Flexible Time Off
- Continuous Learning
- Annual bonus, subject to company and individual performance

The company is an Equal Opportunity Employer and drug-free workplace, and complies with Labor Laws as applicable. All duties and responsibilities are essential functions and requirements and are subject to possible modification to reasonably accommodate individuals with disabilities. The requirements listed in this document are the minimum levels of knowledge, skills, or abilities.
Posted 1 month ago
2.0 - 5.0 years
5 - 15 Lacs
Hyderabad
Work from Office
Company Overview:
Accordion works at the intersection of sponsors and management teams throughout every stage of the investment lifecycle, providing hands-on, execution-focused support to elevate data and analytics capabilities. So, what does it mean to work at Accordion? It means joining 1,000+ analytics, data science, finance & technology experts in a high-growth, agile, and entrepreneurial environment while transforming how portfolio companies drive value. It also means making your mark on Accordion's future by embracing a culture rooted in collaboration and a firm-wide commitment to building something great, together. Headquartered in New York City with 10 offices worldwide, Accordion invites you to join our journey.

Data & Analytics (Accordion | Data & Analytics):
Accordion's Data & Analytics (D&A) team delivers cutting-edge, intelligent solutions to a global clientele, leveraging a blend of domain knowledge, sophisticated technology tools, and deep analytics capabilities to tackle complex business challenges. We partner with Private Equity clients and their Portfolio Companies across diverse sectors, including Retail, CPG, Healthcare, Media & Entertainment, Technology, and Logistics. The D&A team delivers data and analytical solutions designed to streamline reporting capabilities and enhance business insights across vast and complex data sets ranging from Sales, Operations, Marketing, Pricing, Customer Strategies, and more.

Location: Hyderabad, Telangana

Role Overview:
Accordion is looking for a Senior Data Engineer with Database/Data Warehouse/Business Intelligence experience. He/she will be responsible for the design, development, configuration/deployment, and maintenance of the above technology stack, and must have an in-depth understanding of the various tools and technologies in this domain to design and implement robust, scalable solutions that address client requirements, current and future, at optimal cost. The Senior Data Engineer should be able to understand various architectures and recommend the right fit depending on the use case of the project. A successful Senior Data Engineer possesses strong working business knowledge, familiarity with multiple tools and techniques, and command of industry standards and best practices in the Business Intelligence and Data Warehousing environment, along with strong organizational, critical-thinking, and communication skills.

What You will do:
- Understand the business requirements thoroughly to design and develop the BI architecture.
- Determine business intelligence and data warehousing solutions that meet business needs.
- Perform data warehouse design and modelling according to established standards.
- Work closely with the business teams to arrive at methodologies to develop KPIs and Metrics.
- Work with the Project Manager in developing and executing project plans within the assigned schedule and timeline.
- Develop standard reports and functional dashboards based on business requirements.
- Ensure high-quality reports are developed and delivered in a timely and accurate manner.
- Conduct training programs and knowledge transfer sessions for junior developers when needed.
- Recommend improvements to provide optimum reporting solutions.

Ideally, you have:
- An undergraduate degree (B.E/B.Tech.); tier-1/tier-2 colleges preferred.
- 2-5 years of experience in a related field.
- Proven expertise in SSIS, SSAS and SSRS (MSBI Suite).
- In-depth knowledge of databases (SQL Server, MySQL, Oracle, etc.) and data warehouses (Azure Synapse, AWS Redshift, Google BigQuery, Snowflake, etc.).
- In-depth knowledge of business intelligence tools (any one of Power BI, Tableau, Qlik, DOMO, Looker, etc.).
- Good understanding of Azure (Data Factory & Pipelines, SQL Database & Managed Instances, DevOps, Logic Apps, Analysis Services) and AWS (Glue, Aurora Database, DynamoDB, Redshift, QuickSight).
- Proven ability to take initiative and be innovative.
- Analytical mind with a problem-solving attitude.

Why Explore a Career at Accordion:
- High growth environment: Semi-annual performance management and promotion cycles coupled with a strong meritocratic culture enable a fast track to leadership responsibility.
- Cross-domain exposure: Interesting and challenging work streams across industries and domains that always keep you excited, motivated, and on your toes.
- Entrepreneurial environment: Intellectual freedom to make decisions and own them. We expect you to spread your wings and assume larger responsibilities.
- Fun culture and peer group: Non-bureaucratic and fun working environment; a strong peer environment that will challenge you and accelerate your learning curve.
- Other benefits for full-time employees: Health and wellness programs that include employee health insurance covering immediate family members and parents, term life insurance for employees, free health camps, discounted health services (including vision and dental) for employees and family members, free doctor consultations, counsellors, etc.; corporate meal card options for ease of use and tax benefits; team lunches and company-sponsored team outings and celebrations; a robust leave policy to support work-life balance, including a specially designed leave structure to support women employees for maternity and related requests; a reward and recognition platform to celebrate professional and personal milestones; and a positive, transparent work environment with various employee engagement and benefit initiatives to support personal and professional learning and development.
Posted 1 month ago
2.0 - 7.0 years
5 - 15 Lacs
Hyderabad, Bengaluru
Work from Office
Job Description - Data Warehouse Senior Engineer / Lead
Location: Bangalore or Hyderabad

Responsibilities:
1) Own the design and development of complex data integrations from multiple systems.
2) Coordinate with onshore teams to obtain clarity on requirements, scope, etc.
3) Develop high-quality BI reports that meet the needs of the customer.
4) Good communication and interpersonal skills; a team player.

Qualifications Required:
1) Strong knowledge of Azure data warehousing and integration solutions such as Azure Data Factory and Synapse Analytics.
2) Working knowledge of the Power Platform - Power BI, Power Apps, Dataverse, Power Automate.
3) Working knowledge of Azure app integrations - Logic Apps, Function Apps.
4) Good knowledge of Azure data storage solutions - Data Lake, Cosmos DB, Storage Accounts, SQL Database.
5) Strong data modelling experience (snowflake, dimensional, etc.) and SQL expertise.
6) Strong data analysis skills.
7) Knowledge of Microsoft Fabric.

Optional Skills:
1) Knowledge of other data integration/streaming services (Databricks, Azure data streaming services, Event Grid, Kafka, etc.) is a plus.
2) Knowledge of the Microsoft Dynamics 365 platform - including working knowledge of exporting/importing data from Dataverse - is a plus.
3) An Azure data engineering certification is a plus.
Posted 1 month ago
3.0 - 7.0 years
5 - 9 Lacs
Hyderabad
Work from Office
We are looking for an experienced Azure Data Engineer with 2+ years of hands-on experience in Azure Data Lake and Azure Data Factory. The ideal candidate will have a strong background in connecting data sources to the Data Lake, writing PySpark and SQL code, and building SSIS packages. Additionally, experience in data architecture, data modeling, and creating visualizations is essential.

Key Responsibilities:
- Work with Azure Data Lake and Azure Data Factory to design, implement, and manage data pipelines (a minimal sketch follows this posting).
- Connect various data sources (applications, databases, etc.) to the Azure Data Lake for storage and processing.
- Write PySpark/SQL code and SSIS packages for data retrieval and transformation from different data sources.
- Design and develop efficient data architecture and data modeling solutions to support business requirements.
- Create data visualizations to communicate insights to stakeholders and decision-makers.
- Optimize data workflows and pipelines for better performance and scalability.
- Collaborate with cross-functional teams to ensure seamless data integration and delivery.
- Ensure data integrity, security, and compliance with best practices.

Skills and Qualifications:
- 2+ years of experience working with Azure Data Lake, Azure Data Factory, and related Azure services.
- Proficiency in writing PySpark and SQL code for data extraction and transformation.
- Experience in developing SSIS packages for data integration and automation.
- Strong understanding of data architecture and data modeling concepts.
- Experience in creating effective and insightful data visualizations using tools like Power BI or similar.
- Familiarity with cloud-based storage and computing concepts and best practices.
- Strong problem-solving skills with an ability to troubleshoot and optimize data workflows.
- Ability to collaborate effectively in a team environment and communicate with stakeholders.

Preferred Qualifications:
- Azure certifications (e.g., Azure Data Engineer Associate) would be a plus.
- Experience with other Azure tools like Azure Synapse, Databricks, etc.
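As a point of reference for the PySpark work described above, a minimal sketch that reads a raw extract from Azure Data Lake Storage Gen2, shapes it with Spark SQL, and writes a curated output. The storage account, container names, and column schema are hypothetical placeholders, and cluster authentication to ADLS is assumed to be configured.

```python
# Read a raw source extract from ADLS Gen2, aggregate with Spark SQL,
# and land the result in a curated zone. Paths and columns are examples.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("adls-transform").getOrCreate()

raw = spark.read.parquet(
    "abfss://raw@mydatalake.dfs.core.windows.net/sales/orders/"
)
raw.createOrReplaceTempView("orders")

# Shape the data with Spark SQL: daily totals per region.
daily = spark.sql("""
    SELECT order_date,
           region,
           SUM(amount) AS total_amount,
           COUNT(*)    AS order_count
    FROM orders
    GROUP BY order_date, region
""")

daily.write.mode("overwrite").parquet(
    "abfss://curated@mydatalake.dfs.core.windows.net/sales/daily_orders/"
)
```

In an ADF-orchestrated pipeline, a notebook or Spark activity running code like this would typically sit between the ingestion step and the reporting layer.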
Posted 1 month ago
14.0 - 24.0 years
35 - 55 Lacs
Hyderabad, Bengaluru, Delhi / NCR
Hybrid
About the role:
We are seeking a Sr. Practice Manager. With Insight, you will be involved in different phases of the Software Development Lifecycle, including Analysis, Design, Development and Deployment. We will count on you to be proficient in software design and development, data modelling, data processing and data visualization.

Along the way, you will get to:
- Help customers leverage existing data resources, and implement new technologies and tooling to enable data science and data analytics.
- Track the performance of our resources and related capabilities.
- Mentor and manage other data engineers and ensure data engineering best practices are being followed.
- Constantly evolve and scale our capabilities along with the growth of the business and the needs of our customers.

Be Ambitious: This opportunity is not just about what you do today but also about where you can go tomorrow. As a Practice Manager, you are positioned for swift advancement within our organization through a structured career path. When you bring your hunger, heart, and harmony to Insight, your potential will be met with continuous opportunities to upskill, earn promotions, and elevate your career.

What we're looking for - Sr. Practice Manager with:
- A total of 14+ years of relevant experience, with at least 5-6 years in people management, managing a team of 20+.
- A minimum of 12 years of experience in data technology.
- Experience in data warehousing and an excellent command of SQL, data modeling and ETL development.
- Hands-on experience with SQL Server and Microsoft Azure (Data Factory, Data Lake, Databricks).
- Experience in MSBI (SSRS, SSIS, SSAS), writing queries and stored procedures. (Good to have)
- Experience using Power BI, MDX, DAX, MDS, DQS. (Good to have)
- Experience developing designs for predictive analytics models.
- Ability to handle performance improvement tasks and data archiving.
- Proficiency in provisioning the relevant Azure resources, forecasting hardware usage, and managing to a budget.
Posted 1 month ago
1.0 - 5.0 years
6 - 10 Lacs
Hyderabad
Work from Office
Coeo are trusted data management and analytics experts, delivering technology strategy and support for business. The team have deep technical and commercial experience working with Microsoft Data Services to help our clients optimise their costs and maximise the benefits from their investments in these technologies. Coeo have a strong emphasis on consulting skills, and we expect our team members to be customer facing and have a growth mind-set. This role sits within our consulting team, and we have clear expectations that our team members understand the importance of personal utilisation and have the ability to spot opportunities within our clients that can be passed back to our business development team.

Coeo has been established for over 14 years and has exclusively focused on Microsoft technologies. Our mission is to help our clients predict their future through the better use of data, technology, people and processes. To do this our business has always focused on:
- Managed Services
- Database Consultancy
- Data Engineering and Analytics Consultancy
- Adoption and Change Management

There has never been a more exciting time to join us: we're a fast-growing professional services and managed services business, and consistent growth is enabling us to expand our project management and delivery teams.

Role Overview:
We are looking for an experienced Data Engineer with a strong background in SQL Server, SSIS, and Data Warehousing. This role will involve developing and optimizing ETL pipelines, designing data models, and delivering scalable, high-performance data solutions that support analytics and business intelligence.

Key Responsibilities:
- Design and maintain ETL processes using SQL Server Integration Services (SSIS).
- Work with SQL Server to create and optimize queries and stored procedures.
- Build and manage data warehouses to support reporting and analytics.
- Develop scalable data pipelines to support business needs.
- Collaborate with stakeholders to gather requirements and deliver data solutions.
- Monitor and optimize database and pipeline performance.
- Implement data management and ETL best practices.

Required Skills:
- Strong expertise in SQL Server and SSIS.
- In-depth understanding of Data Warehousing and data modeling concepts.
- Ability to design and optimize stored procedures, functions, and complex queries.
- Strong problem-solving skills and attention to detail.
- Excellent communication and collaboration skills.

Preferred Qualifications:
- Familiarity with cloud platforms (Azure, AWS, GCP) and data lake architecture.
- Experience with big data tools (Hadoop, Spark) is a plus.
- Mentoring or leadership experience is a bonus.

Education & Experience:
- Bachelor's degree in Computer Science or a related field.
- Several years of hands-on experience in SQL-based data engineering.

Additional Information:
- Hybrid working with flexible office visits in Hyderabad.
- Competitive compensation package with benefits such as healthcare, gym pass, and more.
- Supportive and inclusive culture with career progression opportunities.
- Apply via our Careers page or visit our LinkedIn, Facebook, and Twitter profiles for more about Coeo.

Diversity and Inclusion: Coeo is an equal opportunity employer committed to diversity and inclusion. All qualified applicants will be considered.
Posted 1 month ago
12.0 - 20.0 years
22 - 37 Lacs
Bengaluru
Hybrid
12+ years of experience in Data Architecture
Strong in Azure Data Services & Databricks, including Delta Lake & Unity Catalog (a brief sketch follows this posting)
Experience in Azure Synapse, Purview, ADF, dbt, Apache Spark, DWH, Data Lakes, NoSQL, OLTP
Notice period: Immediate
Contact: sachin@assertivebs.com
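For context, a brief sketch of the two Databricks features this role highlights: Unity Catalog's three-level table namespace and Delta Lake's versioned time travel. The catalog, schema, and table names (main.finance.transactions) are hypothetical examples.

```python
# Illustration of Unity Catalog naming and Delta Lake time travel
# on Databricks. Table names here are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Unity Catalog addresses tables as catalog.schema.table.
current = spark.table("main.finance.transactions")

# Delta Lake time travel: query the table as it stood at an earlier
# version, useful for audits and reproducing historical reports.
previous = spark.sql(
    "SELECT * FROM main.finance.transactions VERSION AS OF 3"
)

print("rows now:", current.count(), "| rows at v3:", previous.count())
```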
Posted 1 month ago