5.0 - 10.0 years
20 - 27 Lacs
Gurugram
Work from Office
Description: Agentic AI experience is a must.

Requirements:
Education: Bachelor's degree in Computer Science, Software Engineering, AI/ML, or a related field. Certifications in AI/ML, Generative AI, or cloud platforms (AWS/GCP) are a plus. Candidates should have at least 5 years of experience.

Job Description: The Agent Developer will be a key contributor to our initiative to build and launch assistive agents. This role involves designing, developing, and deploying AI-powered agents that enhance efficiency in program tracking, governance, and execution. The Agent Developer will be responsible for bringing the defined requirements and use cases to life, working with a variety of internal tools and data sources.

Technical Skills:
API Development & Integration: experience consuming and building REST APIs and gRPC services.
Database Skills: working knowledge of MySQL, Cloud Spanner, or equivalent relational databases.
Cloud Knowledge: familiarity with Google Cloud Platform (GCP) services such as GCE, Google App Engine, Google Cloud Storage, BigQuery, Dataflow, Dataproc, AI Platform, and Cloud AutoML.
Testing & Debugging: solid understanding of unit testing frameworks such as Jasmine, Karma, or Jest; proficiency in debugging, writing unit/integration tests, and validating AI workflows.
Machine Learning Fundamentals: solid understanding of machine learning concepts, algorithms, and their application in building intelligent systems.
Generative AI & LLMs: practical experience with generative AI models, including large language models (LLMs) such as PaLM, Gemini, or GPT, for building assistive agents and generating human-like text.
AI/ML Integration: understanding of building assistive agents using generative AI (e.g., LLMs like PaLM, Gemini, GPT).
Tool Integrations: experience integrating with internal/external tools such as Buganizer, Jira, Taskflow, Chiron, and RA.
Data Handling: knowledge of data sourcing, grounding, lineage, and attribution techniques to ensure reliable agent responses.
Personalization: ability to implement contextual and user-specific response logic within AI agents.

Soft Skills & Collaboration:
Strong problem-solving skills with a focus on building reliable, scalable, and efficient agent experiences.
Ability to work in cross-functional teams, including developers, SMEs, product managers, and data scientists.
Effective written and verbal communication skills.
Adaptability to fast-paced, experiment-driven environments.
Experience working in Agile/Scrum teams.

Nice to Have:
Familiarity with version control systems (e.g., Git).
Experience in RESTful API design and integration.
Experience with backend frameworks such as ExpressJS.
Experience with CI/CD pipelines and DevOps practices.
Familiarity with microservices architecture.
Experience with Docker or containerization.
Knowledge of security best practices in web development.

Key Responsibilities:
Agent Development: design, develop, and implement assistive agents based on the requirements and use cases.
Integration: develop robust integrations with various internal and external tools and platforms, including Buganizer, Jira, Taskflow, Chiron, and RA, to feed data into the agents.
Data Handling: implement mechanisms for data sourcing, grounding, lineage, and attribution to ensure agent responses are accurate, relevant, and traceable.
Personalization: develop features that enable personalization of agent responses to tailor information to specific users or groups.
Automation & Scripting: implement automation solutions and develop custom scripts (e.g., in Python) to extend agent capabilities for use cases not supported out of the box.
Testing & Debugging: conduct thorough testing, identify and debug issues, and ensure the reliability and performance of developed agents.
Experimentation & Evaluation: participate in the design and execution of experiments to evaluate agent performance, identify areas for improvement, and implement corrections.
Code Quality: write clean, maintainable, and well-documented code following best practices.
Collaboration: work closely with Agent Development SMEs, other developers, and product managers to deliver high-quality agent solutions.

What We Offer:
Exciting Projects: We focus on industries like high-tech, communication, media, healthcare, retail, and telecom. Our customer list is full of fantastic global brands and leaders who love what we build for them.
Collaborative Environment: You can expand your skills by collaborating with a diverse team of highly talented people in an open, laid-back environment, or even abroad in one of our global centers or client facilities!
Work-Life Balance: GlobalLogic prioritizes work-life balance, which is why we offer flexible work schedules, opportunities to work from home, and paid time off and holidays.
Professional Development: Our dedicated Learning & Development team regularly organizes communication skills training (GL Vantage, Toastmasters), stress management programs, professional certifications, and technical and soft skill trainings.
Excellent Benefits: We provide our employees with competitive salaries, family medical insurance, Group Term Life Insurance, Group Personal Accident Insurance, NPS (National Pension Scheme), periodic health awareness programs, extended maternity leave, annual performance bonuses, and referral bonuses.
Fun Perks: We want you to love where you work, which is why we host sports events and cultural activities, offer subsidized food and corporate parties, and provide discounts at popular stores and restaurants. Our vibrant offices also include dedicated GL Zones, rooftop decks, and a GL Club where you can enjoy coffee or tea with your colleagues over a game of table tennis!
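As an illustration of the REST API consumption and grounded-response skills this posting calls for, here is a minimal Python sketch that pulls issues from a Jira REST endpoint and keeps source attribution on the agent's answer. The base URL, token handling, and summary logic are hypothetical placeholders, not details from the posting.

```python
import requests

JIRA_BASE = "https://jira.example.com/rest/api/2"  # hypothetical instance
TOKEN = "redacted"  # in practice, fetched from a secret manager

def fetch_open_issues(project: str) -> list[dict]:
    """Consume a REST API with basic error handling and a timeout."""
    resp = requests.get(
        f"{JIRA_BASE}/search",
        params={"jql": f"project = {project} AND status = Open"},
        headers={"Authorization": f"Bearer {TOKEN}"},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json().get("issues", [])

def grounded_answer(project: str) -> dict:
    """Ground the agent's response in fetched data, keeping attribution."""
    issues = fetch_open_issues(project)
    summary = f"{len(issues)} open issues in {project}"  # stand-in for an LLM call
    return {
        "answer": summary,
        "sources": [issue.get("key") for issue in issues],  # lineage/attribution
    }

if __name__ == "__main__":
    print(grounded_answer("DEMO"))
```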
Posted 3 weeks ago
3.0 - 6.0 years
4 - 8 Lacs
Chennai
Work from Office
GCP - Google BigQuery

Requirements:
Strong experience in data engineering or analytics with deep SQL expertise.
Hands-on experience with Google BigQuery in production environments.
Strong understanding of BigQuery architecture, partitioning, clustering, and performance tuning.
Experience with GCP data services such as Cloud Storage, Dataflow, Composer (Airflow), and Pub/Sub.
Proficiency in data modeling techniques (star/snowflake schema, denormalization, etc.).
Familiarity with scripting languages such as Python or Java for orchestration and transformation.
Experience with CI/CD tools and version control (e.g., Git, Cloud Build).
Solid understanding of data security and access control within GCP.

Responsibilities:
Design, develop, and maintain scalable data pipelines using BigQuery and GCP-native tools.
Optimize complex SQL queries and BigQuery jobs for performance and cost efficiency.
Collaborate with business analysts, data scientists, and engineers to deliver actionable insights from large datasets.
Build and manage data warehouses and data marts using BigQuery.
Integrate BigQuery with other GCP services such as Cloud Storage, Dataflow, Pub/Sub, and Cloud Functions.
Implement best practices for data modeling, data governance, and security within BigQuery.
Monitor and troubleshoot data workflows and optimize storage/query performance.
Participate in architecture discussions and contribute to the overall data platform strategy.
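For illustration, a minimal Python sketch of the partitioning, clustering, and cost-control levers this posting highlights, using the google-cloud-bigquery client. The dataset and table names are invented, and credentials are assumed to be configured in the environment.

```python
from google.cloud import bigquery

client = bigquery.Client()  # assumes application-default credentials

# Partitioning by date and clustering by customer_id lets BigQuery prune
# whole partitions and blocks, reducing bytes scanned and therefore cost.
ddl = """
CREATE TABLE IF NOT EXISTS analytics.events (
  event_date DATE,
  customer_id STRING,
  amount NUMERIC
)
PARTITION BY event_date
CLUSTER BY customer_id
"""
client.query(ddl).result()

# Cost guardrail: fail any query that would scan more than ~1 GB.
job_config = bigquery.QueryJobConfig(maximum_bytes_billed=10**9)
sql = """
SELECT customer_id, SUM(amount) AS total
FROM analytics.events
WHERE event_date BETWEEN '2024-01-01' AND '2024-01-31'
GROUP BY customer_id
"""
for row in client.query(sql, job_config=job_config).result():
    print(row.customer_id, row.total)
```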
Posted 3 weeks ago
10.0 - 12.0 years
6 - 11 Lacs
Bengaluru
Work from Office
Oracle Health & AI (OHAI) is a newly formed business unit committed to transforming the healthcare industry through our expertise in IaaS and SaaS. Our mission is to deliver patient-centered care and make advanced clinical tools accessible globally. We're assembling a team of innovative technologists to build the next-generation health platform, a greenfield initiative driven by entrepreneurship, creativity, and energy. If you thrive in a fast-paced, innovative environment, we invite you to help us create a world-class engineering team with a meaningful impact. The OHAI Patient Accounting Analytics Team focuses on delivering cutting-edge reporting metrics and visualizations for healthcare financial data. Our goal is to transform healthcare by automating insurance and patient billing processes, helping optimize operations and reimbursement. Our solutions leverage a blend of reporting platforms across both on-premises and cloud infrastructure. As we expand reporting capabilities on the cloud and make use of AI, we're looking for talented professionals to join us on this exciting journey. Responsibilities: We are looking for an accomplished and experienced Consulting Member of Technical Staff with in-depth knowledge of Oracle Analytics Cloud (OAC) to lead the design, development, integration, and optimization of analytics solutions. In this role, you will serve as a technical leader, guiding solution architecture and promoting best practices in data modeling, visualization, and cloud analytics. You will collaborate with cross-functional teams and provide mentorship to other engineers. In addition to strong proficiency in OAC, experience with Oracle Machine Learning (OML) on Autonomous Data Warehouse (ADW) and within OAC is required. The ideal candidate will be skilled in integrating OML capabilities, developing advanced analytics solutions, and supporting data-driven business strategies to unlock actionable insights. Minimum Qualifications: Bachelor's/Master's degree in Computer Science, Information Systems, Data Science, or a related field. 10+ years of experience in analytics, business intelligence, or data engineering roles, with 3+ years hands-on with Oracle Analytics Cloud. Deep expertise in OAC features: Data Flows, Visualization, Semantic Modeling, Security, and Scheduling. Advanced SQL skills and proficiency integrating OAC with diverse data sources (Oracle DB, REST APIs, cloud and on-prem sources). Experience with cloud infrastructure and deployment (OCI preferred). Demonstrated ability to deliver scalable, enterprise-grade analytics solutions. Knowledge of security, privacy, and role-based access best practices. Strong collaboration, documentation, and presentation skills. Preferred Qualifications: Experience with healthcare / financial systems. Oracle Analytics Cloud and/or OCI certifications. Experience with other BI/analytics platforms (e.g., Tableau, Power BI). Proficiency in scripting/programming for automation (e.g., Python, Shell).
Posted 3 weeks ago
6.0 - 11.0 years
8 - 12 Lacs
Bengaluru
Work from Office
At Oracle Health, we put humans at the heart of every conversation. Our mission is to create a human-centric healthcare experience powered by unified global data. As a global leader, we're looking for a Data Engineer with BI experience to join an exciting project replacing existing data warehouse systems with Oracle's own data warehouse, which will store all internal corporate data and provide insights that help our teams make critical business decisions. Join us and create the future! Roles and Responsibilities: Proficient in writing and optimising SQL queries for data extraction. Translate client requirements into technical designs that junior team members can implement. Develop code that aligns with the technical design and coding standards. Review designs and code implemented by other team members; recommend better designs and more efficient code. Conduct peer design and code reviews for early detection of defects and code quality issues. Document ETL processes and data flow diagrams. Optimize data extraction and transformation processes for better performance. Perform data quality checks and debug issues. Conduct root cause analysis for data issues and implement fixes. Collaborate with more experienced developers on larger projects and with stakeholders on requirements. Participate in requirements, design, and implementation discussions. Participate in learning and development opportunities to enhance technical skills. Test the storage system after transferring the data. Exposure to Business Intelligence platforms like OAC, Power BI, or Tableau. Technical Skill Set: You must be strong in PL/SQL concepts such as tables, keys, and DDL and DML commands. You need to be proficient in writing and debugging complex SQL queries, views, and stored procedures. Strong hands-on Python / PySpark programming skills. As a Data Engineer, you must be strong in data modelling, ETL / ELT concepts, and programming/scripting in Python. You must be proficient in the following ETL process automation tools: Oracle Data Integrator (ODI), Oracle Data Flow, Oracle Database / Autonomous Data Warehouse. Should possess working knowledge of at least one cloud platform such as Oracle Cloud (preferred), Microsoft Azure, or AWS. You must be able to create technical designs, build prototypes, build and maintain high-performing data pipelines, and optimise ETL pipelines. Good knowledge of Business Intelligence development tools like OAC and Power BI. Good to have: Microsoft ADF, Data Lakes, Databricks.
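To make the ETL/ELT and PySpark expectations concrete, here is a small hedged sketch of an extract-transform-load step; the file paths and column names are illustrative only.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("etl_sketch").getOrCreate()

# Extract: raw CSV landed by an upstream process (path is illustrative).
orders = spark.read.option("header", True).csv("/landing/orders.csv")

# Transform: cast types, derive columns, and apply a simple quality filter.
clean = (
    orders
    .withColumn("amount", F.col("amount").cast("double"))
    .withColumn("order_date", F.to_date("order_date", "yyyy-MM-dd"))
    .filter(F.col("amount").isNotNull() & (F.col("amount") > 0))
)

# Load: partitioned Parquet feeding BI tools such as OAC or Power BI.
clean.write.mode("overwrite").partitionBy("order_date").parquet("/warehouse/orders")
```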
Posted 3 weeks ago
10.0 - 14.0 years
19 - 25 Lacs
Chennai, Bengaluru
Work from Office
An experienced consulting professional who understands solutions, industry best practices, multiple business processes, or technology designs within a product/technology family. Operates independently to provide quality work products to an engagement. Performs varied and complex duties and tasks that need independent judgment in order to implement Oracle products and technology to meet customer needs. Applies Oracle methodology, company procedures, and leading practices. Demonstrates expertise to deliver functional and technical solutions on moderately complex customer engagements. May act as the team lead on projects. Effectively consults with management of customer organizations. Participates in business development activities. Develops and configures detailed solutions for moderately complex projects. 10-12 years of experience relevant to this position. Ability to communicate effectively. Ability to build rapport with team members and clients. Ability to travel as needed. Responsibilities: The candidate is expected to have 10 to 12 years of expert domain knowledge in HCM covering the hire-to-retire cycle. S/he must have been a part of at least 5 end-to-end HCM implementations, of which at least 2 should have been with HCM Cloud. The candidate must have expert working experience in one or more of these modules along with the Payroll module: Time and Labor, Absence Management, Talent, Benefits, Compensation, Recruiting (ORC), Core HR. In-depth understanding of HCM Cloud business processes and their data flow. The candidate should have been in client-facing roles and interacted with customers in requirement gathering workshops, design, configuration, testing, and go-live. Should have strong written and verbal communication skills, personal drive, flexibility, team spirit, problem solving, influencing and negotiating skills, organizational awareness and sensitivity, engagement delivery, continuous improvement and knowledge sharing, and client management. Good leadership capability with strong planning and follow-up skills, mentorship, work allocation, monitoring, and status updates to the Project Manager. Assist in the identification, assessment, and resolution of complex functional issues/problems. Interact with the client frequently around specific work efforts/deliverables. The candidate should be open to domestic or international travel for short as well as long durations.
Posted 3 weeks ago
5.0 - 9.0 years
0 Lacs
Karnataka
On-site
As a GCP Data Engineer - Technical Lead at the Birlasoft office in Bengaluru, India, you will be responsible for designing, building, and maintaining scalable data pipelines and platforms on Google Cloud Platform (GCP) to support business intelligence, analytics, and machine learning initiatives. With a primary focus on Python and GCP technologies such as BigQuery, Dataproc, and Dataflow, you will develop ETL and ELT pipelines while ensuring optimal data manipulation and performance tuning. Your role will involve leveraging data manipulation libraries like Pandas, NumPy, and PySpark, along with SQL expertise for efficient data processing in BigQuery. Additionally, your experience with tools such as Dataflow, Cloud Run, GKE, and Cloud Functions will be crucial in this position. A strong foundation in data modeling, schema design, data governance, and containerization (Docker) for data workloads will further enhance your contributions to our data team. With 5-8 years of experience in Data Engineering and Software Development, including a minimum of 3-4 years working directly with Google Cloud Platform, you will play a key role in driving our data initiatives forward.
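As a sketch of the kind of pipeline this role describes, here is a minimal Apache Beam job that could run on Dataflow; the bucket, project, table, and schema are invented for illustration, and runner flags are omitted.

```python
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

# Locally this uses the DirectRunner; on GCP you would pass
# --runner=DataflowRunner plus project/region/temp_location flags.
options = PipelineOptions()

with beam.Pipeline(options=options) as p:
    (
        p
        | "Read" >> beam.io.ReadFromText("gs://example-bucket/events/*.json")
        | "Parse" >> beam.Map(json.loads)
        | "KeepValid" >> beam.Filter(lambda e: e.get("user_id") is not None)
        | "Write" >> beam.io.WriteToBigQuery(
            "example-project:analytics.events",
            schema="user_id:STRING,ts:TIMESTAMP,amount:FLOAT",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
        )
    )
```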
Posted 1 month ago
6.0 - 10.0 years
0 Lacs
Karnataka
On-site
You should have at least 6+ years of experience in Java, Spring Boot, microservices, ReactJS, product development, and sustenance. Troubleshooting and debugging existing code will be part of your responsibilities. It is essential to be proficient in code quality, security compliance, and application performance management. Your role will also involve participation in the agile planning process and estimation of planned tasks. Good verbal and written communication skills are necessary, along with expertise in unit testing (JUnit). As part of your key responsibilities and deliverables, you will be responsible for feature implementation and delivering production-ready code. Technical documentation and system diagrams, debugging reports and fixes, as well as performance optimizations, will also be expected from you. Qualifications and Experience: - 6+ years of experience in developing and designing software applications using Java - Expert understanding of core computer science fundamentals such as data structures, algorithms, and concurrent programming - Experience in analyzing, designing, implementing, and troubleshooting software solutions for highly transactional systems - Proficiency in OOAD and design principles, implementing microservices architecture using various technologies including JEE, Spring, Spring Boot, Spring Cloud, Hibernate, Oracle, Cloud SQL PostgreSQL, Bigtable, BigQuery, NoSQL, Git, IntelliJ IDEA, Pub/Sub, Dataflow - Experience working in Native & Hybrid Cloud environments - Familiarity with Agile development methodology - Strong collaboration and communication skills to work effectively across product and technology teams - Ability to translate strategic priorities into scalable and user-centric solutions - Detail-oriented problem solver with excellent communication skills and a can-do attitude - Experience with Java, Java IDEs like Eclipse or IntelliJ, Java EE application servers, object-oriented design, Git, Maven, scripting languages, JSON, XML, YAML, Terraform, etc. Preferred Skills/Experience: - Experience with Agile Scrum methodologies, continuous integration systems like Jenkins or GitHub CI, SAFe methodologies - Deep knowledge of creating secure solutions by design, multi-threaded backend environments, and tools/languages like Ruby, Python, Perl, Node.js, bash scripting, Spring, Spring Boot, C, C++, Docker, Kubernetes, Oracle, etc. Working with GlobalLogic offers a culture of caring, learning and development opportunities, interesting and meaningful work, balance and flexibility, and a high-trust organization. You'll have the chance to collaborate with innovative clients and work on cutting-edge solutions that shape the world today. GlobalLogic, a Hitachi Group Company, is a trusted digital engineering partner known for creating impactful digital products and experiences, collaborating with clients to transform businesses through intelligent products and services.
Posted 1 month ago
6.0 - 10.0 years
0 Lacs
Hyderabad, Telangana
On-site
As a Java Developer with over 6 years of experience in Java, Spring Boot, microservices, and ReactJS, you will be responsible for troubleshooting and debugging existing code as necessary. Your proficiency in ensuring code quality, security compliance, and application performance management will be crucial to the success of the projects. You will actively participate in the agile planning process and estimate planned tasks while possessing good verbal and written communication skills. Additionally, your expertise in unit testing, particularly with JUnit, will be essential in ensuring the overall quality of the software. Your key responsibilities will include feature implementation and delivering production-ready code, along with creating technical documentation and system diagrams. You will also be tasked with generating debugging reports, implementing fixes, and optimizing performance to enhance the overall efficiency of the systems. To excel in this role, you should have a solid foundation in core computer science fundamentals, including data structures, algorithms, and concurrent programming. Your experience should demonstrate a deep understanding of software design principles, microservices architecture, and various technologies such as JEE, Spring, Hibernate, Oracle, Cloud SQL PostgreSQL, Bigtable, BigQuery, NoSQL, Git, IntelliJ IDEA, Pub/Sub, and Dataflow. Experience with Native & Hybrid Cloud environments, Agile development methodologies, and proficiency in programming languages like Python and Java will be beneficial. You are expected to collaborate effectively with the product and technology teams, translating strategic priorities into scalable and user-centric solutions. Your attention to detail and problem-solving skills will be critical in addressing complex issues and delivering effective solutions. Strong communication skills and a proactive, team-oriented attitude are essential for success in this role. Preferred skills and experiences include familiarity with Agile Scrum methodologies, continuous integration systems like Jenkins or GitHub CI, SAFe methodologies, and creating secure solutions by design. Experience with multi-threaded backend environments, Docker, Kubernetes, and scripting languages like Ruby, Python, Perl, Node.js, and bash will be advantageous. At GlobalLogic, we value a culture of caring, continuous learning and development, meaningful work, balance, flexibility, and integrity. As part of our team, you will have the opportunity to work on impactful projects, grow personally and professionally, and collaborate with forward-thinking clients on cutting-edge solutions that shape the world today. Join us and be a part of our commitment to engineering impact and transforming businesses through intelligent digital products and services.
Posted 1 month ago
10.0 - 15.0 years
0 Lacs
Haryana
On-site
The Legal Analytics lead (Vice President) will be a part of AIM, based out of Gurugram and reporting into the Legal Analytics head (Senior Vice President) leading the team. You will lead a team of information management experts and data engineers responsible for building the Data Strategy by identifying all relevant product processors and creating the Data Lake, Data Pipeline, Governance & Reporting. Your role will involve driving quality, reliability, and usability of all work products, as well as evaluating and refining methods and procedures for obtaining data to ensure validity, applicability, efficiency, and accuracy. It is essential to ensure proper documentation and traceability of all project work and to respond in a timely manner to internal and external reviews. As the Data/Information Management Sr. Manager, you will achieve results through the management of professional teams and departments, integrating subject matter and industry expertise within a defined area. You will contribute to standards around which others will operate, requiring an in-depth understanding of how areas collectively integrate within the sub-function, and coordinate and contribute to the objectives of the entire function. Basic commercial awareness is necessary, along with developed communication and diplomacy skills to guide, influence, and convince others, including colleagues in other areas and occasional external customers. Your responsibilities will include ensuring the volume, quality, timeliness, and delivery of an area's end results, and you may also have responsibility for planning, budgeting, and policy formulation within your area of expertise. The role involves short-term planning and resource planning, and full management responsibility for a team, which may include management of people, budget, and planning (performance evaluation, compensation, hiring, disciplinary actions, terminations, and budget approval). Your primary responsibilities will involve supporting Business Execution activities of the Chief Operating Office by implementing data engineering solutions to manage banking operations. You will establish monitoring routines, scorecards, and escalation workflows, overseeing Data Strategy, Smart Automation, Insight Generation, Data Quality, and Reporting activities using proven analytical techniques. It will be your responsibility to document data requirements, data collection, processing, cleaning, process automation, optimization, and data visualization techniques. You will enable proactive issue detection, escalation workflows, and alignment with firmwide data-related policies, implementing a governance framework with clear stewardship roles and data quality controls. You will also interface between business and technology partners to digitize data collection, including performance generation and validation rules for banking operations. In this role, you will work with large and complex data sets (both internal and external) to evaluate, recommend, and support the implementation of business strategies, such as a centralized data repository with standardized definitions and scalable data pipes. You will identify and compile data sets using a variety of tools (e.g., SQL, Access) to help predict, improve, and measure the success of key business outcomes, implementing rule-based data quality checks across critical data points, automating alerts for breaks, and publishing periodic quality reports.
You will develop and execute the analytics strategy for Data Ingestion and Reporting / Insights Centralization, ensuring consistency, lineage tracking, and audit readiness across legal reporting. The ideal candidate will have 10-15 years of experience in Business Transformation Solution Design roles with proficiency in tools/technologies like SQL, SAS, Python, PySpark, Tableau, Xceptor, Appian, JIRA, SharePoint, etc. A strong understanding of Data Transformation, Data Modeling, Data Strategy, Data Architecture, Data Tracing & Lineage, Scalable Data Flow Design, Standardization, Platform Integration, and Smart Automation is essential. You should also have expertise in database performance tuning and optimization for data enrichment and integration, reporting, and dashboarding. A Bachelor's or Master's degree in STEM is required, with a Master's degree preferred. Additionally, you should have a strong capability to influence business outcomes and decisions in collaboration with AIM leadership and business stakeholders, demonstrating thought leadership in partner meetings while leading from the front to drive innovative solutions with excellent stakeholder management. Your excellent verbal and written communication skills will enable you to communicate seamlessly across team members, stakeholders, and cross-functional teams.
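As a concrete, hedged illustration of the rule-based data quality checks and break alerts described above, here is a small pandas sketch; the rules and column names are invented, since real critical-data-element rules would come from the governance framework.

```python
import pandas as pd

# Illustrative rules keyed by name; each returns a boolean pass mask.
RULES = {
    "matter_id_not_null": lambda df: df["matter_id"].notna(),
    "amount_non_negative": lambda df: df["amount"] >= 0,
    "status_in_domain": lambda df: df["status"].isin(["OPEN", "CLOSED"]),
}

def run_quality_checks(df: pd.DataFrame) -> pd.DataFrame:
    """Evaluate each rule and return a break report for alerting."""
    rows = []
    for name, rule in RULES.items():
        passed = rule(df)
        rows.append({
            "rule": name,
            "breaks": int((~passed).sum()),
            "pass_rate": round(float(passed.mean()), 4),
        })
    return pd.DataFrame(rows)

report = run_quality_checks(pd.DataFrame({
    "matter_id": ["M1", None, "M3"],
    "amount": [100.0, -5.0, 40.0],
    "status": ["OPEN", "CLOSED", "PENDING"],
}))
print(report)  # rows with breaks > 0 would feed the escalation workflow
```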
Posted 1 month ago
10.0 - 14.0 years
0 Lacs
Pune, Maharashtra
On-site
As a Data Engineer at Deutsche Bank in Pune, India, you will be responsible for developing and delivering engineering solutions to achieve business objectives. You are expected to have a strong grasp of essential engineering principles and the root cause analysis skills to address enhancements and fixes in product reliability and resiliency. You should be capable of working independently on medium to large projects with strict deadlines and of adapting to a cross-application, mixed technical environment. The role requires hands-on development experience in ETL, Big Data, Hadoop, Spark, and GCP, following an agile methodology. Collaboration with a geographically dispersed team is essential in this role. The position is part of the Compliance tech internal development team in India, focusing on delivering improvements in compliance tech capabilities to meet regulatory commitments and mandates. You will be involved in analyzing data sets, designing stable data ingestion workflows, and integrating them into existing workflows. Additionally, you will work closely with team members and stakeholders to provide ETL solutions, develop analytics algorithms, and handle data sourcing in Hadoop and GCP. Your responsibilities include unit testing, UAT deployment, end-user sign-off, and supporting production and release management teams. To excel in this role, you should have over 10 years of coding experience in reputable organizations, proficiency in technologies such as Hadoop, Python, Spark, SQL, Unix, and Hive, as well as hands-on experience with Bitbucket and CI/CD pipelines. Knowledge of data security in on-prem and GCP environments, cloud services, and data quality dimensions is crucial. Experience in regulatory delivery environments, banking, test-driven development, and data visualization tools like Tableau would be advantageous. At Deutsche Bank, you will receive support through training, coaching, and a culture of continuous learning to enhance your career progression. The company fosters a collaborative environment where employees are encouraged to act responsibly, think commercially, and take initiative. Together, we strive for excellence and celebrate the achievements of our diverse workforce. Deutsche Bank promotes a positive, fair, and inclusive work environment and welcomes applications from all individuals. For more information about Deutsche Bank and our values, please visit our company website: https://www.db.com/company/company.htm
Posted 1 month ago
5.0 - 9.0 years
0 Lacs
Pune, Maharashtra
On-site
ZS is a place where passion changes lives. As a management consulting and technology firm focused on improving life and how we live it, our most valuable asset is our people. Here you'll work side-by-side with a powerful collective of thinkers and experts shaping life-changing solutions for patients, caregivers, and consumers, worldwide. ZSers drive impact by bringing a client-first mentality to each and every engagement. We partner collaboratively with our clients to develop custom solutions and technology products that create value and deliver company results across critical areas of their business. Bring your curiosity for learning, bold ideas, courage, and passion to drive life-changing impact to ZS. At ZS, we honor the visible and invisible elements of our identities, personal experiences, and belief systems: the ones that comprise us as individuals, shape who we are, and make us unique. We believe your personal interests, identities, and desire to learn are part of your success here. Learn more about our diversity, equity, and inclusion efforts and the networks ZS supports to assist our ZSers in cultivating community spaces, obtaining the resources they need to thrive, and sharing the messages they are passionate about. **What You'll Do** - Design & implement various innovative solutions in the realms of clinical trials, data management, and analytics. - Work on designing a common data model based on clinical standards like SDTM / ADaM / ISS / ISE. - Design a safety data mart for cross-study analysis and exploratory analysis. - Build analysis-ready datasets that serve various safety and exploratory use cases across clinical trials. - Experience in building use cases like safety signal detection, reverse translation research, and combination therapy exploration. - As a solution lead, your role will involve planning, offering tech consultancy, conducting assessments, and executing technology projects. - Support leaders in engaging with clients to review clinical data models, determining requirements, establishing safety use cases, and delivering solutions that have a significant impact. - Lead requirements gathering activities and collaborate with the product owner to prioritize and groom the backlog to ensure an appropriate level of detail is captured at the right time. - Review and author high-level user stories and develop related tasks, acceptance criteria, and review test cases. - Work closely with project teams in creating requirement specifications, requirement traceability metrics, user guides, and other project requirement documents. - Plan and facilitate various requirement gathering and solution meetings and artifacts. - Create process flows based on client and internal project discussions. - Perform business process modeling, data flow, user experience modeling, and basic solution architecture diagramming. - Build a Business Analysis capability, mentoring and upskilling more junior colleagues, and contributing to the growth of a BA community. **What You'll Bring** - Bachelor's degree in Engineering / Pharma / Bioinformatics / Medicine or related disciplines. - Master's degree in Business Analysis, Engineering, or Science preferred. - Experience working within the Life Science domain as a solution architect / business analyst / data analyst is required. - Experience with clinical data standards like CDISC (SDTM, ADaM) and safety data marts is required. - Experience with FHIR, HL7, USDM is preferred.
- Experience with data mapping and transformation of clinical data using various tools like TAMR, Oracle LSH, etc. - Experience generating SDTM and ADaM datasets as part of statistical programming deliverables of clinical trials. - Experience designing clinical data models as part of SDTM, ADaM, or safety data marts for submission or exploratory analysis. - Experience building use cases like safety signal detection, reverse translation research, and combination therapy exploration for exploratory analysis of cross-study clinical data. - Experience working in any of clinical trial design, data management, analytics, product implementation, and integration, such as EDC (Rave, Veeva, InForm), non-EDC (ePRO, LAB, eCOA), clinical data repository (CDR, SAS LSAF, Oracle LSH, eClinical elluminate), Metadata Repository (MDR, Nurocor, Sycamore, Formedix), statistical computing environment (Domino, SAS Viya), CTMS, eTMF, CDMS is preferred. - Strong verbal and written communication skills with the ability to articulate results and issues to internal and client teams. - Experience driving requirements discussions, workshops, and coordinating with internal and external stakeholders, across time zones, during the planning and delivery of technology projects is required. - Experience working in end-to-end Clinical Data Repository implementation within the Biometrics space is preferred. - Exposure to clinical data standards like CDISC (SDTM, ADaM), FHIR, HL7, USDM is required. - Experience in building and delivering GxP-compliant solutions for large enterprise programs is required. - Exposure to programming languages like R, Python, and SAS is preferred. **Perks & Benefits** ZS offers a comprehensive total rewards package including health and well-being, financial planning, annual leave, personal growth, and professional development. Our robust skills development programs, multiple career progression options, internal mobility paths, and collaborative culture empower you to thrive as an individual and global team member. We are committed to giving our employees a flexible and connected way of working. A flexible and connected ZS allows us to combine work from home and on-site presence at clients/ZS offices for the majority of our week. The magic of ZS culture and innovation thrives in both planned and spontaneous face-to-face connections. **Travel** Travel is a requirement at ZS for client-facing ZSers; business needs of your project and client are the priority. While some projects may be local, all client-facing ZSers should be prepared to travel as needed. Travel provides opportunities to strengthen client relationships, gain diverse experiences, and enhance professional growth by working in different environments and cultures. **Considering Applying?** At ZS, we're building a diverse and inclusive company where people bring their passions to inspire life-changing impact and deliver better outcomes for all. We are most interested in finding the best candidate for the job and recognize the value that candidates with all backgrounds, including non-traditional ones, bring. If you are interested in joining us, we encourage you to apply even if you don't meet 100% of the requirements listed above. ZS is an equal opportunity employer and is committed to providing equal employment and advancement opportunities without regard to any class protected by applicable law.
**To Complete Your Application** Candidates must possess or be able to obtain work authorization for their intended country of employment. An online application, including a full set of transcripts (official or unofficial), is required to be considered. NO AGENCY CALLS, PLEASE. **Find Out More At:** www.zs.com
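To give a flavor of the SDTM mapping work referenced in this posting, here is a minimal pandas sketch deriving a DM-like domain from a raw demographics extract; the study ID, source columns, and mapping are invented, and a real mapping would follow the study specification.

```python
import pandas as pd

# Raw demographics as they might land from an EDC export (columns invented).
raw = pd.DataFrame({
    "subj": ["001", "002"],
    "birth_sex": ["M", "F"],
    "age_years": [54, 61],
})

# Map to a minimal SDTM DM-like structure: standard variable names,
# controlled terminology, and a derived USUBJID.
dm = pd.DataFrame({
    "STUDYID": "ABC-101",
    "DOMAIN": "DM",
    "USUBJID": "ABC-101-" + raw["subj"],
    "SEX": raw["birth_sex"],   # source values already match CDISC CT (M/F)
    "AGE": raw["age_years"],
    "AGEU": "YEARS",
})
print(dm)
```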
Posted 1 month ago
8.0 - 12.0 years
0 Lacs
Thiruvananthapuram, Kerala
On-site
As an Enterprise Solution Architect specializing in Java and Python, you will be responsible for delivering custom integrated solutions to clients across various development layers. Your expertise in full stack development, backend services, middleware, and UI/UX will be crucial in meeting client requirements. It is essential that you have hands-on experience deploying solutions to the AWS cloud. Your role will involve facilitating requirements gathering and analysis workshops to capture both functional and non-functional requirements from clients. You will then effectively communicate these requirements to internal product and development teams. Documenting and designing client-specific solution requirements, including acceptance criteria and the features necessary to align with client business needs, will be a key aspect of your responsibilities. You should be well-versed in enterprise software full stack development, with proficiency in backend development, scripting languages such as Java and Python, and experience with DB connectors. A bachelor's degree in engineering or equivalent is required, with a degree in power systems considered a plus. With 8 to 10 years of experience in the electricity industry related to software solution architecting, you are expected to possess strong IT Enterprise Architecture skills, including knowledge of ITNW, data flow, and complex SW application deployments. Your ability to trace requirements to design specifications and develop test plans/cases is crucial. Automation of processes and utilization of scripting solutions to support product development are skills that will be valuable in this role. Furthermore, you should have hands-on experience in Java and Python enterprise application development. A good understanding of emerging energy industry trends, particularly in DER management, distribution planning, IT, SCADA, and asset management, is highly desirable. Nice-to-have qualifications include a master's degree, AWS certification, hands-on experience in DevSecOps and cybersecurity, familiarity with cloud-based solutions (Azure and/or GCP), utility integration experience, and knowledge of power system analysis software like OpenDSS, CYME, PowerFactory, Synergi, etc. Your expertise and experience as an Enterprise Solution Architect will play a crucial role in designing and implementing innovative solutions to meet the evolving needs of utility clients in the energy industry.
Posted 1 month ago
4.0 - 8.0 years
0 Lacs
Karnataka
On-site
The Workday Sr Integration / Extend Developer is an integral part of the HR Tech team and possesses profound technical expertise in Workday Integration tools. You will be required to demonstrate strong problem-solving skills and collaborate effectively with HR, IT, and business stakeholders to ensure seamless data flow and system connectivity. Your role as a key technical expert involves supporting a portfolio of existing integrations and closely working with cross-functional teams to comprehend business requirements and translate them into scalable and efficient integration solutions. You must have a strong knowledge of core design principles, common data modeling and patterns, project implementation methodology, and a successful track record of delivering high-quality integrations. Your responsibilities will include designing, developing, testing, and maintaining integrations using various Workday tools such as Workday Studio, Core Connectors, EIBs, and APIs. Additionally, you will be expected to troubleshoot complex issues, optimize integration performance, and ensure data security and compliance. Proactively identifying opportunities for process automation, system enhancements, and integration efficiencies to support the evolving needs of the business will also be a crucial aspect of your role. As the Workday Sr. Integration / Extend Developer, you will lead the design, build, and testing of Workday integration code base, work with business stakeholders to resolve integration-related issues, and enhance integration performance and system efficiency. Ensuring that integrations adhere to security best practices, data privacy regulations, and compliance standards will be a key focus area. You will also be responsible for leading integration testing activities, preparing test scripts, conducting Unit and UAT testing, and documenting integration processes and configurations for future reference. To be successful in this role, you should have a Bachelor's degree in computer science, engineering, or a related field, along with 6+ years of demonstrated ability in data migration, integration development, report building / RaaS, or software development. A minimum of 4+ years of experience in Workday Integrations development, including proficiency in Workday Studio, Core Connectors, EIBs, Web Services (SOAP, REST), Extend, and Workday APIs is required. Prior experience with Workday Extend, developing at least 2+ app use cases, is also necessary. You should possess hands-on Workday experience developing and supporting end-to-end Integrations across multiple functions, such as Core HCM, Compensation, Recruiting, Learning, Finance, Benefits, IT, and Procurement. Additionally, experience in all phases of the technology implementation lifecycle, leading design sessions, and proficiency in RaaS, EDI, Web Services, XSLT, Java, .Net, or other integration technology is essential. Proficiency in MVEL and XSLT for writing custom business logic within Workday Studio Integrations, familiarity with XML Transformations, Namespaces, XSD, SOAP and REST APIs, ServiceNow case management, agile methodologies, and effective communication skills are also required. Labcorp Is Proud To Be An Equal Opportunity Employer. We encourage all to apply. If you are an individual with a disability who needs assistance using our online tools to search and apply for jobs, or needs an accommodation, please visit our accessibility site or contact us at Labcorp Accessibility. 
For more information about how we collect and store your personal data, please see our Privacy Statement.
Posted 1 month ago
5.0 - 9.0 years
0 Lacs
Hyderabad, Telangana
On-site
As a Data Engineer with 5+ years of experience, you will be responsible for designing, developing, and maintaining scalable data pipelines using Google Cloud Dataproc and Dataflow. Your primary focus will be on processing and analyzing large datasets while ensuring data integrity and accessibility. The role requires a Bachelor's degree in Computer Science, Information Technology, or a related field. Along with your academic background, you should have a strong technical skill set, including proficiency in Google Cloud Dataflow and Dataproc, along with a solid understanding of SQL and data modeling concepts. Experience with tools like BigQuery, Cloud Storage, and other GCP services will be essential for this position. Additionally, familiarity with programming languages like Python or Java will be advantageous. In addition to your technical expertise, soft skills are equally important for success in this role. You should possess excellent problem-solving abilities, strong communication skills, and a collaborative mindset to work effectively within a team environment. If you are passionate about leveraging GCP tools to process and analyze data, and if you meet the mandatory skills criteria of GCP Dataproc and Dataflow, we encourage you to share your resume with us at gkarthik@softpathtech.com or careers@softpathtech.com. Join our team and contribute to building efficient and reliable data solutions with cutting-edge technologies.
Posted 1 month ago
4.0 - 6.0 years
13 - 18 Lacs
Chennai
Work from Office
Fueling Brains is a growing, vibrant organization poised to change the narrative of Education. We are looking for individuals who are passionate about transforming the world of education through a holistic, whole-brain approach to the development of young children. Children impacted by our program will grow into well-rounded, well-regulated, and joyful adults who serve their community and shape the future. We bring together the best of educational science, technology, and childcare expertise to unveil the child's infinite potential.

Location: Remote or Chennai, India
Duration: 2 months
Engagement: Contract

We are looking for a skilled Architect with strong experience in Adobe InDesign to join our team for a short-term project. This role requires someone who can bring both architectural expertise and visual structuring skills to the table.

Key Responsibilities
Apply architectural knowledge to support ongoing design-related tasks.
Work hands-on with Adobe InDesign to format, structure, and lay out content with clarity and consistency.
Collaborate with the internal team to ensure design outputs meet technical and visual standards.
Handle iterative edits and version control for high-precision deliverables.

Required Skills
Degree in Architecture or a related field.
Proven hands-on experience with Adobe InDesign.
Minimum of 4-6 years of experience.
Strong attention to layout, visual hierarchy, and design consistency.
Ability to work independently and meet project deadlines.
Prior experience handling architectural content, reports, or visual documentation is a plus.

What's in it for you
Short-term impactful project with a collaborative team.
Flexibility to work remotely or from Chennai.
Opportunity to contribute your architectural and design skills to a dynamic initiative.

Fueling Brains is an equal-opportunity workplace, and we are committed to building and fostering an environment where our employees feel included, valued, and heard. We strongly encourage applications from Indigenous peoples, racialized people, people with disabilities, people from gender and sexually diverse communities, and/or people with intersectional identities. We thank all applicants; however, only those selected for an interview will be contacted.
Posted 1 month ago
6.0 - 11.0 years
6 - 10 Lacs
Hyderabad
Work from Office
About the Role In this opportunity, as Senior Data Engineer, you will: Develop and maintain data solutions using resources such as dbt, Alteryx, and Python. Design and optimize data pipelines, ensuring efficient data flow and processing. Work extensively with databases, SQL, and various data formats including JSON, XML, and CSV. Tune and optimize queries to enhance performance and reliability. Develop high-quality code in SQL, dbt, and Python, adhering to best practices. Understand and implement data automation and API integrations. Leverage AI capabilities to enhance data engineering practices. Understand integration points related to upstream and downstream requirements. Proactively manage tasks and work towards completion against tight deadlines. Analyze existing processes and offer suggestions for improvement. About You You're a fit for the role of Senior Data Engineer if your background includes: Strong interest and knowledge in data engineering principles and methods. 6+ years of experience developing data solutions or pipelines. 6+ years of hands-on experience with databases and SQL. 2+ years of experience programming in an additional language. 2+ years of experience in query tuning and optimization. Experience working with SQL, JSON, XML, and CSV content. Understanding of data automation and API integration. Familiarity with AI capabilities and their application in data engineering. Ability to adhere to best practices for developing programmatic solutions. Strong problem-solving skills and ability to work independently. What's in it For You Hybrid Work Model: We've adopted a flexible hybrid working environment (2-3 days a week in the office depending on the role) for our office-based roles while delivering a seamless experience that is digitally and physically connected. Flexibility & Work-Life Balance: Flex My Way is a set of supportive workplace policies designed to help manage personal and professional responsibilities, whether caring for family, giving back to the community, or finding time to refresh and reset. This builds upon our flexible work arrangements, including work from anywhere for up to 8 weeks per year, empowering employees to achieve a better work-life balance. Career Development and Growth: By fostering a culture of continuous learning and skill development, we prepare our talent to tackle tomorrow's challenges and deliver real-world solutions. Our Grow My Way programming and skills-first approach ensures you have the tools and knowledge to grow, lead, and thrive in an AI-enabled future. Industry Competitive Benefits: We offer comprehensive benefit plans including flexible vacation, two company-wide Mental Health Days off, access to the Headspace app, retirement savings, tuition reimbursement, employee incentive programs, and resources for mental, physical, and financial wellbeing. Culture: Globally recognized, award-winning reputation for inclusion and belonging, flexibility, work-life balance, and more. We live by our values: Obsess over our Customers, Compete to Win, Challenge (Y)our Thinking, Act Fast / Learn Fast, and Stronger Together. Social Impact: Make an impact in your community with our Social Impact Institute. We offer employees two paid volunteer days off annually and opportunities to get involved with pro-bono consulting projects and Environmental, Social, and Governance (ESG) initiatives. Making a Real-World Impact: We are one of the few companies globally that helps its customers pursue justice, truth, and transparency.
Together, with the professionals and institutions we serve, we help uphold the rule of law, turn the wheels of commerce, catch bad actors, report the facts, and provide trusted, unbiased information to people all over the world. Thomson Reuters informs the way forward by bringing together the trusted content and technology that people and organizations need to make the right decisions. We serve professionals across legal, tax, accounting, compliance, government, and media. Our products combine highly specialized software and insights to empower professionals with the data, intelligence, and solutions needed to make informed decisions, and to help institutions in their pursuit of justice, truth, and transparency. Reuters, part of Thomson Reuters, is a world-leading provider of trusted journalism and news. We are powered by the talents of 26,000 employees across more than 70 countries, where everyone has a chance to contribute and grow professionally in flexible work environments. At a time when objectivity, accuracy, fairness, and transparency are under attack, we consider it our duty to pursue them. Sound exciting? Join us and help shape the industries that move society forward. As a global business, we rely on the unique backgrounds, perspectives, and experiences of all employees to deliver on our business goals. To ensure we can do that, we seek talented, qualified employees in all our operations around the world regardless of race, color, sex/gender, including pregnancy, gender identity and expression, national origin, religion, sexual orientation, disability, age, marital status, citizen status, veteran status, or any other protected classification under applicable law. Thomson Reuters is proud to be an Equal Employment Opportunity Employer providing a drug-free workplace. We also make reasonable accommodations for qualified individuals with disabilities and for sincerely held religious beliefs in accordance with applicable law. More information on requesting an accommodation here. Learn more on how to protect yourself from fraudulent job postings here. More information about Thomson Reuters can be found on thomsonreuters.com.
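To make the JSON/XML/CSV handling listed in this role concrete, here is a small Python sketch that normalizes the three formats into one record shape using only the standard library; the payloads and target shape are invented for illustration.

```python
import csv
import io
import json
import xml.etree.ElementTree as ET

# Toy payloads standing in for real feeds.
json_src = '[{"id": 1, "value": 10}]'
csv_src = "id,value\n2,20\n"
xml_src = "<rows><row id='3' value='30'/></rows>"

def normalize_json(text: str) -> list[tuple[int, float]]:
    return [(int(r["id"]), float(r["value"])) for r in json.loads(text)]

def normalize_csv(text: str) -> list[tuple[int, float]]:
    return [(int(r["id"]), float(r["value"])) for r in csv.DictReader(io.StringIO(text))]

def normalize_xml(text: str) -> list[tuple[int, float]]:
    return [(int(r.get("id")), float(r.get("value"))) for r in ET.fromstring(text).iter("row")]

records = normalize_json(json_src) + normalize_csv(csv_src) + normalize_xml(xml_src)
print(records)  # [(1, 10.0), (2, 20.0), (3, 30.0)]: one shape, three formats
```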
Posted 1 month ago
5.0 - 12.0 years
8 - 12 Lacs
Bengaluru
Work from Office
The Trade Surveillance team is responsible for assisting the client in validating the exceptions generated in the system. The incumbent will primarily be responsible for checking the alerts/exceptions generated by the existing modules developed by the client, performing a daily review of all exceptions, closing them out with an appropriate rationale, and escalating to the client if there are any true exceptions. As part of the Trade Surveillance team, the candidate will be an expert in the process and should be able to perform the task with minimal support from senior team members. The incumbent should be able to handle queries from junior team members, share best practices with them, and help them come up the learning curve faster. Professionals in this role will: Be required to have a strong understanding of investment instruments like equities, debt, mortgages, derivatives, etc. Have a sound understanding of different Trade Surveillance modules and perform comprehensive investigations on potentially non-compliant trades. Regularly monitor and understand current market conditions, regulations, and changes. Have a thorough understanding of the client's IT architecture, data flows, and organizational structure, and be able to navigate through the system to find answers and resolve queries. Have frequent interactions with business groups, including the Vice Presidents and Executive Directors of the onshore Trade Surveillance team. Key Responsibilities Functional Responsibilities: Working on daily exceptions. Preparing and updating the client SOPs as and when required. Identifying gaps in existing processes and suggesting enhancements. Handling queries from junior team members and helping them learn the process. Demonstrating ownership of the activities performed and being accountable for the overall delivery of some work types within the team. Functional Competencies: Sound understanding of investment instruments like equities, derivatives, fixed income instruments, etc. Strong Microsoft Office knowledge is required. Experience in handling different exceptions of the Trade Surveillance modules. Sound knowledge of the Bloomberg terminal and its different screens. Key Competencies Qualifications: MBA - Finance / CFA, Law, or Compliance-related qualification. Capital markets knowledge / NCFM certifications preferred. Experience: 3-8 years of experience in a Trade Surveillance role. Behavioral Competencies: Team working, Client Centricity, Entrepreneurial, Communication, Clarity of Thought, Self-awareness.
Posted 1 month ago
6.0 - 10.0 years
1 - 1 Lacs
Chennai
Hybrid
Overview: TekWissen is a global workforce management provider throughout India and many other countries in the world. The client below is a global company with shared ideals and a deep sense of family. From our earliest days as a pioneer of modern transportation, we have sought to make the world a better place, one that benefits lives, communities, and the planet. Job Title: Specialty Development Practitioner Location: Chennai Work Type: Hybrid Position Description: At the client's Credit Company, we are modernizing our enterprise data warehouse in Google Cloud to enhance data, analytics, and AI/ML capabilities, improve customer experience, ensure regulatory compliance, and boost operational efficiencies. As a GCP Data Engineer, you will integrate data from various sources into novel data products. You will build upon existing analytical data, including merging historical data from legacy platforms with data ingested from new platforms. You will also analyze and manipulate large datasets, activating data assets to enable enterprise platforms and analytics within GCP. You will design and implement the transformation and modernization on GCP, creating scalable data pipelines that land data from source applications, integrate it into subject areas, and build data marts and products for analytics solutions. You will also conduct deep-dive analysis of Current State Receivables and Originations data in our data warehouse, performing impact analysis related to the client's Credit North America modernization and providing implementation solutions. Moreover, you will partner closely with our AI, data science, and product teams, developing creative solutions that build the future for the client's Credit. Experience with large-scale solutions and operationalizing data warehouses, data lakes, and analytics platforms on Google Cloud Platform or other cloud environments is a must. We are looking for candidates with a broad set of analytical and technology skills across these areas who can demonstrate an ability to design the right solutions with the appropriate combination of GCP and third-party technologies for deployment on Google Cloud Platform. Skills Required: BigQuery, Dataflow, Dataform, Data Fusion, Dataproc, Cloud Composer, Airflow, Cloud SQL, Compute Engine, Google Cloud Platform. Experience Required: GCP Data Engineer Certified. Successfully designed and implemented data warehouses and ETL processes for over five years, delivering high-quality data solutions. 5+ years of complex SQL development experience. 2+ years of experience with programming languages such as Python, Java, or Apache Beam. Experienced cloud engineer with 3+ years of GCP expertise, specializing in managing cloud infrastructure and applications through to production-scale solutions.
Big Query,, Data Flow, DataForm, Data Fusion, Dataproc, Cloud Composer, AIRFLOW, Cloud SQL, Compute Engine, Google Cloud Platform Biq Query, Data Flow, Dataproc, Data Fusion, TERRAFORM, Tekton,Cloud SQL, AIRFLOW, POSTGRES, Airflow PySpark, Python, API, cloudbuild, App Engine, Apache Kafka, Pub/Sub, AI/ML, Kubernetes Experience Preferred: In-depth understanding of GCP's underlying architecture and hands-on experience of crucial GCP services, especially those related to data processing (Batch/Real Time) leveraging Terraform, Big Query, Dataflow, Pub/Sub, Data form, astronomer, Data Fusion, DataProc, Pyspark, Cloud Composer/Air Flow, Cloud SQL, Compute Engine, Cloud Functions, Cloud Run, Cloud build and App Engine, alongside and storage including Cloud Storage DevOps tools such as Tekton, GitHub, Terraform, Docker. Expert in designing, optimizing, and troubleshooting complex data pipelines. Experience developing with microservice architecture from container orchestration framework. Experience in designing pipelines and architectures for data processing. Passion and self-motivation to develop/experiment/implement state-of-the-art data engineering methods/techniques. Self-directed, work independently with minimal supervision, and adapts to ambiguous environments. Evidence of a proactive problem-solving mindset and willingness to take the initiative. Strong prioritization, collaboration & coordination skills, and ability to simplify and communicate complex ideas with cross-functional teams and all levels of management. Proven ability to juggle multiple responsibilities and competing demands while maintaining a high level of productivity. Data engineering or development experience gained in a regulated financial environment. Experience in coaching and mentoring Data Engineers Project management tools like Atlassian JIRA Experience working in an implementation team from concept to operations, providing deep technical subject matter expertise for successful deployment. Experience with data security, governance, and compliance best practices in the cloud. Experience with AI solutions or platforms that support AI solutions Experience using data science concepts on production datasets to generate insights Experience Range: 5+ years Education Required: Bachelor's Degree TekWissen® Group is an equal opportunity employer supporting workforce diversity.
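As a rough illustration of the pipeline pattern this posting describes (landing files from source applications and writing them into a BigQuery subject area), a minimal Apache Beam job runnable on Dataflow might look like the sketch below. The bucket path, table name, and column layout are assumptions for the example, not the client's actual schema.

```python
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

# Hypothetical landing bucket and target table, for illustration only.
RAW_FILES = "gs://example-raw-landing/receivables/*.csv"
TARGET_TABLE = "example-project:subject_area.receivables"

def parse_row(line: str) -> dict:
    """Map one CSV line onto the columns the example table expects (naive split; a real job would use a CSV parser)."""
    account_id, amount, as_of_date = line.split(",")
    return {"account_id": account_id, "amount": float(amount), "as_of_date": as_of_date}

def run() -> None:
    # Pass --runner=DataflowRunner plus project/region flags to execute on Dataflow.
    with beam.Pipeline(options=PipelineOptions()) as p:
        (
            p
            | "ReadLanding" >> beam.io.ReadFromText(RAW_FILES, skip_header_lines=1)
            | "Parse" >> beam.Map(parse_row)
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                TARGET_TABLE,
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
            )
        )

if __name__ == "__main__":
    run()
```

The same pipeline runs locally with the default DirectRunner for testing, which is one reason Beam is a common choice for the batch/streaming work this role describes.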
Posted 1 month ago
8.0 - 10.0 years
6 - 13 Lacs
Pune
Remote
Should have experience with a minimum of 4 end-to-end implementations. Strong communication skills, to work closely with customers and partners to gather requirements and design solutions. Strong NetSuite ERP knowledge and experience.
Posted 1 month ago
9.0 - 14.0 years
7 - 14 Lacs
Hyderabad, Pune
Hybrid
Role & responsibilities
Key skills required:
8 years of hands-on experience in cloud application architecture with a focus on creating scalable and reliable software systems
8 years of experience using Google Cloud Platform (GCP), including but not restricted to services like BigQuery, Cloud SQL, Firestore, and Cloud Composer
Experience in security, identity and access management; networking protocols such as TCP/IP and HTTPS; network security design, including segmentation, encryption, logging, and monitoring; network topologies, load balancing, and segmentation
Python for REST APIs and microservices; design and development guidance for Python with GCP (Cloud SQL/PostgreSQL, BigQuery); integration of Python APIs with FE applications built on React JS
Unit testing frameworks: Python (unittest, pytest), Java (JUnit, Spock and Groovy)
DevOps automation processes such as Jenkins and Docker deployments; code deployments on VMs
Validating an overall solution from the perspective of infrastructure performance, scalability, security, and capacity, and creating effective mitigation plans
Automation technologies: Terraform or Google Cloud Deployment Manager, Ansible
Implementing solutions and processes to manage cloud costs
Experience in providing solutions for web applications; requirements and design knowledge: React JS, Elastic Cache, GCP IAM, Managed Instance Groups, VMs, and GKE
Owning the end-to-end delivery of solutions, which will include developing, testing, and releasing Infrastructure as Code
Translate business requirements/user stories into practical, scalable solutions that leverage the functionality and best practices of HSBC
Executing technical feasibility assessments, solution estimations, and proposal development for moving identified workloads to GCP
Designing and implementing secure, scalable, and innovative solutions to meet the bank's requirements
Ability to interact and influence across all organizational levels on technical or business solutions
Certified Google Cloud Architect would be an add-on
Create and own scaling, capacity planning, configuration management, and monitoring of processes and procedures
Create, put into practice, and use cloud-native solutions
Lead the adoption of new cloud technologies and establish best practices for them
Experience establishing technical strategy and architecture at the enterprise level
Experience leading GCP Cloud project delivery
Collaborate with IT security to monitor cloud privacy; work with architecture, DevOps, data, and integration teams to ensure best practices are followed throughout cloud adoption
Respond to technical issues and provide guidance to the technical team
Mandatory Skills: GCP Storage, GCP BigQuery, GCP DataProc, GCP Vertex AI, GCP Spanner, GCP Dataprep, GCP Datastream, Google Analytics Hub, GCP Dataform, GCP Dataplex/Catalog, GCP Cloud Datastore/Firestore, GCP Datafusion, GCP Pub/Sub, GCP Cloud SQL, GCP Cloud Composer, Google Looker, GCP Data Architecture, Google Cloud IAM, GCP Bigtable, GCP Data Flow, GCP Cloud Pub/Sub
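As a hedged sketch of the "Python for REST APIs and microservices" requirement above, the example below exposes a BigQuery-backed JSON endpoint of the kind a React JS front end might call. The project, dataset, table, and column names are invented for illustration; a production service would add authentication, connection management, and fuller error handling.

```python
from fastapi import FastAPI, HTTPException
from google.cloud import bigquery

app = FastAPI()
client = bigquery.Client()  # relies on application-default credentials

# Hypothetical fully qualified table name for the example.
ACCOUNTS_TABLE = "example-project.retail.accounts"

@app.get("/accounts/{account_id}")
def get_account(account_id: str) -> dict:
    """Return one account row as JSON for a front-end application to render."""
    job = client.query(
        f"SELECT account_id, balance FROM `{ACCOUNTS_TABLE}` WHERE account_id = @id",
        job_config=bigquery.QueryJobConfig(
            query_parameters=[bigquery.ScalarQueryParameter("id", "STRING", account_id)]
        ),
    )
    rows = list(job.result())
    if not rows:
        raise HTTPException(status_code=404, detail="account not found")
    return dict(rows[0])

# Run locally with: uvicorn main:app --reload
```

The parameterized query avoids SQL injection from the path parameter, which matters in the security-conscious environment the posting describes.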
Posted 1 month ago
4.0 - 8.0 years
0 Lacs
Hyderabad, Telangana
On-site
As a GCP Developer, you will be responsible for maintaining the stability of production platforms, delivering new features, and minimizing technical debt across various technologies. You should have a minimum of 4 years of experience in the field, a strong commitment to maintaining high standards, and a genuine passion for ensuring quality in your work.
Proficiency in GCP, Python, Hadoop, Spark, Cloud, Scala, Streaming (Pub/Sub), Kafka, SQL, Dataproc, and Dataflow is essential for this role. Additionally, familiarity with data warehouses, distributed data platforms, and data lakes is required. You should possess knowledge of database definition, schema design, and Looker Views and Models. An understanding of data structures and algorithms is crucial for success in this position. Experience with CI/CD practices would be advantageous.
This position involves working in a dynamic environment across multiple locations such as Chennai, Hyderabad, and Bangalore. A total of 20 positions are available for qualified candidates.
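For context on the Spark and Kafka streaming skills listed above, a minimal PySpark Structured Streaming job that consumes a Kafka topic could look like the following sketch. The broker address, topic name, and event schema are assumptions for illustration only.

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import DoubleType, StringType, StructField, StructType

# Submit with the Kafka connector on the classpath, e.g.:
# spark-submit --packages org.apache.spark:spark-sql-kafka-0-10_2.12:3.5.0 this_job.py
spark = SparkSession.builder.appName("kafka-events").getOrCreate()

# Hypothetical event schema for the illustration.
schema = StructType([
    StructField("event_id", StringType()),
    StructField("amount", DoubleType()),
])

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # assumed broker address
    .option("subscribe", "payments")                   # assumed topic name
    .load()
    .select(from_json(col("value").cast("string"), schema).alias("e"))
    .select("e.*")
)

# Write to the console for the sketch; a real job would target BigQuery or GCS.
query = events.writeStream.format("console").outputMode("append").start()
query.awaitTermination()
```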
Posted 1 month ago
6.0 - 10.0 years
0 Lacs
Pune, Maharashtra
On-site
The role of a Pricing Implementation Lead at FedEx involves ensuring timely, accurate, and quality-checked setup of discounts and pricing for large customers/accounts using FedEx Pricing systems and relevant tools. It requires transforming and optimizing pricing processes and systems for enhanced efficiency, reduced turnaround times, and diminished human intervention through process simplification and automation initiatives.
As a Pricing Implementation Lead, your primary responsibilities include validating approved prices' completeness and implementing them for FedEx customers. You will specialize in facilitating pricing deployment across different FedEx operating companies, involving execution, testing, documentation, and optimizing contract administration pricing processes. Your duties will consist of entering pricing discount and rates information into FedEx enterprise pricing systems, configuring necessary parameters within the pricing systems, and auditing data entered in the pricing ecosystem. You will also be involved in planning, implementing pricing changes, and validating them for Pricing Contract administration. This role manages pricing-specific processes supporting all FedEx Enterprise Global Net Rate Pricing accounts, including Global Air Freight pricing. Collaboration with key business partners to effectively implement customers' pricing and discounting requirements, streamlining pricing processes through optimization and automation, and managing costs to achieve business efficiencies are crucial aspects of this position.
To excel in this role, you must possess the ability to independently run complex projects with minimal supervision, excellent communication skills across all levels, proficiency in business process configuration and project management tasks, hands-on experience working across complex enterprise systems, and a strong understanding of data flow and governance methodology. Additionally, technical skills in data extraction using SQL or SAS, data visualization using Power BI or Tableau, or data analysis using Advanced Excel are essential.
The ideal candidate for this position would have a background as a Business Analyst, Techno-Functional Analyst, System Analyst, Implementation Analyst, Consultant, or in process-oriented roles, with 6 to 10 years of relevant work experience. A Bachelor's degree or equivalent in Computer Science, MIS, Mathematics, Statistics, or a similar discipline is required, while a Master's degree or PhD is preferred.
FedEx is committed to fostering a diverse, equitable, and inclusive workforce and is an equal opportunity/affirmative action employer. The company values fair treatment, growth opportunities for all, and a people-first philosophy. FedEx's success is attributed to its team members, who are dedicated to delivering outstanding service to customers worldwide.
Posted 1 month ago
6.0 - 8.0 years
10 - 15 Lacs
Bengaluru
Work from Office
Job Summary
Synechron is seeking a highly skilled and proactive Data Engineer to join our dynamic data analytics team. In this role, you will be instrumental in designing, developing, and maintaining scalable data pipelines and solutions on the Google Cloud Platform (GCP). With your expertise, you'll enable data-driven decision-making, contribute to strategic business initiatives, and ensure robust data infrastructure. This position offers an opportunity to work in a collaborative environment with a focus on innovative technologies and continuous growth.
Software Requirements
Required:
Proficiency in data engineering tools and frameworks such as Hive, Apache Spark, and Python (version 3.x)
Extensive experience working with Google Cloud Platform (GCP) offerings including Dataflow, BigQuery, Cloud Storage, and Pub/Sub
Familiarity with Git, Jira, and Confluence for version control and collaboration
Preferred:
Experience with additional GCP services like Dataproc, Data Studio, or Cloud Composer
Exposure to other programming languages such as Java or Scala
Knowledge of data security best practices and tools
Overall Responsibilities
Design, develop, and optimize scalable data pipelines on GCP to support analytics and reporting needs
Collaborate with cross-functional teams to translate business requirements into technical solutions
Build and maintain data models, ensuring data quality, integrity, and security
Participate actively in code reviews, adhering to best practices and standards
Develop automated and efficient data workflows to improve system performance
Stay updated with emerging data engineering trends and continuously improve technical skills
Provide technical guidance and support to team members, fostering a collaborative environment
Ensure timely delivery of deliverables aligned with project milestones
Technical Skills (By Category)
Programming Languages: Essential: Python (required). Preferred: Java, Scala
Data Management & Databases: Experience with Hive, BigQuery, and relational databases; knowledge of data warehousing concepts and SQL proficiency
Cloud Technologies: Extensive hands-on experience with GCP services including Dataflow, BigQuery, Cloud Storage, Pub/Sub, and Composer; ability to build and optimize data pipelines leveraging GCP offerings
Frameworks & Libraries: Spark (PySpark preferred); Hadoop ecosystem experience is advantageous
Development Tools & Methodologies: Agile/Scrum methodologies, version control with Git, project tracking via Jira, documentation on Confluence
Security Protocols: Understanding of data security, privacy, and compliance standards
Experience Requirements
Minimum of 6-8 years in data or software engineering roles with a focus on data pipeline development
Proven experience in designing and implementing data solutions on cloud platforms, particularly GCP
Prior experience working in agile teams, participating in code reviews, and delivering end-to-end data projects
Experience working with cross-disciplinary teams and understanding varied stakeholder requirements
Exposure to industry best practices for data security, governance, and quality assurance is desired
Day-to-Day Activities
Attend daily stand-up meetings and contribute to project planning sessions
Collaborate with business analysts, data scientists, and other stakeholders to understand data needs
Develop, test, and deploy scalable data pipelines, ensuring efficiency and reliability (a Cloud Composer workflow sketch follows this posting)
Perform regular code reviews, provide constructive feedback, and uphold coding standards
Document technical solutions and maintain clear records of data workflows
Troubleshoot and resolve technical issues in data processing environments
Participate in continuous learning initiatives to stay abreast of technological developments
Support team members by sharing knowledge and resolving technical challenges
Qualifications
Bachelor's or Master's degree in Computer Science, Information Technology, or a related field
Relevant professional certifications in GCP (such as Google Cloud Professional Data Engineer) are preferred but not mandatory
Demonstrable experience in data engineering and cloud technologies
Professional Competencies
Strong analytical and problem-solving skills, with a focus on outcome-driven solutions
Excellent communication and interpersonal skills to effectively collaborate within teams and with stakeholders
Ability to work independently with minimal supervision and manage multiple priorities effectively
Adaptability to evolving technologies and project requirements
Demonstrated initiative in driving tasks forward and a continuous improvement mindset
Strong organizational skills with a focus on quality and attention to detail
SYNECHRON'S DIVERSITY & INCLUSION STATEMENT
Diversity & Inclusion are fundamental to our culture, and Synechron is proud to be an equal opportunity workplace and an affirmative action employer. Our Diversity, Equity, and Inclusion (DEI) initiative "Same Difference" is committed to fostering an inclusive culture promoting equality, diversity, and an environment that is respectful to all. We strongly believe that a diverse workforce helps build stronger, successful businesses as a global company. We encourage applicants from across diverse backgrounds, races, ethnicities, religions, ages, marital statuses, genders, sexual orientations, or disabilities to apply. We empower our global workforce by offering flexible workplace arrangements, mentoring, internal mobility, learning and development programs, and more. All employment decisions at Synechron are based on business needs, job requirements, and individual qualifications, without regard to the applicant's gender, gender identity, sexual orientation, race, ethnicity, disabled or veteran status, or any other characteristic protected by law.
Candidate Application Notice
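As referenced in the day-to-day activities above, here is a minimal Cloud Composer (Airflow) DAG sketch with two dependent tasks. The DAG id, schedule, and task bodies are placeholders; real tasks would wrap BigQuery loads or Dataflow launches.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

# Placeholder task bodies for illustration; production callables would
# use the BigQuery or Dataflow clients instead of printing.
def extract() -> None:
    print("pulling source data")

def load() -> None:
    print("loading the mart")

with DAG(
    dag_id="daily_mart_refresh",      # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                # Airflow 2.4+ keyword; older versions use schedule_interval
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task         # load runs only after extract succeeds
```

Deploying this file to the Composer environment's DAGs bucket is enough for the scheduler to pick it up, which keeps workflow changes reviewable through ordinary Git pull requests.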
Posted 1 month ago
8.0 - 12.0 years
30 - 35 Lacs
Pune
Work from Office
About The Role:
Job Title: Senior Engineer PD, AVP
Location: Pune, India
Role Description
Our team is part of the area Technology, Data, and Innovation (TDI) Private Bank. Within TDI, Partner data is the central client reference data system in Germany. As a core banking system, many banking processes and applications are integrated with it and communicate via more than 2,000 interfaces. From a technical perspective, we focus on mainframe but also build solutions on premise, in the cloud, with RESTful services, and with an Angular frontend. Next to maintenance and the implementation of new CTB requirements, the content focus also lies on the regulatory and tax topics surrounding a partner/client. We are looking for a highly motivated candidate for the Cloud Data Engineer area.
What we'll offer you
100% reimbursement under the childcare assistance benefit (gender neutral)
Sponsorship for industry-relevant certifications and education
Accident and term life insurance
Your key responsibilities
You are responsible for the implementation of the new project on GCP (Spark, Dataproc, Dataflow, BigQuery, Terraform, etc.) across the whole SDLC chain
You are responsible for supporting the migration of current functionalities to Google Cloud
You are responsible for the stability of the application landscape and support software releases
You also support L3 topics and application governance
You are responsible in the CTM area for coding as part of an agile team (Java, Scala, Spring Boot)
Your skills and experience
You have experience with databases (BigQuery, Cloud SQL, NoSQL, Hive, etc.) and development, preferably with Big Data and GCP technologies
Strong understanding of the Data Mesh approach and integration patterns
Understanding of Party data and integration with Product data
Your architectural skills for big data solutions, especially interface architecture, allow a fast start
You have experience in at least Spark, Java, Scala and Python, Maven, Artifactory, the Hadoop ecosystem, GitHub Actions, GitHub, and Terraform scripting
You have knowledge of customer reference data, customer opening processes, and preferably regulatory topics around know-your-customer processes
You can work very well in teams but also independently, and are constructive and target-oriented
Your English skills are good and you can communicate both professionally and informally in small talk with the team
How we'll support you
About us and our teams
Please visit our company website for further information: https://www.db.com/company/company.htm
We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.
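As a small illustration of the migration work this posting describes (moving current functionality onto Google Cloud), the sketch below loads landed files from Cloud Storage into a BigQuery staging table using the Python client. The bucket path, project, dataset, and table names are invented placeholders, not the bank's systems.

```python
from google.cloud import bigquery

client = bigquery.Client()  # uses application-default credentials

# Hypothetical bucket and table names for the example.
uri = "gs://example-migration-landing/partners/*.avro"
table_id = "example-project.partner_data.partners_staging"

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.AVRO,
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,  # full refresh each run
)
load_job = client.load_table_from_uri(uri, table_id, job_config=job_config)
load_job.result()  # block until the load job completes

print(f"loaded {client.get_table(table_id).num_rows} rows into {table_id}")
```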
Posted 1 month ago
5.0 - 10.0 years
16 - 20 Lacs
Pune
Work from Office
About The Role:
As a Senior Data Architect, you will be instrumental in shaping the bank's enterprise data landscape, supporting teams in designing, evolving, and implementing data architectures that align with the enterprise target state and enable scalable, compliant, and interoperable solutions. You will also serve as the go-to expert and trusted advisor on what good looks like in data architecture, helping to set high standards and drive continuous improvement across the organization. This role is ideal for an experienced data professional with deep technical expertise, strong solution architecture skills, and a proven ability to influence design decisions across both business and technology teams.
Responsibilities
1. Enterprise Data Architecture & Solution Design
Support teams in designing, evolving, and implementing data architectures that align with the enterprise target state and enable scalable, compliant, and interoperable solutions.
Serve as the go-to person for data architecture best practices and standards, helping to define and communicate what good looks like to ensure consistency and quality.
Lead and contribute to solution architecture for key programs, ensuring architectural decisions are well-documented, justified, and aligned to enterprise principles.
Work with engineering and platform teams to design end-to-end data flows, integration patterns, data processing pipelines, and storage strategies across structured and unstructured data.
Drive the application of modern data architecture principles, including event-driven architecture, data mesh, streaming, and decoupled data services.
2. Data Modelling and Semantics
Provide hands-on leadership in data modelling efforts, including the occasional creation and stewardship of conceptual, logical, and physical models that support enterprise data domains.
Partner with product and engineering teams to ensure data models are fit for purpose, extensible, and aligned with enterprise vocabularies and semantics.
Support modelling use cases across regulatory, operational, and analytical data assets.
3. Architecture Standards & Frameworks
Define and continuously improve data architecture standards, patterns, and reference architectures that support consistency and interoperability across platforms.
Embed standards into engineering workflows and tooling to encourage automation and reduce delivery friction.
Measure and report on adoption of architectural principles using architecture KPIs and compliance metrics.
4. Leadership, Collaboration & Strategy
Act as a technical advisor and architectural leader across initiatives, mentoring junior architects and supporting federated architecture teams in delivery.
Build strong partnerships with senior stakeholders across the business, CDIO, engineering, and infrastructure teams to ensure alignment and adoption of the architecture strategy.
Stay current with industry trends, regulatory changes, and emerging technologies, advising on their potential impact and application.
Skills
Extensive experience in data architecture, data engineering, or enterprise architecture, preferably within a global financial institution.
Deep understanding of data platforms, integration technologies, and architectural patterns for real-time and batch processing.
Proficiency with data architecture tools such as Sparx Enterprise Architect, ERwin, or similar.
Experience designing solutions in cloud and hybrid environments (e.g. GCP, AWS, or Azure), with knowledge of the associated data services.
Hands-on experience with data modelling, semantic layer design, and metadata-driven architecture approaches.
Strong grasp of data governance, privacy, security, and regulatory compliance, especially as they intersect with architectural decision-making.
Strategic mindset, with the ability to connect architectural goals to business value and communicate effectively with technical and non-technical stakeholders.
Experience working across business domains including Risk, Finance, Treasury, or Front Office functions.
Well-being & Benefits
Emotionally and mentally balanced: we support you in dealing with life crises, maintaining stability through illness, and maintaining good mental health.
Empowering managers who value your ideas and decisions. Show your positive attitude, determination, and open-mindedness.
A professional, passionate, and fun workplace with flexible work-from-home options.
A modern office with fun and relaxing areas to boost creativity.
Continuous learning culture with coaching and support from team experts.
Physically thriving: we support you in managing your physical health by taking appropriate preventive measures and providing a workplace that helps you thrive.
Private healthcare and life insurance with premium benefits for you and discounts for your loved ones.
Socially connected: we strongly believe in collaboration, inclusion, and feeling connected to open up new perspectives and strengthen our self-confidence and wellbeing.
Kids@TheOffice - support for unexpected events requiring you to care for your kids during work hours.
Enjoy retailer discounts, cultural and CSR activities, employee sport clubs, workshops, and more.
Financially secure: we support you to meet personal financial goals during your active career and for the future.
Competitive income, performance-based promotions, and a sense of purpose.
24 days holiday, loyalty days, and bank holidays (including weekdays for weekend bank holidays).
We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.
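To make the event-driven architecture and streaming principles mentioned above slightly more concrete, here is a minimal Python sketch of publishing and consuming a domain event with Google Cloud Pub/Sub. The project, topic, subscription, and payload are hypothetical placeholders.

```python
from concurrent.futures import TimeoutError

from google.cloud import pubsub_v1

PROJECT = "example-project"              # hypothetical
TOPIC = "party-data-events"              # hypothetical
SUBSCRIPTION = "party-data-events-sub"   # hypothetical

# Publisher side: a system of record emits an event when a client record changes.
publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path(PROJECT, TOPIC)
future = publisher.publish(topic_path, b'{"client_id": "42", "event": "address_change"}')
print("published message", future.result())  # blocks until the broker acknowledges

# Subscriber side: a decoupled downstream data product reacts to the event.
subscriber = pubsub_v1.SubscriberClient()
subscription_path = subscriber.subscription_path(PROJECT, SUBSCRIPTION)

def callback(message: pubsub_v1.subscriber.message.Message) -> None:
    print("received", message.data)
    message.ack()  # acknowledge so the message is not redelivered

streaming_pull = subscriber.subscribe(subscription_path, callback=callback)
try:
    streaming_pull.result(timeout=30)  # listen briefly for this sketch
except TimeoutError:
    streaming_pull.cancel()
```

Because producers and consumers share only the topic contract, new data products can subscribe without touching the publishing system, which is the decoupling argument behind the event-driven and data mesh principles this role promotes.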
Posted 1 month ago