
470 Data Flow Jobs

Set up a Job Alert
JobPe aggregates listings so they are easy to find and review, but you apply directly on the employer's job portal.

4.0 - 8.0 years

0 Lacs

Karnataka

On-site

The Workday Sr Integration / Extend Developer is an integral part of the HR Tech team and possesses deep technical expertise in Workday Integration tools. You will be required to demonstrate strong problem-solving skills and collaborate effectively with HR, IT, and business stakeholders to ensure seamless data flow and system connectivity. Your role as a key technical expert involves supporting a portfolio of existing integrations and working closely with cross-functional teams to understand business requirements and translate them into scalable, efficient integration solutions. You must have strong knowledge of core design principles, common data modeling patterns, and project implementation methodology, along with a successful track record of delivering high-quality integrations.

Your responsibilities will include designing, developing, testing, and maintaining integrations using Workday tools such as Workday Studio, Core Connectors, EIBs, and APIs. You will also troubleshoot complex issues, optimize integration performance, and ensure data security and compliance. Proactively identifying opportunities for process automation, system enhancements, and integration efficiencies to support the evolving needs of the business is another crucial aspect of the role. As the Workday Sr. Integration / Extend Developer, you will lead the design, build, and testing of the Workday integration code base, work with business stakeholders to resolve integration-related issues, and enhance integration performance and system efficiency. Ensuring that integrations adhere to security best practices, data privacy regulations, and compliance standards will be a key focus area. You will also lead integration testing activities, prepare test scripts, conduct unit and UAT testing, and document integration processes and configurations for future reference.

To be successful in this role, you should have a Bachelor's degree in computer science, engineering, or a related field, along with 6+ years of demonstrated ability in data migration, integration development, report building / RaaS, or software development. A minimum of 4+ years of experience in Workday Integrations development is required, including proficiency in Workday Studio, Core Connectors, EIBs, Web Services (SOAP, REST), Extend, and Workday APIs. Prior experience with Workday Extend, having developed at least 2+ app use cases, is also necessary. You should have hands-on Workday experience developing and supporting end-to-end integrations across multiple functions, such as Core HCM, Compensation, Recruiting, Learning, Finance, Benefits, IT, and Procurement. Experience in all phases of the technology implementation lifecycle, experience leading design sessions, and proficiency in RaaS, EDI, Web Services, XSLT, Java, .Net, or other integration technology are essential. Proficiency in MVEL and XSLT for writing custom business logic within Workday Studio integrations, familiarity with XML transformations, namespaces, XSD, SOAP and REST APIs, ServiceNow case management, and agile methodologies, and effective communication skills are also required.

Labcorp is proud to be an Equal Opportunity Employer. We encourage all to apply. If you are an individual with a disability who needs assistance using our online tools to search and apply for jobs, or needs an accommodation, please visit our accessibility site or contact us at Labcorp Accessibility. For more information about how we collect and store your personal data, please see our Privacy Statement.
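The posting leans on Workday's reporting and integration surface (RaaS, SOAP/REST APIs). As a rough, non-authoritative illustration of the kind of plumbing involved, the Python sketch below pulls a custom report through Workday RaaS (Report-as-a-Service) with basic auth; the tenant, report path, and credentials are hypothetical placeholders, not details from the listing.

```python
# Hypothetical sketch: pulling a Workday custom report via RaaS.
# The host, tenant, report owner/name, and credentials are placeholders.
import requests

RAAS_URL = (
    "https://wd2-impl-services1.workday.com/ccx/service/customreport2/"
    "acme_tenant/integration_user/Worker_Roster_Report"  # hypothetical report
)

def fetch_worker_roster(username: str, password: str) -> dict:
    """Call a Workday RaaS report and return its JSON payload."""
    response = requests.get(
        RAAS_URL,
        params={"format": "json"},  # RaaS can also return XML for XSLT pipelines
        auth=(username, password),  # integration system user (ISU) credentials
        timeout=30,
    )
    response.raise_for_status()
    return response.json()
```

In practice the same report could be requested as XML and transformed with XSLT inside a Workday Studio integration, which is where the MVEL/XSLT skills named above come in.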

Posted 1 day ago

Apply

5.0 - 9.0 years

0 Lacs

Hyderabad, Telangana

On-site

As a Data Engineer with 5+ years of experience, you will be responsible for designing, developing, and maintaining scalable data pipelines using Google Cloud Dataproc and Dataflow. Your primary focus will be on processing and analyzing large datasets while ensuring data integrity and accessibility.

The role requires a Bachelor's degree in Computer Science, Information Technology, or a related field. Along with your academic background, you should have a strong technical skill set, including proficiency in Google Cloud Dataflow and Dataproc and a solid understanding of SQL and data modeling concepts. Experience with tools like BigQuery, Cloud Storage, and other GCP services is essential for this position, and familiarity with programming languages like Python or Java is advantageous.

In addition to your technical expertise, soft skills are equally important for success in this role. You should possess excellent problem-solving abilities, strong communication skills, and a collaborative mindset to work effectively within a team.

If you are passionate about leveraging GCP tools to process and analyze data, and you meet the mandatory skills criteria of GCP Dataproc and Dataflow, we encourage you to share your resume with us at gkarthik@softpathtech.com/careers@softpathtech.com. Join our team and contribute to building efficient and reliable data solutions with cutting-edge technologies.
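For context on what such pipelines look like in practice, here is a minimal Apache Beam sketch of the read-transform-load pattern the posting describes, runnable on Dataflow; the project, bucket, dataset, and schema names are placeholder assumptions, not details from the listing.

```python
# Minimal Apache Beam pipeline: read JSON events from Cloud Storage,
# filter invalid rows, and append to a BigQuery table.
import json
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

def run():
    options = PipelineOptions(
        runner="DataflowRunner",       # executes on Google Cloud Dataflow
        project="example-project",     # placeholder project id
        region="asia-south1",
        temp_location="gs://example-bucket/tmp",
    )
    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadEvents" >> beam.io.ReadFromText("gs://example-bucket/events/*.json")
            | "ParseJson" >> beam.Map(json.loads)
            | "KeepValid" >> beam.Filter(lambda row: row.get("user_id") is not None)
            | "WriteToBQ" >> beam.io.WriteToBigQuery(
                "example-project:analytics.events",  # hypothetical table
                schema="user_id:STRING,event:STRING,ts:TIMESTAMP",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            )
        )

if __name__ == "__main__":
    run()
```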

Posted 2 days ago

Apply

4.0 - 6.0 years

13 - 18 Lacs

Chennai

Work from Office

{"company":" Fueling Brains is a growing, vibrant organization poised to change the narrative of Education. We are looking for individuals who are passionate about transforming the world of education through a holistic, whole-brain approach to the development of young children. Children impacted by our program will grow into well-rounded, well-regulated, and joyful adults who serve their community and shape the future. We bring together the best of educational science, technology, and childcare expertise to unveil the childs infinite potential. ","role":" Location : Remote or Chennai, India Duration : 2 Months Engagement : Contract We are looking for a skilled Architect with strong experience in Adobe InDesign to join our team for a short-term project. This role requires someone who can bring both architectural expertise and visual structuring skills to the table. Key Responsibilities Apply architectural knowledge to support ongoing design-related tasks. Work hands-on with Adobe InDesign to format, structure, and lay out content with clarity and consistency. Collaborate with the internal team to ensure design outputs meet technical and visual standards. Handle iterative edits and version control for high-precision deliverables. Required Skills Degree in Architecture or related field. Proven hands-on experience with Adobe InDesign . Minimum of 4-6 years experience. Strong attention to layout, visual hierarchy, and design consistency. Ability to work independently and meet project deadlines. Prior experience handling architectural content, reports, or visual documentation is a plus. What s in it for you Short-term impactful project with a collaborative team. Flexibility to work remotely or from Chennai. Opportunity to contribute your architectural and design skills to a dynamic initiative. Fueling Brains is an equal-opportunity workplace, and we are committed to building and fostering an environment where our employees feel included, valued, and heard. We strongly encourage applications from Indigenous peoples, racialized people, people with disabilities, people from gender and sexually diverse communities and/or people with intersectional identities. We thank all those applicants who have applied; however, only those selected for an interview will be contacted. "},"

Posted 2 days ago

Apply

6.0 - 11.0 years

6 - 10 Lacs

Hyderabad

Work from Office

About the Role
In this opportunity as Senior Data Engineer, you will: Develop and maintain data solutions using resources such as dbt, Alteryx, and Python. Design and optimize data pipelines, ensuring efficient data flow and processing. Work extensively with databases, SQL, and various data formats including JSON, XML, and CSV. Tune and optimize queries to enhance performance and reliability. Develop high-quality code in SQL, dbt, and Python, adhering to best practices. Understand and implement data automation and API integrations. Leverage AI capabilities to enhance data engineering practices. Understand integration points related to upstream and downstream requirements. Proactively manage tasks and work towards completion against tight deadlines. Analyze existing processes and offer suggestions for improvement.

About You
You're a fit for the role of Senior Data Engineer if your background includes: Strong interest and knowledge in data engineering principles and methods. 6+ years of experience developing data solutions or pipelines. 6+ years of hands-on experience with databases and SQL. 2+ years of experience programming in an additional language. 2+ years of experience in query tuning and optimization. Experience working with SQL, JSON, XML, and CSV content. Understanding of data automation and API integration. Familiarity with AI capabilities and their application in data engineering. Ability to adhere to best practices for developing programmatic solutions. Strong problem-solving skills and ability to work independently. #LI-SS6

What's in it For You
Hybrid Work Model: We've adopted a flexible hybrid working environment (2-3 days a week in the office depending on the role) for our office-based roles while delivering a seamless experience that is digitally and physically connected. Flexibility & Work-Life Balance: Flex My Way is a set of supportive workplace policies designed to help manage personal and professional responsibilities, whether caring for family, giving back to the community, or finding time to refresh and reset. This builds upon our flexible work arrangements, including work from anywhere for up to 8 weeks per year, empowering employees to achieve a better work-life balance. Career Development and Growth: By fostering a culture of continuous learning and skill development, we prepare our talent to tackle tomorrow's challenges and deliver real-world solutions. Our Grow My Way programming and skills-first approach ensure you have the tools and knowledge to grow, lead, and thrive in an AI-enabled future. Industry-Competitive Benefits: We offer comprehensive benefit plans including flexible vacation, two company-wide Mental Health Days off, access to the Headspace app, retirement savings, tuition reimbursement, employee incentive programs, and resources for mental, physical, and financial wellbeing. Culture: Globally recognized, award-winning reputation for inclusion and belonging, flexibility, work-life balance, and more. We live by our values: Obsess over our Customers, Compete to Win, Challenge (Y)our Thinking, Act Fast / Learn Fast, and Stronger Together. Social Impact: Make an impact in your community with our Social Impact Institute. We offer employees two paid volunteer days off annually and opportunities to get involved with pro-bono consulting projects and Environmental, Social, and Governance (ESG) initiatives. Making a Real-World Impact: We are one of the few companies globally that helps its customers pursue justice, truth, and transparency. Together, with the professionals and institutions we serve, we help uphold the rule of law, turn the wheels of commerce, catch bad actors, report the facts, and provide trusted, unbiased information to people all over the world.

Thomson Reuters informs the way forward by bringing together the trusted content and technology that people and organizations need to make the right decisions. We serve professionals across legal, tax, accounting, compliance, government, and media. Our products combine highly specialized software and insights to empower professionals with the data, intelligence, and solutions needed to make informed decisions and to help institutions in their pursuit of justice, truth, and transparency. Reuters, part of Thomson Reuters, is a world-leading provider of trusted journalism and news. We are powered by the talents of 26,000 employees across more than 70 countries, where everyone has a chance to contribute and grow professionally in flexible work environments. At a time when objectivity, accuracy, fairness, and transparency are under attack, we consider it our duty to pursue them. Sound exciting? Join us and help shape the industries that move society forward.

As a global business, we rely on the unique backgrounds, perspectives, and experiences of all employees to deliver on our business goals. To ensure we can do that, we seek talented, qualified employees in all our operations around the world regardless of race, color, sex/gender, including pregnancy, gender identity and expression, national origin, religion, sexual orientation, disability, age, marital status, citizen status, veteran status, or any other protected classification under applicable law. Thomson Reuters is proud to be an Equal Employment Opportunity Employer providing a drug-free workplace. We also make reasonable accommodations for qualified individuals with disabilities and for sincerely held religious beliefs in accordance with applicable law. More information on requesting an accommodation here. Learn more on how to protect yourself from fraudulent job postings here. More information about Thomson Reuters can be found on thomsonreuters.com.
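The role above asks for hands-on work with SQL, JSON, XML, and CSV content. Purely as an illustration (file layouts and tag names are assumed, not taken from the posting), this small Python sketch using only the standard library shows one way to normalize all three formats into a common list-of-dicts shape before loading:

```python
# Normalize CSV, JSON, and XML inputs into one list-of-dicts shape.
import csv
import json
import xml.etree.ElementTree as ET

def rows_from_csv(path):
    with open(path, newline="") as f:
        return list(csv.DictReader(f))  # header row becomes the dict keys

def rows_from_json(path):
    with open(path) as f:
        return json.load(f)  # assumes a top-level JSON array of objects

def rows_from_xml(path, record_tag="record"):
    root = ET.parse(path).getroot()
    return [
        {child.tag: child.text for child in rec}
        for rec in root.iter(record_tag)  # assumes flat <record> elements
    ]
```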

Posted 2 days ago

Apply

5.0 - 12.0 years

8 - 12 Lacs

Bengaluru

Work from Office

The Trade Surveillance team is responsible for assisting the client in validating the exceptions generated in the system. The incumbent will primarily be responsible for checking the alerts/exceptions generated by the existing modules developed by the client, performing a daily review of all exceptions, closing them out with an appropriate rationale, and escalating to the client if there are any true exceptions. The candidate will be an expert in the process and should be able to perform the task with minimal support from senior team members. The incumbent should also be able to handle queries from junior team members, share best practices with them, and help them come up the learning curve faster.

Professionals in this role will: Be required to have a strong understanding of investment instruments like equities, debt, mortgages, derivatives, etc. Have a sound understanding of different Trade Surveillance modules and perform comprehensive investigations on potentially non-compliant trades. Regularly monitor and understand current market conditions, regulations, and changes. Have a thorough understanding of the client's IT architecture, data flows, and organizational structure, and be able to navigate through the system to find answers and resolve queries. Have frequent interactions with business groups, including the Vice Presidents and Executive Directors of the onshore Trade Surveillance team.

Key Responsibilities
Functional Responsibilities: Working on daily exceptions. Preparing and updating the client SOPs as and when required. Identifying gaps in existing processes and suggesting enhancements. Handling queries from junior team members and helping them learn the process. Demonstrating ownership of the activities performed and being accountable for overall delivery of some work types within the team.
Functional Competencies: Sound understanding of investment instruments like equities, derivatives, fixed income instruments, etc. Strong Microsoft Office knowledge is required. Experience in handling different exceptions of the Trade Surveillance modules. Sound knowledge of the Bloomberg terminal and its different screens.

Key Competencies
Qualifications: MBA - Finance / CFA, Law, or Compliance-related qualification. Capital Markets knowledge/NCFM certifications preferred.
Experience: 3 - 8 years of experience in a Trade Surveillance role.
Behavioral Competencies: Team working, Client Centricity, Entrepreneurial, Communication, Clarity of Thought, Self-awareness.

Posted 2 days ago

Apply

6.0 - 10.0 years

1 - 1 Lacs

Chennai

Hybrid

Overview: TekWissen is a global workforce management provider with operations in India and many other countries around the world. The client below is a global company with shared ideals and a deep sense of family. From our earliest days as a pioneer of modern transportation, we have sought to make the world a better place, one that benefits lives, communities, and the planet.

Job Title: Specialty Development Practitioner
Location: Chennai
Work Type: Hybrid

Position Description: At the client's Credit Company, we are modernizing our enterprise data warehouse in Google Cloud to enhance data, analytics, and AI/ML capabilities, improve customer experience, ensure regulatory compliance, and boost operational efficiencies. As a GCP Data Engineer, you will integrate data from various sources into novel data products. You will build upon existing analytical data, including merging historical data from legacy platforms with data ingested from new platforms. You will also analyze and manipulate large datasets, activating data assets to enable enterprise platforms and analytics within GCP. You will design and implement the transformation and modernization on GCP, creating scalable data pipelines that land data from source applications, integrate it into subject areas, and build data marts and products for analytics solutions. You will also conduct deep-dive analysis of current-state Receivables and Originations data in our data warehouse, performing impact analysis related to the client's Credit North America modernization and providing implementation solutions. Moreover, you will partner closely with our AI, data science, and product teams, developing creative solutions that build the future for the client's Credit. Experience with large-scale solutions and operationalizing data warehouses, data lakes, and analytics platforms on Google Cloud Platform or other cloud environments is a must. We are looking for candidates with a broad set of analytical and technology skills across these areas who can demonstrate an ability to design the right solutions with the appropriate combination of GCP and third-party technologies for deployment on Google Cloud Platform.

Skills Required: BigQuery, Dataflow, Dataform, Data Fusion, Dataproc, Cloud Composer, Airflow, Cloud SQL, Compute Engine, Google Cloud Platform. Additional stack exposure: Terraform, Tekton, Postgres, PySpark, Python, APIs, Cloud Build, App Engine, Apache Kafka, Pub/Sub, AI/ML, Kubernetes.

Experience Required: GCP Data Engineer certified. Successfully designed and implemented data warehouses and ETL processes for over five years, delivering high-quality data solutions. 5+ years of complex SQL development experience. 2+ years of experience with programming languages such as Python, Java, or Apache Beam. Experienced cloud engineer with 3+ years of GCP expertise, specializing in taking cloud infrastructure and applications into production-scale solutions.

Experience Preferred: In-depth understanding of GCP's underlying architecture and hands-on experience with crucial GCP services, especially those related to data processing (batch/real time), leveraging Terraform, BigQuery, Dataflow, Pub/Sub, Dataform, Astronomer, Data Fusion, Dataproc, PySpark, Cloud Composer/Airflow, Cloud SQL, Compute Engine, Cloud Functions, Cloud Run, Cloud Build, and App Engine, alongside storage including Cloud Storage. DevOps tools such as Tekton, GitHub, Terraform, Docker. Expert in designing, optimizing, and troubleshooting complex data pipelines. Experience developing with microservice architecture on a container orchestration framework. Experience in designing pipelines and architectures for data processing. Passion and self-motivation to develop/experiment/implement state-of-the-art data engineering methods/techniques. Self-directed, works independently with minimal supervision, and adapts to ambiguous environments. Evidence of a proactive problem-solving mindset and willingness to take the initiative. Strong prioritization, collaboration, and coordination skills, and the ability to simplify and communicate complex ideas to cross-functional teams and all levels of management. Proven ability to juggle multiple responsibilities and competing demands while maintaining a high level of productivity. Data engineering or development experience gained in a regulated financial environment. Experience in coaching and mentoring data engineers. Project management tools like Atlassian Jira. Experience working in an implementation team from concept to operations, providing deep technical subject matter expertise for successful deployment. Experience with data security, governance, and compliance best practices in the cloud. Experience with AI solutions or platforms that support AI solutions. Experience using data science concepts on production datasets to generate insights.

Experience Range: 5+ years
Education Required: Bachelor's Degree
TekWissen® Group is an equal opportunity employer supporting workforce diversity.
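Among the skills listed, BigQuery development features heavily. As a non-authoritative sketch of routine BigQuery work with the official google-cloud-bigquery client, with made-up project, dataset, and column names:

```python
# Parameterized BigQuery query using the official client library.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # placeholder project

sql = """
    SELECT account_id, SUM(balance) AS total_balance
    FROM `example-project.receivables.originations`
    GROUP BY account_id
    HAVING total_balance > @threshold
"""
job_config = bigquery.QueryJobConfig(
    query_parameters=[
        bigquery.ScalarQueryParameter("threshold", "NUMERIC", 10000)
    ]
)
# result() blocks until the query job completes, then iterates rows.
for row in client.query(sql, job_config=job_config).result():
    print(row.account_id, row.total_balance)
```

Query parameters (rather than string interpolation) are the usual choice here because they avoid SQL injection and let BigQuery cache the query plan.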

Posted 2 days ago

Apply

8.0 - 10.0 years

6 - 13 Lacs

Pune

Remote

Should have a minimum of 4 end-to-end implementations. Strong communication skills to work closely with customers and partners to gather requirements and design solutions. Strong NetSuite ERP knowledge and experience.

Posted 2 days ago

Apply

9.0 - 14.0 years

7 - 14 Lacs

Hyderabad, Pune

Hybrid

Role & responsibilities Key Skills Required are 8 years of handson experience in cloud application architecture with a focus on creating scalable and reliable software systems 8 Years Experience using Google Cloud Platform GCP including but not restricting to services like Bigquery Cloud SQL Fire store Cloud Composer Experience on Security identity and access management Networking protocols such as TCPIP and HTTPS Network security design including segmentation encryption logging and monitoring Network topologies load balancing and segmentation Python for Rest APIs and Microservices Design and development guidance Python with GCP Cloud SQLPostgreSQL BigQuery Integration of Python API to FE applications built on React JS Unit Testing frameworks Python unit test pytest Java junit spock and groovy DevOps automation process like Jenkins Docker deployments etc Code Deployments on VMs validating an overall solution from the perspective of Infrastructure performance scalability security capacity and create effective mitigation plans Automation technologies Terraform or Google Cloud Deployment Manager Ansible Implementing solutions and processes to manage cloud costs Experience in providing solution to Web Applications Requirements and Design knowledge React JS Elastic Cache GCP IAM Managed Instance Group VMs and GKE Owning the endtoend delivery of solutions which will include developing testing and releasing Infrastructure as Code Translate business requirementsuser stories into a practical scalable solution that leverages the functionality and best practices of the HSBC Executing technical feasibility assessments solution estimations and proposal development for moving identified workloads to the GCP Designing and implementing secure scalable and innovative solutions to meet Banks requirements Ability to interact and influence across all organizational levels on technical or business solutions Certified Google Cloud Architect would be an addon Create and own scaling capacity planning configuration management and monitoring of processes and procedures Create put into practice and use cloudnative solutions Lead the adoption of new cloud technologies and establish best practices for them Experience establishing technical strategy and architecture at the enterprise level Experience leading GCP Cloud project delivery Collaborate with IT security to monitor cloud privacy Architecture DevOps data and integration teams to ensure best practices are followed throughout cloud adoption Respond to technical issues and provide guidance to technical team Skills Mandatory Skills : GCP Storage,GCP BigQuery,GCP DataProc,GCP Vertex AI,GCP Spanner,GCP Dataprep,GCP Datastream,Google Analytics Hub,GCP Dataform,GCP Dataplex/Catalog,GCP Cloud Datastore/Firestore,GCP Datafusion,GCP Pub/Sub,GCP Cloud SQL,GCP Cloud Composer,Google Looker,GCP Cloud Datastore,GCP Data Architecture,Google Cloud IAM,GCP Bigtable,GCP Looker1,GCP Data Flow,GCP Cloud Pub/Sub"

Posted 3 days ago

Apply

4.0 - 8.0 years

0 Lacs

Hyderabad, Telangana

On-site

As a GCP Developer, you will be responsible for maintaining the stability of production platforms, delivering new features, and minimizing technical debt across various technologies. You should have a minimum of 4 years of experience in the field, a strong commitment to maintaining high standards, and a genuine passion for ensuring quality in your work. Proficiency in GCP, Python, Hadoop, Spark, Cloud, Scala, streaming (Pub/Sub), Kafka, SQL, Dataproc, and Dataflow is essential for this role. Additionally, familiarity with data warehouses, distributed data platforms, and data lakes is required. You should possess knowledge of database definition, schema design, and Looker views and models. An understanding of data structures and algorithms is crucial for success in this position, and experience with CI/CD practices would be advantageous. This position involves working in a dynamic environment across multiple locations: Chennai, Hyderabad, and Bangalore. A total of 20 positions are available for qualified candidates.
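Streaming via Pub/Sub is called out among the required skills. The following hedged sketch uses the official google-cloud-pubsub client; the project and subscription identifiers are placeholders:

```python
# Streaming pull from a Pub/Sub subscription with the official client.
from google.cloud import pubsub_v1

subscriber = pubsub_v1.SubscriberClient()
subscription_path = subscriber.subscription_path("example-project", "events-sub")

def callback(message: pubsub_v1.subscriber.message.Message) -> None:
    print("received:", message.data.decode("utf-8"))
    message.ack()  # acknowledge so the message is not redelivered

streaming_pull = subscriber.subscribe(subscription_path, callback=callback)
try:
    streaming_pull.result(timeout=60)  # block and process messages for a minute
except TimeoutError:
    streaming_pull.cancel()  # stop pulling cleanly when the window expires
```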

Posted 3 days ago

Apply

6.0 - 10.0 years

0 Lacs

Pune, Maharashtra

On-site

The role of a Pricing Implementation Lead at FedEx involves ensuring timely, accurate, and quality-checked setup of discounts and pricing for large customers/accounts using FedEx Pricing systems and relevant tools. It requires transforming and optimizing pricing processes and systems for enhanced efficiency, reduced turnaround times, and diminished human intervention through process simplification and automation initiatives.

As a Pricing Implementation Lead, your primary responsibilities include validating the completeness of approved prices and implementing them for FedEx customers. You will specialize in facilitating pricing deployment across different FedEx operating companies, involving execution, testing, documentation, and optimizing contract administration pricing processes. Your duties will consist of entering pricing discount and rate information into FedEx enterprise pricing systems, configuring necessary parameters within the pricing systems, and auditing data entered in the pricing ecosystem. You will also be involved in planning, implementing pricing changes, and validating them for Pricing Contract administration. This role manages pricing-specific processes supporting all FedEx Enterprise Global Net Rate Pricing accounts, including Global Air Freight pricing. Collaboration with key business partners to effectively implement customers' pricing and discounting requirements, streamlining pricing processes through optimization and automation, and managing costs to achieve business efficiencies are crucial aspects of this position.

To excel in this role, you must be able to independently run complex projects with minimal supervision and have excellent communication skills across all levels, proficiency in business process configuration and project management tasks, hands-on experience working across complex enterprise systems, and a strong understanding of data flow and governance methodology. Additionally, technical skills in data extraction using SQL or SAS, data visualization using Power BI or Tableau, or data analysis using Advanced Excel are essential.

The ideal candidate would have a background as a Business Analyst, Techno-Functional Analyst, System Analyst, Implementation Analyst, Consultant, or in process-oriented roles, with 6 to 10 years of relevant work experience. A Bachelor's degree or equivalent in Computer Science, MIS, Mathematics, Statistics, or a similar discipline is required, while a Master's degree or PhD is preferred.

FedEx is committed to fostering a diverse, equitable, and inclusive workforce and is an equal opportunity/affirmative action employer. The company values fair treatment, growth opportunities for all, and a people-first philosophy. FedEx's success is attributed to its team members, who are dedicated to delivering outstanding service to customers worldwide.

Posted 3 days ago

Apply

6.0 - 8.0 years

10 - 15 Lacs

Bengaluru

Work from Office

Job Summary
Synechron is seeking a highly skilled and proactive Data Engineer to join our dynamic data analytics team. In this role, you will be instrumental in designing, developing, and maintaining scalable data pipelines and solutions on the Google Cloud Platform (GCP). With your expertise, you'll enable data-driven decision-making, contribute to strategic business initiatives, and ensure robust data infrastructure. This position offers an opportunity to work in a collaborative environment with a focus on innovative technologies and continuous growth.

Software Requirements
Required: Proficiency in data engineering tools and frameworks such as Hive, Apache Spark, and Python (version 3.x). Extensive experience working with Google Cloud Platform (GCP) offerings including Dataflow, BigQuery, Cloud Storage, and Pub/Sub. Familiarity with Git, Jira, and Confluence for version control and collaboration.
Preferred: Experience with additional GCP services like Dataproc, Data Studio, or Cloud Composer. Exposure to other programming languages such as Java or Scala. Knowledge of data security best practices and tools.

Overall Responsibilities
Design, develop, and optimize scalable data pipelines on GCP to support analytics and reporting needs. Collaborate with cross-functional teams to translate business requirements into technical solutions. Build and maintain data models, ensuring data quality, integrity, and security. Participate actively in code reviews, adhering to best practices and standards. Develop automated and efficient data workflows to improve system performance. Stay updated with emerging data engineering trends and continuously improve technical skills. Provide technical guidance and support to team members, fostering a collaborative environment. Ensure timely delivery of deliverables aligned with project milestones.

Technical Skills (By Category)
Programming Languages: Essential: Python (required). Preferred: Java, Scala.
Data Management & Databases: Experience with Hive, BigQuery, and relational databases. Knowledge of data warehousing concepts and SQL proficiency.
Cloud Technologies: Extensive hands-on experience with GCP services including Dataflow, BigQuery, Cloud Storage, Pub/Sub, and Composer. Ability to build and optimize data pipelines leveraging GCP offerings.
Frameworks & Libraries: Spark (PySpark preferred); Hadoop ecosystem experience is advantageous.
Development Tools & Methodologies: Agile/Scrum methodologies, version control with Git, project tracking via Jira, documentation on Confluence.
Security Protocols: Understanding of data security, privacy, and compliance standards.

Experience Requirements
Minimum of 6-8 years in data or software engineering roles with a focus on data pipeline development. Proven experience in designing and implementing data solutions on cloud platforms, particularly GCP. Prior experience working in agile teams, participating in code reviews, and delivering end-to-end data projects. Experience working with cross-disciplinary teams and understanding varied stakeholder requirements. Exposure to industry best practices for data security, governance, and quality assurance is desired.

Day-to-Day Activities
Attend daily stand-up meetings and contribute to project planning sessions. Collaborate with business analysts, data scientists, and other stakeholders to understand data needs. Develop, test, and deploy scalable data pipelines, ensuring efficiency and reliability. Perform regular code reviews, provide constructive feedback, and uphold coding standards. Document technical solutions and maintain clear records of data workflows. Troubleshoot and resolve technical issues in data processing environments. Participate in continuous learning initiatives to stay abreast of technological developments. Support team members by sharing knowledge and resolving technical challenges.

Qualifications
Bachelor's or Master's degree in Computer Science, Information Technology, or a related field. Relevant professional certifications in GCP (such as Google Cloud Professional Data Engineer) are preferred but not mandatory. Demonstrable experience in data engineering and cloud technologies.

Professional Competencies
Strong analytical and problem-solving skills, with a focus on outcome-driven solutions. Excellent communication and interpersonal skills to effectively collaborate within teams and with stakeholders. Ability to work independently with minimal supervision and manage multiple priorities effectively. Adaptability to evolving technologies and project requirements. Demonstrated initiative in driving tasks forward and a continuous improvement mindset. Strong organizational skills with a focus on quality and attention to detail.

SYNECHRON'S DIVERSITY & INCLUSION STATEMENT
Diversity & Inclusion are fundamental to our culture, and Synechron is proud to be an equal opportunity workplace and an affirmative action employer. Our Diversity, Equity, and Inclusion (DEI) initiative "Same Difference" is committed to fostering an inclusive culture promoting equality, diversity, and an environment that is respectful to all. We strongly believe that a diverse workforce helps build stronger, successful businesses as a global company. We encourage applicants from across diverse backgrounds, races, ethnicities, religions, ages, marital statuses, genders, sexual orientations, or disabilities to apply. We empower our global workforce by offering flexible workplace arrangements, mentoring, internal mobility, learning and development programs, and more. All employment decisions at Synechron are based on business needs, job requirements, and individual qualifications, without regard to the applicant's gender, gender identity, sexual orientation, race, ethnicity, disabled or veteran status, or any other characteristic protected by law.

Candidate Application Notice
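Since the role centers on Hive, Spark, and GCP pipelines, here is a short PySpark sketch of a typical batch aggregation; the Hive table and Cloud Storage path are assumptions for illustration, not project specifics:

```python
# Batch aggregation: read a Hive table, aggregate, write Parquet to GCS.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("daily-orders-aggregate")
    .enableHiveSupport()  # lets Spark read Hive-managed tables
    .getOrCreate()
)

orders = spark.table("warehouse.orders")  # hypothetical Hive table
daily = (
    orders
    .where(F.col("status") == "COMPLETE")
    .groupBy("order_date")
    .agg(F.sum("amount").alias("revenue"), F.count("*").alias("orders"))
)
daily.write.mode("overwrite").parquet("gs://example-bucket/marts/daily_orders")
```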

Posted 3 days ago

Apply

8.0 - 12.0 years

30 - 35 Lacs

Pune

Work from Office

About The Role
Job Title: Senior Engineer PD, AVP
Location: Pune, India

Role Description
Our team is part of the area Technology, Data, and Innovation (TDI) Private Bank. Within TDI, Partner data is the central client reference data system in Germany. As a core banking system, many banking processes and applications are integrated with it and communicate via more than 2,000 interfaces. From a technical perspective, we focus on the mainframe but also build solutions on premise and in the cloud, with RESTful services and an Angular frontend. Next to maintenance and the implementation of new CTB requirements, the content focus also lies on the regulatory and tax topics surrounding a partner/client. We are looking for a highly motivated candidate for the Cloud Data Engineer area.

What we'll offer you
100% reimbursement under the childcare assistance benefit (gender neutral). Sponsorship for industry-relevant certifications and education. Accident and term life insurance.

Your key responsibilities
You are responsible for the implementation of the new project on GCP (Spark, Dataproc, Dataflow, BigQuery, Terraform, etc.) across the whole SDLC chain. You are responsible for supporting the migration of current functionalities to Google Cloud. You are responsible for the stability of the application landscape and support software releases. You also support L3 topics and application governance. You are responsible in the CTM area for coding as part of an agile team (Java, Scala, Spring Boot).

Your skills and experience
You have experience with databases (BigQuery, Cloud SQL, NoSQL, Hive, etc.) and development, preferably for Big Data and GCP technologies. Strong understanding of the Data Mesh approach and integration patterns. Understanding of Party data and integration with Product data. Your architectural skills for big data solutions, especially interface architecture, allow a fast start. You have experience in at least Spark, Java, Scala, and Python, plus Maven, Artifactory, the Hadoop ecosystem, GitHub Actions, GitHub, and Terraform scripting. You have knowledge of customer reference data, customer opening processes, and preferably regulatory topics around know-your-customer processes. You work very well in teams but also independently, and you are constructive and target-oriented. Your English skills are good, and you can communicate both professionally and informally in small talk with the team.

How we'll support you / About us and our teams
Please visit our company website for further information: https://www.db.com/company/company.htm We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative, and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair, and inclusive work environment.

Posted 3 days ago

Apply

5.0 - 10.0 years

16 - 20 Lacs

Pune

Work from Office

About The Role
As a Senior Data Architect, you will be instrumental in shaping the bank's enterprise data landscape, supporting teams in designing, evolving, and implementing data architectures that align with the enterprise target state and enable scalable, compliant, and interoperable solutions. You will also serve as the go-to expert and trusted advisor on what good looks like in data architecture, helping to set high standards and drive continuous improvement across the organization. This role is ideal for an experienced data professional with deep technical expertise, strong solution architecture skills, and a proven ability to influence design decisions across both business and technology teams.

Responsibilities
1. Enterprise Data Architecture & Solution Design: Support teams in designing, evolving, and implementing data architectures that align with the enterprise target state and enable scalable, compliant, and interoperable solutions. Serve as the go-to person for data architecture best practices and standards, helping to define and communicate what good looks like to ensure consistency and quality. Lead and contribute to solution architecture for key programs, ensuring architectural decisions are well-documented, justified, and aligned to enterprise principles. Work with engineering and platform teams to design end-to-end data flows, integration patterns, data processing pipelines, and storage strategies across structured and unstructured data. Drive the application of modern data architecture principles including event-driven architecture, data mesh, streaming, and decoupled data services.
2. Data Modelling and Semantics: Provide hands-on leadership in data modelling efforts, including the occasional creation and stewardship of conceptual, logical, and physical models that support enterprise data domains. Partner with product and engineering teams to ensure data models are fit for purpose, extensible, and aligned with enterprise vocabularies and semantics. Support modelling use cases across regulatory, operational, and analytical data assets.
3. Architecture Standards & Frameworks: Define and continuously improve data architecture standards, patterns, and reference architectures that support consistency and interoperability across platforms. Embed standards into engineering workflows and tooling to encourage automation and reduce delivery friction. Measure and report on adoption of architectural principles using architecture KPIs and compliance metrics.
4. Leadership, Collaboration & Strategy: Act as a technical advisor and architectural leader across initiatives, mentoring junior architects and supporting federated architecture teams in delivery. Build strong partnerships with senior stakeholders across the business, CDIO, engineering, and infrastructure teams to ensure alignment and adoption of the architecture strategy. Stay current with industry trends, regulatory changes, and emerging technologies, advising on their potential impact and application.

Skills
Extensive experience in data architecture, data engineering, or enterprise architecture, preferably within a global financial institution. Deep understanding of data platforms, integration technologies, and architectural patterns for real-time and batch processing. Proficiency with data architecture tools such as Sparx Enterprise Architect, ERwin, or similar. Experience designing solutions in cloud and hybrid environments (e.g. GCP, AWS, or Azure), with knowledge of associated data services. Hands-on experience with data modelling, semantic layer design, and metadata-driven architecture approaches. Strong grasp of data governance, privacy, security, and regulatory compliance, especially as they intersect with architectural decision-making. Strategic mindset, with the ability to connect architectural goals to business value and communicate effectively with technical and non-technical stakeholders. Experience working across business domains including Risk, Finance, Treasury, or Front Office functions.

Well-being & Benefits
Emotionally and mentally balanced: we support you in dealing with life crises, maintaining stability through illness, and maintaining good mental health. Empowering managers who value your ideas and decisions. Show your positive attitude, determination, and open-mindedness. A professional, passionate, and fun workplace with flexible work-from-home options. A modern office with fun and relaxing areas to boost creativity. Continuous learning culture with coaching and support from team experts.
Physically thriving: we support you in managing your physical health by taking appropriate preventive measures and providing a workplace that helps you thrive. Private healthcare and life insurance with premium benefits for you and discounts for your loved ones.
Socially connected: we strongly believe in collaboration, inclusion, and feeling connected to open up new perspectives and strengthen our self-confidence and wellbeing. Kids@TheOffice - support for unexpected events requiring you to care for your kids during work hours. Enjoy retailer discounts, cultural and CSR activities, employee sport clubs, workshops, and more.
Financially secure: we support you in meeting personal financial goals during your active career and for the future. Competitive income, performance-based promotions, and a sense of purpose. 24 days' holiday, loyalty days, and bank holidays (including weekdays for weekend bank holidays).

We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative, and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair, and inclusive work environment.

Posted 3 days ago

Apply

3.0 - 8.0 years

3 - 7 Lacs

Pune

Work from Office

Project Role: Application Support Engineer
Project Role Description: Act as software detectives; provide a dynamic service identifying and solving issues within multiple components of critical business systems.
Must have skills: Electronic Medical Records (EMR)
Good to have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Support Engineer, you will act as a software detective, providing a dynamic service that identifies and resolves issues within various components of critical business systems. Your typical day will involve collaborating with team members to troubleshoot software problems, analyzing system performance, and ensuring that applications run smoothly to support business operations effectively. You will engage with users to understand their challenges and work towards implementing solutions that enhance system functionality and user experience.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Assist in the documentation of processes and procedures to enhance team knowledge.
- Engage with stakeholders to gather requirements and provide feedback on system performance.

Professional & Technical Skills:
- Must To Have Skills: Proficiency in Electronic Medical Records (EMR).
- Strong analytical skills to diagnose and resolve software issues.
- Experience with troubleshooting and debugging software applications.
- Familiarity with system integration and data flow management.
- Ability to communicate technical information effectively to non-technical users.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Electronic Medical Records (EMR).
- This position is based at our Pune office.
- A 15 years full time education is required.

Posted 3 days ago

Apply

7.0 - 12.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Snowflake Data Warehouse
Good to have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing solutions, and ensuring that applications are aligned with business objectives. You will engage in problem-solving activities, manage project timelines, and contribute to the overall success of application development initiatives.

Roles & Responsibilities:
1. Serve as a client-facing technical lead, working closely with stakeholders to gather requirements and translate them into actionable ETL solutions.
2. Design and develop new stored procedures in MS SQL Server, with a strong focus on performance and maintainability.
3. Build/enhance SSIS packages, implementing best practices for modularity, reusability, and error handling.
4. Architect and design ETL workflows, including staging, cleansing, data masking, transformation, and loading strategies.
5. Implement comprehensive error handling and logging mechanisms to support reliable, auditable data pipelines.
6. Design and maintain ETL-related tables, including staging, audit/logging, and dimensional/historical tables.
7. Work with Snowflake to build scalable cloud-based data integration and warehousing solutions.
8. Reverse-engineer and optimize existing ETL processes and stored procedures for better performance and maintainability.
9. Troubleshoot job failures and data discrepancies in Production.

Professional & Technical Skills:
1. 7+ years of experience in Data Warehousing [MS SQL, Snowflake] and MS SQL Server (T-SQL, stored procedures, indexing, performance tuning).
2. Proven expertise in SSIS package development, including parameterization, data flow, and control flow design.
3. Strong experience in ETL architecture, including logging, exception handling, and data validation.
4. Proficient in data modeling for ETL, including staging, target, and history tables.
5. Hands-on experience with Snowflake, including data loading, transformation scripting, and optimization.
6. Ability to manage historical data using SCDs, auditing fields, and temporal modeling.
7. Set up Git repositories, define version control standards, and manage code branching/releases; DevOps and CI/CD practices for data pipelines.
8. Ability to work independently while managing multiple issues and deadlines.
9. Excellent communication skills, both verbal and written, with demonstrated client interaction.
Would be a plus:
10. DW migration from MS SQL to Snowflake.
11. Experience with modern data integration tools such as Matillion.
12. Knowledge of BI tools like Tableau.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Snowflake Data Warehouse.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.
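For the Snowflake side of this stack, a minimal sketch with the official snowflake-connector-python package illustrates a staging-to-target load of the kind described above; the connection details and table names are placeholders, not project specifics:

```python
# Staging-to-target load in Snowflake via the official Python connector.
import snowflake.connector

conn = snowflake.connector.connect(
    account="example_account",  # hypothetical account identifier
    user="etl_user",
    password="***",
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="STAGING",
)
try:
    cur = conn.cursor()
    # Insert-select with a basic validation filter and a load timestamp
    # for auditing, mirroring the logging/auditing patterns in the posting.
    cur.execute("""
        INSERT INTO target.orders (order_id, amount, load_ts)
        SELECT order_id, amount, CURRENT_TIMESTAMP()
        FROM staging.orders_raw
        WHERE order_id IS NOT NULL
    """)
    conn.commit()
finally:
    conn.close()
```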

Posted 3 days ago

Apply

2.0 - 5.0 years

5 - 9 Lacs

Hyderabad

Work from Office

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Axway API Management Platform
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing solutions that align with business objectives, and ensuring that applications are optimized for performance and usability. You will also engage in problem-solving activities, providing support and enhancements to existing applications while maintaining a focus on quality and efficiency.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate knowledge sharing sessions to enhance team capabilities.
- Monitor project progress and ensure timely delivery of milestones.

Professional & Technical Skills:
- Must To Have Skills: Proficiency in Axway API Management Platform.
- Strong understanding of API design and development principles.
- Experience with application integration and data flow management.
- Familiarity with cloud-based services and deployment strategies.
- Ability to troubleshoot and resolve application issues efficiently.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Axway API Management Platform.
- This position is based at our Hyderabad office.
- A 15 years full time education is required.

Posted 3 days ago

Apply

2.0 - 4.0 years

4 - 8 Lacs

Pune

Work from Office

Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must have skills: Snowflake Data Warehouse
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to effectively migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and deliver solutions that meet business needs, while also troubleshooting any issues that arise in the data flow.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Mentor junior team members to enhance their skills and knowledge.
- Continuously evaluate and improve data processes to enhance efficiency.

Professional & Technical Skills:
- Must To Have Skills: Proficiency in Snowflake Data Warehouse.
- Good To Have Skills: Experience with data modeling and database design.
- Strong understanding of ETL processes and data integration techniques.
- Familiarity with cloud platforms and data storage solutions.
- Experience in performance tuning and optimization of data queries.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Snowflake Data Warehouse.
- This position is based in Pune.
- A 15 years full time education is required.

Posted 3 days ago

Apply

3.0 - 8.0 years

4 - 8 Lacs

Hyderabad

Work from Office

Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must have skills: Oracle Procedural Language Extensions to SQL (PL/SQL)
Good to have skills: Google BigQuery, Google Cloud Platform Architecture
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Data Engineer, you will engage in the design, development, and maintenance of data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to effectively migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and contribute to the overall data strategy of the organization, ensuring that data solutions are robust, scalable, and aligned with business objectives.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Develop and optimize data pipelines to ensure efficient data flow and processing.
- Monitor and troubleshoot data quality issues, implementing corrective actions as necessary.

Professional & Technical Skills:
- Must To Have Skills: Proficiency in Oracle Procedural Language Extensions to SQL (PL/SQL).
- Good To Have Skills: Experience with Google BigQuery and Google Cloud Platform Architecture.
- Strong understanding of ETL processes and data integration techniques.
- Experience with data modeling and database design principles.
- Familiarity with data warehousing concepts and best practices.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Oracle PL/SQL.
- This position is based at our Hyderabad office.
- A 15 years full time education is required.
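As a rough illustration of driving PL/SQL from a pipeline script (the posting's must-have skill plus the pipeline automation it describes), the sketch below uses the python-oracledb driver; the DSN, package, and procedure names are hypothetical:

```python
# Invoking a PL/SQL packaged procedure from Python with python-oracledb.
import oracledb

conn = oracledb.connect(user="etl_user", password="***", dsn="dbhost/orclpdb1")
with conn.cursor() as cur:
    # Hypothetical procedure that loads one day's partition of a fact table;
    # here it is assumed to take the business date as a string parameter.
    cur.callproc("etl_pkg.load_daily_sales", ["2024-01-31"])
conn.commit()
conn.close()
```

Keeping the transformation logic in a PL/SQL package and using Python only as the orchestrator is a common split when the heavy lifting belongs close to the Oracle data.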

Posted 3 days ago

Apply

7.0 - 11.0 years

13 - 18 Lacs

Pune

Work from Office

Project Role: Data Architect
Project Role Description: Define the data requirements and structure for the application. Model and design the application data structure, storage and integration.
Must have skills: Apache Kafka
Good to have skills: Data Analytics
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Data Architect, you will define the data requirements and structure for the application. Your typical day will involve modeling and designing the application data structure, storage, and integration, ensuring that the architecture aligns with business needs and technical specifications. You will collaborate with various teams to ensure that data flows seamlessly across systems, contributing to the overall efficiency and effectiveness of data management within the organization.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate knowledge sharing sessions to enhance team capabilities.
- Develop and maintain documentation related to data architecture and design.

Professional & Technical Skills:
- Must To Have Skills: Proficiency in Apache Kafka.
- Good To Have Skills: Experience with Data Analytics.
- Strong understanding of data modeling techniques and best practices.
- Experience with data integration tools and methodologies.
- Familiarity with cloud-based data storage solutions.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Apache Kafka.
- This position is based at our Pune office.
- A 15 years full time education is required.
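To make the Apache Kafka requirement concrete, here is a hedged producer sketch using the kafka-python package; the broker address, topic, and event shape are illustrative assumptions:

```python
# Publish a JSON-encoded event to a Kafka topic with kafka-python.
import json
from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers=["localhost:9092"],  # placeholder broker
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)
producer.send("trade-events", {"trade_id": "T-1", "qty": 100})  # hypothetical event
producer.flush()  # block until the broker acknowledges the message
producer.close()
```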

Posted 3 days ago

Apply

3.0 - 7.0 years

0 Lacs

kochi, kerala

On-site

At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture, and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

EY GDS Data and Analytics (D&A) OBIEE - Senior

The opportunity

We're looking for a Senior expert in data analytics to create and manage large BI and analytics solutions using visualization tools such as OBIEE/OAC that turn data into knowledge. In this role, you should have a background in data and business analysis, be analytical, and be an excellent communicator. Business acumen and problem-solving aptitude would be a plus.

Your key responsibilities
  • Work as a team member and lead, contributing to various technical streams of OBIEE/OAC implementation projects.
  • Provide product- and design-level technical best practices.
  • Interface and communicate with the onsite coordinators.
  • Complete assigned tasks on time and report status regularly to the lead.

Skills and attributes for success
  • Use an issue-based approach to deliver growth, market, and portfolio strategy engagements for corporates.
  • Strong communication, presentation, and team-building skills, with experience producing high-quality reports, papers, and presentations.
  • Exposure to BI and other visualization tools in the market.
  • Building a quality culture.
  • Fostering teamwork.
  • Participating in organization-wide people initiatives.

To qualify for the role, you must have
  • BE/BTech/MCA/MBA with adequate industry experience.
  • At least 3 to 7 years of experience in OBIEE/OAC.
  • Experience with OBIEE and OAC end-to-end implementations.
  • Understanding of ETL/ELT processes using tools like Informatica/ODI/SSIS.
  • Knowledge of reporting, dashboards, and RPD logical modeling.
  • Experience with BI Publisher.
  • Experience with Agents.
  • Experience in security implementation in OAC/OBIEE.
  • Ability to manage self-service data preparation, data sync, and data flows, and to work with curated data sets.
  • Ability to manage connections to multiple data sources, cloud and non-cloud, using the various data connectors available with OAC.
  • Experience creating pixel-perfect reports and managing catalog content, dashboards, prompts, and calculations.
  • Ability to create datasets, map layers, multiple data visualizations, and stories in OAC.
  • Good understanding of various data models, e.g., snowflake, data marts, star data models, data lakes, etc.
  • Excellent written and verbal communication.
  • Cloud experience is an added advantage.
  • Experience migrating on-premise OBIEE to Oracle Analytics in the cloud.
  • Knowledge of and working experience with Oracle Autonomous Database.
  • Strong knowledge of DWH concepts.
  • Strong data modeling skills.
  • Familiarity with Agile and Waterfall SDLC processes.
  • Strong SQL/PLSQL with analytical skills.

Ideally, you'll also have
  • Experience in the Insurance and Banking domains.
  • A strong hold on project delivery and team management.
  • Excellent written and verbal communication skills.

EY | Building a better working world

EY exists to build a better working world, helping to create long-term value for clients, people, and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform, and operate. Working across assurance, consulting, law, strategy, tax, and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 6 days ago

Apply

6.0 - 10.0 years

0 Lacs

karnataka

On-site

You will be working as a Monitoring Team Lead for a Data Pipeline L1 team, overseeing the daily operations to ensure the health and stability of data pipelines, and managing incident response. Your role will involve leading the team, monitoring performance, and escalating issues as needed.

As a Team Leader, you will guide and mentor the L1 monitoring team to ensure proficiency in data pipeline monitoring, troubleshooting, and escalation procedures. You will manage team performance, distribute tasks effectively, and resolve conflicts. Acting as a point of contact for the team, you will represent them to stakeholders and advocate for their needs. Your responsibilities will also include developing team strengths and promoting a positive work environment.

In terms of Data Pipeline Monitoring, you will continuously monitor data pipelines for performance, availability, and data quality issues. Utilizing monitoring tools, you will detect and analyze alerts related to data pipelines to ensure data freshness, completeness, accuracy, consistency, and validity.

For Incident Management, you are required to detect, log, categorize, and track incidents within the ticketing system. Any unresolved issues should be escalated to L2/L3 teams based on predefined SLAs and severity. You will also coordinate with other teams to resolve incidents quickly and efficiently while ensuring proper communication and updates to relevant stakeholders throughout the incident lifecycle.

Managing Service Level Agreements (SLAs) related to data pipeline monitoring and incident response will be essential. You will monitor and ensure that the team meets or exceeds established SLAs.

Process Improvement is another key aspect, where you will identify opportunities to enhance monitoring processes, automation, and efficiency. Implementing best practices for data pipeline monitoring and incident management and conducting regular reviews of service performance are part of your responsibilities.

Your role will also involve providing technical expertise to the team and staying updated on industry best practices and new technologies related to data pipelines and monitoring. Maintaining and updating documentation related to data pipeline monitoring processes, procedures, and escalation paths is crucial. Accurate shift handovers to the next shift, with updates on ongoing issues, will also be expected.

Qualifications:
  • Proven experience in data pipeline monitoring and incident management.
  • Strong understanding of data pipeline concepts, including ingestion, transformation, and storage.
  • Experience with monitoring tools and technologies.
  • Excellent communication, interpersonal, and leadership skills.
  • Ability to work independently and as part of a team in a fast-paced environment.
  • Experience with cloud services (AWS, Azure, or GCP) is a plus.
  • Knowledge of data governance principles and practices is beneficial.

Skills to be evaluated on: Data Operation/Operations Team Lead.
Mandatory Skills: Data Operation, Operations Team Lead.
Desirable Skills: Lead Operations, data operations, operations management, team management.

Posted 6 days ago

Apply

4.0 - 7.0 years

9 - 13 Lacs

Pune

Work from Office

We are looking for a skilled Java + GCP Developer with experience in Shell scripting, Python, Java, Spring Boot, and BigQuery. The ideal candidate should have hands-on experience in Java, Spring Boot, and Google Cloud Platform (GCP).

Posted 6 days ago

Apply

5.0 - 7.0 years

5 - 14 Lacs

Pune, Gurugram, Bengaluru

Work from Office

• Hands-on experience in object-oriented programming using Python, PySpark, APIs, SQL, BigQuery, and GCP • Building data pipelines for huge volumes of data • Dataflow, Dataproc, and BigQuery • Deep understanding of ETL concepts

Posted 6 days ago

Apply

5.0 - 8.0 years

5 - 8 Lacs

Bengaluru

Work from Office

Skills desired: Strong SQL (complex multi-table joins), Python (FastAPI or Flask framework), PySpark, commitment to working overlapping hours, and GCP knowledge (BigQuery, Dataproc, and Dataflow). Amex experience preferred (not mandatory). Power BI preferred (not mandatory). Flask, PySpark, Python, SQL

Posted 6 days ago

Apply

4.0 - 8.0 years

6 - 11 Lacs

Hyderabad, Pune

Work from Office

NetSuite Functional Consultant

Job Responsibilities:
  • Configure/customize the NetSuite application to meet customers' business requirements.
  • Conduct personalization sessions and document them with meeting-minute summaries.
  • Demonstrated experience participating in and translating customer business requirements into business solutions, either as a software solution or a re-engineering initiative.
  • Collaborate with technical team members to help guide the development of customized solutions or data extracts using SQL queries.
  • Identify test scenarios, establish test cases, and support SIT and UAT with core client stakeholders to ensure system configuration objectives have been met.
  • Create training/support documentation and drive end-user training to promote user adoption.
  • Document requirements, processes, and user guidance.
  • Design business processes and application configuration based on industry best practices.
  • Support the go-live deployment process, ensuring a seamless software launch and continuity of business operations during cutover.
  • Own and deliver complex solutions on the Oracle NetSuite platform.
  • Conduct testing of all kinds and prepare test cases for the modules implemented and developed.
  • Suggest process improvements based on application capability and industry best practices.
  • Responsible for NetSuite setups: Customer, Vendor, and Item; Department, Class, and Locations.
  • NetSuite processes: Order to Cash, Procure to Pay, Bank Reconciliation, Accounting, Advanced Revenue Management, Fixed Assets, Intercompany Management, and Call to Resolution (Case Management).
  • Form customizations and field creation, custom records, CSV imports, workflow setup, saved searches and report customization, and integration process mapping.

Skills & Experience Required:
  • 8+ years of hands-on experience in NetSuite implementation and enhancement projects.
  • Thorough knowledge of NetSuite functionalities and architecture.
  • Hands-on experience with NetSuite integration with third-party applications.
  • A minimum of 4 end-to-end implementation experiences.
  • Strong communication skills to work closely with customers and partners to gather requirements and design solutions.
  • Strong NetSuite ERP knowledge and experience, including setups and configurations, saved searches, and reports.
  • Functional experience in Receivables, Order Management, Case Management, and billing operations within NetSuite is mandatory.
  • Excellent command of flowcharts and data flow diagrams.
  • Strong analytical and problem-solving skills; a good team player who collaborates with other teams.
  • Ready to be on-call on a rotational basis.
  • Excellent command of Google Sheets, Google Apps, Word, Excel, and PowerPoint.

Posted 6 days ago

Apply

Exploring Data Flow Jobs in India

The data flow job market in India is booming with opportunities for skilled professionals. With the increasing reliance on data-driven decision-making across industries, the demand for data flow experts is on the rise. Whether you are a recent graduate or an experienced professional looking to transition into this field, there are ample job openings waiting for you in India.

Top Hiring Locations in India

  1. Bangalore
  2. Hyderabad
  3. Pune
  4. Mumbai
  5. Delhi-NCR

These cities are known for their strong presence of tech companies and offer a plethora of opportunities for data flow roles.

Average Salary Range

The average salary range for data flow professionals in India varies based on experience and expertise. Entry-level positions can expect to earn between INR 4-6 lakhs per annum, while experienced professionals with advanced skills can command salaries upwards of INR 15 lakhs per annum.

Career Path

In the data flow domain, a typical career path may include roles such as:

  • Junior Data Analyst
  • Data Engineer
  • Data Scientist
  • Senior Data Architect
  • Chief Data Officer

As you gain experience and expertise, you can progress to higher positions with increased responsibilities and leadership opportunities.

Related Skills

Apart from expertise in data flow tools and technologies, professionals in this field are often expected to have skills in:

  • Data visualization
  • Machine learning
  • Statistical analysis
  • Programming languages (Python, R, SQL)

Interview Questions

  • What is ETL and how does it relate to data flow? (basic; see the sketch after this list)
  • Explain the difference between batch processing and real-time processing. (medium)
  • How would you handle missing data in a dataset? (medium)
  • Can you explain the concept of data normalization and why it is important? (medium)
  • What is the difference between supervised and unsupervised learning? (basic)
  • How would you optimize a data pipeline for performance? (advanced)
  • Can you describe a challenging data flow problem you encountered in a previous project and how you solved it? (advanced)
  • What is the role of Apache Kafka in data flow architectures? (medium)
  • How do you ensure data quality and consistency in a data flow process? (medium)
  • Explain the concept of data lineage and its importance in data flow management. (advanced)
  • What are the advantages of using a distributed data processing framework like Apache Spark? (medium)
  • How do you handle data security and privacy issues in a data flow environment? (advanced)
  • Can you explain the concept of data partitioning and its benefits in parallel processing? (medium)
  • How would you approach data profiling and data quality assessment in a new dataset? (medium)
  • What are the key components of a data flow architecture? (basic)
  • How do you handle data skew in distributed data processing? (advanced)
  • Explain the concept of data replication and its use cases in data flow management. (medium)
  • How do you stay updated with the latest trends and technologies in the data flow domain? (basic)
  • Can you describe a scenario where you had to optimize a data flow process for cost efficiency? (advanced)
  • What are the common challenges faced in designing and implementing data pipelines? (medium)
  • How do you ensure data integrity and consistency in a distributed data processing environment? (advanced)
  • Can you explain the difference between stream processing and batch processing? (basic)
  • Describe a time when you had to troubleshoot a data flow issue in a production environment. (medium)
  • How would you handle a sudden increase in data volume in a data flow pipeline? (advanced)
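
To make the first question above concrete, here is a minimal ETL sketch in Python. It is purely illustrative and not tied to any particular employer's stack: the file name, table name, and column names are hypothetical, and it assumes only the standard library plus pandas.

```python
import sqlite3

import pandas as pd

# Extract: read raw records from a source file
# (hypothetical file; columns assumed: order_id, amount, region).
raw = pd.read_csv("orders.csv")

# Transform: clean and reshape the data before loading.
raw["region"] = raw["region"].str.strip().str.upper()  # normalize text values
raw["amount"] = raw["amount"].fillna(0.0)               # one simple way to handle missing data
summary = raw.groupby("region", as_index=False)["amount"].sum()

# Load: write the transformed result into a target database table.
with sqlite3.connect("warehouse.db") as conn:
    summary.to_sql("region_totals", conn, if_exists="replace", index=False)
```

In an interview, walking through each stage of a small pipeline like this, and explaining how you would scale it with a framework such as PySpark or Dataflow, is usually more persuasive than reciting a textbook definition.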

Closing Remark

As you embark on your journey to explore data flow jobs in India, remember to equip yourself with the necessary skills and knowledge to stand out in a competitive job market. Prepare diligently, showcase your expertise, and apply confidently to secure exciting opportunities in this growing field. Good luck!


Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies