Get alerts for new jobs matching your selected skills, preferred locations, and experience range.
5.0 - 8.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Our organization is in search of a seasoned Senior Data Engineer to enhance our team. In this role, you will focus on projects involving data integration and ETL for cloud environments. Your primary duties will include the design and execution of intricate data solutions, maintaining data accuracy, dependability, and accessibility.

Responsibilities
Design and execute intricate data solutions for cloud environments
Develop ETL processes utilizing SQL, Python, and other applicable technologies
Maintain data accuracy, dependability, and accessibility for all stakeholders
Work collaboratively with cross-functional teams to comprehend data integration necessities and specifications
Create and uphold documentation such as technical specifications, data flow diagrams, and data mappings
Optimize data integration processes for enhanced performance and efficiency while ensuring data accuracy and integrity

Requirements
Bachelor's degree in Computer Science, Electrical Engineering, or related field
5-8 years of experience in data engineering
Proficiency in cloud-native or Spark-based ETL tools like AWS Glue, Azure Data Factory, or GCP Dataflow
Strong knowledge of SQL for data querying and manipulation
Familiarity with Snowflake for data warehousing
Background in cloud platforms such as AWS, GCP, or Azure for data storage and processing
Excellent problem-solving skills and meticulous attention to detail
Good verbal and written communication skills in English at a B2 level

Nice to have
Background in ETL using Python
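For illustration, here is a minimal sketch of the kind of SQL-plus-Python ETL process this posting describes — extract with a SQL query, transform in Python, load into a warehouse staging table. The connection URIs, table names, and cleansing rules are hypothetical assumptions, not details from the posting.

```python
# Minimal ETL sketch: extract with SQL, transform in Python, load to a warehouse table.
# All connection strings, table names, and the cleansing rules below are hypothetical.
import pandas as pd
from sqlalchemy import create_engine

SOURCE_URI = "postgresql://user:password@source-host:5432/sales"    # assumed source database
TARGET_URI = "snowflake://user:password@account/analytics/public"   # assumed warehouse (snowflake-sqlalchemy)

def extract(engine) -> pd.DataFrame:
    # Pull only yesterday's orders; the table and column names are illustrative.
    query = """
        SELECT order_id, customer_id, order_ts, amount
        FROM orders
        WHERE order_ts >= CURRENT_DATE - INTERVAL '1 day'
    """
    return pd.read_sql(query, engine)

def transform(df: pd.DataFrame) -> pd.DataFrame:
    # Basic cleansing: drop rows with missing keys, standardise types.
    df = df.dropna(subset=["order_id", "customer_id"])
    df["order_ts"] = pd.to_datetime(df["order_ts"], utc=True)
    df["amount"] = df["amount"].astype(float)
    return df

def load(df: pd.DataFrame, engine) -> None:
    # Append into a staging table owned by the warehouse.
    df.to_sql("stg_orders", engine, if_exists="append", index=False)

if __name__ == "__main__":
    source = create_engine(SOURCE_URI)
    target = create_engine(TARGET_URI)
    load(transform(extract(source)), target)
```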
Posted 1 week ago
5.0 - 8.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Our company is looking for an experienced Senior Data Engineer to join our team. As a Senior Data Engineer, you will be working on a project that focuses on data integration and ETL for cloud-based platforms. You will be responsible for designing and implementing complex data solutions, ensuring that the data is accurate, reliable, and easily accessible.

Responsibilities
Design and implement complex data solutions for cloud-based platforms
Develop ETL processes using SQL, Python, and other relevant technologies
Ensure that data is accurate, reliable, and easily accessible for all stakeholders
Collaborate with cross-functional teams to understand data integration needs and requirements
Develop and maintain documentation, including technical specifications, data flow diagrams, and data mappings
Monitor and optimize data integration processes for performance and efficiency, ensuring data accuracy and integrity

Requirements
Bachelor's degree in Computer Science, Electrical Engineering, or a related field
5-8 years of experience in data engineering
Experience with cloud-native or Spark-based ETL tools such as AWS Glue, Azure Data Factory, or GCP Dataflow
Strong knowledge of SQL for data querying and manipulation
Experience with Snowflake for data warehousing
Experience with cloud platforms such as AWS, GCP, or Azure for data storage and processing
Excellent problem-solving skills and attention to detail
Good verbal and written communication skills in English at a B2 level

Nice to have
Experience with ETL using Python
Posted 1 week ago
0 years
0 Lacs
Mumbai Metropolitan Region
On-site
Summary
The Therapeutic Data Strategy Director (TDSD) bridges science and operations by defining how the clinical data strategy is operationalized across the complete data flow within GCO. The TDSD is responsible for ensuring data regulatory compliance, the availability of End-to-End (E2E) standards, that instruments and devices are thoroughly discussed, defined, and finalized prior to the database build, and that the operational impact of any new changes is known, mitigated, and captured in the appropriate knowledge database. In collaboration with the GPT, GCT, and CTT, the TDSD aligns on the fit-for-purpose data package as part of a program/indication-level quality by design to support data strategy needs in the drug development lifecycle of a molecule or across a therapeutic area (TA) within an assigned unit in Novartis. This role creates and implements strategies for the end-to-end data product, ensuring application of Novartis clinical data standards and defining the clinical data acquisition and data review strategy to support the submission of our clinical programs. The TDSD is responsible for ensuring that delivery and timelines are met with quality, whilst ensuring cost efficiencies and stakeholders' satisfaction.

About The Role

Major Accountabilities

Creation And Execution Of Operational Data Strategy
Collaborate with the Global Program Clinical Head (GPCH) to establish and maintain a data strategy for the clinical data standards of the assigned Therapeutic Area, as well as the design, collection, processing, and transformation of clinical data supporting the needs for reporting and submission.
Impact assessment of proposed data collection and analysis.
Drive capability inputs to the data team's resource algorithm based on future incoming demands.
Matrix data operations leader who is the single focal point for the sustained industry-leading cycle time for the data product and ensures compliance with relevant Novartis processes.
Ensures the provision of resources with the skillset to develop robust and lean E2E specifications.
Leads the full spectrum of standard development and compliance across their portfolio.
Consults to drive quality into the study protocol and operational processes.
Driving implementation of a lean global data strategy and defining minimum data requirements.
Ensure the minimum data requirements remain intact and understand the operational impact (e.g., resources and time) of any amendments, as well as work with clinical development, analytics, and regulatory line functions to understand the scientific, clinical, statistical, and regulatory impacts.
Support assessment of opportunities to capitalize on non-traditional options (e.g., historical data, synthetic data, cross-sponsor shared control arms, adaptive designs, pragmatic trials, decentralization, etc.).
Work with COPH and the Vendor Program Strategy Director (VPSD) to define the provision of ancillary data, including vendor capabilities.
Author the Operational Data Strategy section of the Operational Execution Plan (OEP) (key customers, dataflow, and targets to generate Data-as-a-Product (DaaP), etc.).
Establishes a "performance-oriented culture" that is driven/supported by analysis of real-time activity and quality metrics.
Contribute to the development of the Data Operations organization. Define/contribute to the development of long-term goals and operating policies through his/her leadership role on the management team.
As an extended member of the Data Operations Leadership Team, support functional excellence for Data Operations by contributing to the definition of the strategic goals and operating policies, and leading/contributing to strategic initiatives in line with the overall Data Operations strategy.
Support the BD&L activities from a CDO perspective.

End-to-End Ownership Of The Clinical Data Flow
Ensures that data is collected and reviewed as efficiently as possible, and that extraneous data is not procured.
Drives implementation of a lean global data strategy and defines fit-for-purpose data quality requirements sufficient to support good decision making and meet regulatory requirements.
Collaborates cross-functionally to define a quality-by-design review process to ensure fit-for-purpose data quality sufficient to support good decision making.
Accountable for managing operational strategy around data cleaning and data review at portfolio level.
Drives standards and processes to facilitate data right the first time.
Act as point of escalation for data-specific project management issues and for broader data demands (e.g. changing scope, addition of analysis/reporting events).

End-to-End Standards Oversight & Lifecycle Management
Responsible for compliance with data requirements and the availability of end-to-end clinical data standards (data collection through analysis) for a program/molecule/indication.
Influence and support the design of new clinical data standards as required at the enterprise/therapeutic area level.
Drives identification of needs, adoption, and maintenance of data standards.

Operational Project Management
Develop, communicate, and drive implementation of a global data operationalization strategy to deliver value-adding data; CDS supports and guides the Data Team (as part of the CTT) in ensuring the overall program/OEP strategy is aligned with execution.
Establish key customers of Clinical Data and establish an approach for future consumption.
Works with the business to ensure adherence to timelines, adoption of the data strategy, and delivery of the target data product quality.
Accountable for managing the strategy of the data cleaning, review, and data-related specifications at portfolio and study level.
Ensure high-quality, timely, and efficient Data Operations deliverables for projects and trials, partnering with other Data Operations functions within the assigned Development Unit or program.
Work alongside the Operational Program Lead and Trial Lead to ensure all data-related risks and issues are identified and mitigated.
Link between business needs and technical development/deployment and technology usage in data operations.
Influencer and interlocutor for adoption of and compliance with company efficiency processes and objectives within the data workflow.
Assesses/approves changes that impact the data collection strategy.

Why Novartis: Helping people with disease and their families takes more than innovative science. It takes a community of smart, passionate people like you. Collaborating, supporting and inspiring each other. Combining to achieve breakthroughs that change patients' lives. Ready to create a brighter future together? https://www.novartis.com/about/strategy/people-and-culture

Join our Novartis Network: Not the right Novartis role for you? Sign up to our talent community to stay connected and learn about suitable career opportunities as soon as they come up: https://talentnetwork.novartis.com/network

Benefits and Rewards: Read our handbook to learn about all the ways we'll help you thrive personally and professionally: https://www.novartis.com/careers/benefits-rewards
Posted 1 week ago
8.0 - 13.0 years
0 Lacs
Gurugram, Haryana, India
On-site
We are looking for a skilled Lead Data Engineer to enhance our dynamic team. In this role, you will focus on designing, developing, and maintaining data integration solutions for our clients. Your leadership will guide a team of engineers in delivering scalable, high-quality, and efficient data integration solutions. This role is perfect for an experienced data integration expert who is passionate about technology and excels in a fast-paced, dynamic setting.

Responsibilities
Design, develop, and maintain data integration solutions for clients
Lead a team of engineers to ensure high-quality, scalable, and efficient delivery of data integration solutions
Collaborate with cross-functional teams to comprehend business requirements and design fitting data integration solutions
Ensure the security, reliability, and efficiency of data integration solutions
Develop and maintain documentation, including technical specifications, data flow diagrams, and data mappings
Continuously update knowledge on the latest data integration methods and tools

Requirements
Bachelor's degree in Computer Science, Information Systems, or a related field
8-13 years of experience in data engineering, data integration, or a related field
Proficiency in cloud-native or Spark-based ETL tools such as AWS Glue, Azure Data Factory, or GCP Dataflow
Strong knowledge of SQL for querying and manipulating data
Competency in Snowflake for cloud data warehousing
Familiarity with at least one cloud platform such as AWS, Azure, or GCP
Experience in leading a team of engineers on data integration projects
Good verbal and written communication skills in English at a B2 level

Nice to have
Background in ETL using Python
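As an illustration of the Spark-based ETL work this role references, here is a minimal PySpark sketch of an extract-transform-load job. The bucket paths, column names, and aggregation rule are hypothetical assumptions rather than anything specified in the posting.

```python
# Minimal Spark-based ETL sketch; paths, schema and business rule are illustrative only.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("customer-orders-etl").getOrCreate()

# Extract: read raw CSV drops from a (hypothetical) cloud bucket.
raw = (
    spark.read.option("header", True)
    .csv("s3a://example-raw-zone/orders/2024-06-01/")
)

# Transform: type the columns, drop bad rows, and aggregate per customer per day.
orders = (
    raw.withColumn("amount", F.col("amount").cast("double"))
    .withColumn("order_ts", F.to_timestamp("order_ts"))
    .filter(F.col("order_id").isNotNull())
)
daily_totals = (
    orders.groupBy("customer_id", F.to_date("order_ts").alias("order_date"))
    .agg(F.sum("amount").alias("daily_amount"), F.count("*").alias("order_count"))
)

# Load: write partitioned Parquet into a (hypothetical) curated zone.
(
    daily_totals.write.mode("overwrite")
    .partitionBy("order_date")
    .parquet("s3a://example-curated-zone/daily_customer_totals/")
)

spark.stop()
```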
Posted 1 week ago
0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

JD for L&A Business Consultant

Working as part of the Consulting team, you will take part in engagements related to a wide range of topics. Some examples of domains in which you will support our clients include the following:
Proficient in Individual and Group Life Insurance concepts, different types of Annuity products, etc.
Proficient in different insurance plans - Qualified/Non-Qualified Plans, IRA, Roth IRA, CRA, SEP
Solid knowledge of the Policy Life cycle:
Illustrations/Quote/Rating
New Business & Underwriting
Policy Servicing and Administration
Billing & Payment
Claims Processing
Disbursement (Systematic withdrawals, RMD, Surrenders)
Regulatory Changes & Taxation
Understanding of business rules of Pay-out
Demonstrated ability with Insurance Company Operations like Nonforfeiture options / Face amount increases and decreases / CVAT or GPT calculations / Dollar cost averaging, and the ability to perform their respective transactions.
Understanding of upstream and downstream interfaces for the policy lifecycle

Consulting Skills – Experience in creating business process maps for future state architecture, creating a WBS for the overall conversion strategy, and the requirement refinement process in multi-vendor engagements.
Worked on multiple business transformation and modernization programs.
Conducted multiple due-diligence and assessment projects as part of transformation roadmaps to evaluate current state maturity, gaps in functionalities and COTS solution features.
Requirements gathering and elicitation – writing BRDs, FSDs. Conducting JAD sessions and workshops to capture requirements and working closely with the Product Owner.
Work with the client to define the most optimal future state operational process and related product configuration.
Define scope by providing innovative solutions and challenging all new client requirements and change requests, while simultaneously ensuring that the client gets the required business value.
Elaborate and deliver clearly defined requirement documents with relevant dataflow and process flow diagrams.
Work closely with the product design development team to analyse and extract functional enhancements.
Provide product consultancy and assist the client with acceptance criteria gathering and support throughout the project life cycle.

Technology Skills - Proficient in technology solution architecture, with a focus on designing innovative and effective solutions.
Experienced in data migration projects, ensuring seamless transfer of data between systems while maintaining data integrity and security.
Skilled in data analytics, utilizing various tools and techniques to extract insights and drive informed decision-making.
Strong understanding of data governance principles and best practices, ensuring data quality and compliance.
Collaborative team player, able to work closely with stakeholders and technical teams to define requirements and implement effective solutions.
Industry certifications (AAPA/LOMA) will be an added advantage.
Experience with these COTS products is preferable: FAST, ALIP, OIPA, wmA

We expect you to work effectively as a team member and build good relationships with the client. You will have the opportunity to expand your domain knowledge and skills and will be able to collaborate frequently with other EY professionals with a wide variety of expertise.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 1 week ago
14.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Job Role: DataFlow is looking to hire an experienced full-stack developer with rich experience in developing Node.js and React.js or Angular applications on Amazon's AWS platform for internal and external consumers. This highly responsible position involves using established work procedures to analyse, design, develop, implement, maintain, re-engineer and troubleshoot the applications, both legacy and new. Successful candidates should be focused on rapid, agile delivery of high-quality designs with an eye for the smaller details and a passion for over-delivering. In return you can expect a salary commensurate with your experience, and the freedom to grow your capabilities into leading-edge facets of technology.

Duties and responsibilities:
● Working with DataFlow's business analysis and project management team to fully comprehend requirements, to gather requirement stories and to develop solutions that accurately meet the design specs.
● Delivering innovative and well-constructed technology solutions that meet the needs of today, but are envisioned for future use
● Offering experienced points of view to the remainder of the technology team
● Developing the highest quality code with associated commentary and documentation
● Ensure that data and application security are considered at the very outset of development and through the lifecycle of deployment.
● Respond quickly to major incidents and outages, providing immediate workarounds where business is impacted

Key skills/requirements:
● Design, develop, and maintain high-quality full-stack applications using Node.js for back-end and React.js for front-end.
● Architect and implement microservices with a focus on scalability and performance.
● Develop and manage RESTful APIs using frameworks like Express.js or Meteor.
● Optimize application performance and troubleshoot production issues.
● Collaborate with cross-functional teams to define, design, and deliver new features.
● Ensure code quality by implementing best practices, including testing, documentation, and continuous integration/deployment (CI/CD).
● Work within Amazon AWS architecture, utilizing services such as Lambda, S3, and RDS.
● Manage and maintain relational databases such as Oracle, MySQL, POSTGRES, or their AWS RDS equivalents.
● Document and test APIs using tools like Swagger or Postman.
● 14+ years of full-stack development experience using Node.js and React.js/Angular
● Hands-on experience with microservices architecture.
● Strong problem-solving skills and attention to detail.
● Excellent communication skills in English.
● Experience in agile development methodologies.
● Build user-friendly and responsive interfaces using React.js.
● Ensure seamless integration between front-end and back-end services.
Posted 1 week ago
5.0 - 8.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
We are seeking a skilled Senior Data Engineer to become a part of our dynamic team. In this role as a Senior Data Engineer, you will focus on projects involving data integration and ETL processes tailored for cloud-based environments. Your main tasks will include crafting and executing sophisticated data solutions, while ensuring the integrity, accuracy, and accessibility of data.

Responsibilities
Design and execute sophisticated data solutions for cloud environments
Develop ETL workflows utilizing SQL, Python, and other pertinent technologies
Maintain data integrity, reliability, and accessibility for all relevant parties
Work with diverse teams to comprehend data integration needs and specifications
Create and manage documentation such as technical details, data flow charts, and data mappings
Enhance and monitor data integration workflows to boost performance and efficiency while maintaining data accuracy and integrity

Requirements
Bachelor's degree in Computer Science, Electrical Engineering, or a related field
5-8 years of experience in data engineering
Proficiency in cloud-native or Spark-based ETL tools like AWS Glue, Azure Data Factory, or GCP Dataflow
Strong understanding of SQL for data querying and manipulation
Familiarity with Snowflake for data warehousing
Background in cloud platforms such as AWS, GCP, or Azure for data storage and processing
Excellent problem-solving skills and attention to detail
Good verbal and written communication skills in English at a B2 level

Nice to have
Background in ETL using Python
Posted 1 week ago
8.0 - 13.0 years
0 Lacs
Pune, Maharashtra, India
On-site
We are seeking an experienced Lead Data Engineer to join our dynamic team. As a Lead Data Engineer, you will be responsible for designing, developing, and maintaining data integration solutions for our clients. You will lead a team of engineers to ensure the delivery of high-quality, scalable, and performant data integration solutions. This is an exciting opportunity for a seasoned data integration professional passionate about technology and who thrives in a fast-paced, dynamic environment.

Responsibilities
Design, develop, and maintain data integration solutions for clients
Lead a team of engineers to ensure the delivery of high-quality, scalable, and performant data integration solutions
Collaborate with cross-functional teams to understand business requirements and design data integration solutions that meet those requirements
Ensure data integration solutions are secure, reliable, and performant
Develop and maintain documentation, including technical specifications, data flow diagrams, and data mappings
Continuously learn and stay up-to-date with the latest data integration approaches and tools

Requirements
Bachelor's degree in Computer Science, Information Systems, or a related field
8-13 years of experience in data engineering, data integration, or a related field
Experience with cloud-native or Spark-based ETL tools such as AWS Glue, Azure Data Factory, or GCP Dataflow
Strong knowledge of SQL for querying and manipulating data
Experience with Snowflake for cloud data warehousing
Experience with at least one cloud platform such as AWS, Azure, or GCP
Experience leading a team of engineers on data integration projects
Good verbal and written communication skills in English at a B2 level

Nice to have
Experience with ETL using Python
Posted 1 week ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Help shape the future of mobility. At Aptiv, we couldn’t solve mobility’s toughest challenges without our Corporate team. They ensure operations run smoothly by supporting more than 200,000 Aptiv employees and providing the direction and guidance needed as we strive to make the world safer, greener and more connected.

IT Data Analytics is a diverse DevOps team of technology enthusiasts enabling our global business. Aptiv has embarked on a Data strategy that focuses on establishing a strong technology team, enterprise data management & cloud-based business solutions. Our team is charged with catalyzing value creation in the most critical areas of Aptiv’s value chain, touching our business by understanding customer demand, manufacturing implications, and our supply base. As a Data Engineer, you will design, develop and implement a cost-effective, scalable, reusable and secured ingestion framework. You will take advantage of the opportunity to work with business leaders, various stakeholders, and source system SMEs to understand and define the business needs, translate them to technical specifications, and ingest data into Google Cloud Platform, BigQuery. You will design and implement processes for data ingestion, transformation, storage, analysis, modelling, reporting, monitoring, availability, governance and security of high volumes of structured and unstructured data.

Want to join us?

Your Role
Pipeline Design & Implementation: Develop and deploy high-throughput data pipelines using the latest GCP technologies.
Subject Matter Expertise: Serve as a specialist in data engineering and Google Cloud Platform (GCP) data technologies.
Client Communication: Leverage your GCP data engineering experience to engage with clients, understand their requirements, and translate these into technical data solutions.
Technical Translation: Analyze business requirements and convert them into technical specifications. Create source-to-target mappings, enhance ingestion frameworks to incorporate internal and external data sources, and transform data according to business rules.
Data Cataloging: Develop capabilities to support enterprise-wide data cataloging.
Security & Privacy: Design data solutions with a focus on security and privacy.
Agile & DataOps: Utilize Agile and DataOps methodologies and implementation strategies in project delivery.

Your Background
Bachelor’s or Master’s degree in any one of the disciplines: Computer Science, Data & Analytics or similar relevant subjects.
4+ years of hands-on IT experience in a similar role.
Proven expertise in SQL – subqueries, aggregations, functions, triggers, indexes, DB optimization, creating/understanding relational data-based models.
Deep experience working with Google Data Products (e.g. BigQuery, Dataproc, Dataplex, Looker, Cloud Data Fusion, Data Catalog, Dataflow, Cloud Composer, Analytics Hub, Pub/Sub, Dataprep, Cloud Bigtable, Cloud SQL, Cloud IAM, Google Kubernetes Engine, AutoML).
Experience in Qlik Replicate, Spark (Scala/Python/Java) and Kafka.
Excellent written and verbal skills to communicate technical solutions to business teams.
Understanding of trends, new concepts, industry standards and new technologies in the Data and Analytics space.
Ability to work with globally distributed teams.
Knowledge of statistical methods and data modelling.
Working knowledge in designing and creating Tableau/Qlik/Power BI dashboards, Alteryx and Informatica Data Quality.

Why join us?
You can grow at Aptiv. Aptiv provides an inclusive work environment where all individuals can grow and develop, regardless of gender, ethnicity or beliefs.
You can have an impact. Safety is a core Aptiv value; we want a safer world for us and our children, one with: Zero fatalities, Zero injuries, Zero accidents.
You have support. We ensure you have the resources and support you need to take care of your family and your physical and mental health with a competitive health insurance package.

Your Benefits At Aptiv
Benefits/Perks: Personal holidays, Healthcare, Pension, Tax saver scheme, Free Onsite Breakfast, Discounted Corporate Gym Membership.
Multicultural environment
Learning, professional growth and development in a world-recognized international environment.
Access to internal & external training, coaching & certifications.
Recognition for innovation and excellence.
Access to transportation: Grand Canal Dock is well-connected to public transportation, including DART trains, buses, and bike-sharing services, making it easy to get to and from the area.

Privacy Notice - Active Candidates: https://www.aptiv.com/privacy-notice-active-candidates

Aptiv is an equal employment opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, national origin, sex, gender identity, sexual orientation, disability status, protected veteran status or any other characteristic protected by law.
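The ingestion work described in the Aptiv posting above centres on landing source extracts in BigQuery. Below is a minimal sketch of that pattern using the google-cloud-bigquery client; the project, dataset, table, and Cloud Storage URI are hypothetical assumptions, not details from the posting.

```python
# Minimal ingestion sketch into BigQuery; the project, dataset, table and GCS URI are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")          # assumed project id

table_id = "example-project.supply_chain.customer_demand"    # assumed destination table
job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,
    autodetect=True,                                          # let BigQuery infer the schema
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
)

# Load a daily extract that has already been staged to Cloud Storage.
load_job = client.load_table_from_uri(
    "gs://example-raw-bucket/demand/2024-06-01/*.csv",
    table_id,
    job_config=job_config,
)
load_job.result()  # wait for the load job to complete

table = client.get_table(table_id)
print(f"Loaded {table.num_rows} rows into {table_id}")
```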
Posted 1 week ago
5.0 - 8.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Our company is looking for an experienced Senior Data Engineer to join our team. As a Senior Data Engineer, you will be working on a project that focuses on data integration and ETL for cloud-based platforms. You will be responsible for designing and implementing complex data solutions, ensuring that the data is accurate, reliable, and easily accessible.

Responsibilities
Design and implement complex data solutions for cloud-based platforms
Develop ETL processes using SQL, Python, and other relevant technologies
Ensure that data is accurate, reliable, and easily accessible for all stakeholders
Collaborate with cross-functional teams to understand data integration needs and requirements
Develop and maintain documentation, including technical specifications, data flow diagrams, and data mappings
Monitor and optimize data integration processes for performance and efficiency, ensuring data accuracy and integrity

Requirements
Bachelor's degree in Computer Science, Electrical Engineering, or a related field
5-8 years of experience in data engineering
Experience with cloud-native or Spark-based ETL tools such as AWS Glue, Azure Data Factory, or GCP Dataflow
Strong knowledge of SQL for data querying and manipulation
Experience with Snowflake for data warehousing
Experience with cloud platforms such as AWS, GCP, or Azure for data storage and processing
Excellent problem-solving skills and attention to detail
Good verbal and written communication skills in English at a B2 level

Nice to have
Experience with ETL using Python
Posted 1 week ago
6.0 years
0 Lacs
Udaipur, Rajasthan, India
On-site
Job Description: We are looking for a highly skilled and experienced Data Engineer with 4–6 years of hands-on experience in designing and implementing robust, scalable data pipelines and infrastructure. The ideal candidate will be proficient in SQL and Python and have a strong understanding of modern data engineering practices. You will play a key role in building and optimizing data systems, enabling data accessibility and analytics across the organization, and collaborating closely with cross-functional teams including Data Science, Product, and Engineering.

Key Responsibilities:
· Design, develop, and maintain scalable ETL/ELT data pipelines using SQL and Python
· Collaborate with data analysts, data scientists, and product teams to understand data needs
· Optimize queries and data models for performance and reliability
· Integrate data from various sources, including APIs, internal databases, and third-party systems
· Monitor and troubleshoot data pipelines to ensure data quality and integrity
· Document processes, data flows, and system architecture
· Participate in code reviews and contribute to a culture of continuous improvement

Required Skills:
· 4–6 years of experience in data engineering, data architecture, or backend development with a focus on data
· Strong command of SQL for data transformation and performance tuning
· Experience with Python (e.g., pandas, Spark, ADF)
· Solid understanding of ETL/ELT processes and data pipeline orchestration
· Proficiency with RDBMS (e.g., PostgreSQL, MySQL, SQL Server)
· Experience with data warehousing solutions (e.g., Snowflake, Redshift, BigQuery)
· Familiarity with version control (Git), CI/CD workflows, and containerized environments (Docker, Kubernetes)
· Basic programming skills
· Excellent problem-solving skills and a passion for clean, efficient data systems

Preferred Skills:
· Experience with cloud platforms (AWS, Azure, GCP) and services like S3, Glue, Dataflow, etc.
· Exposure to enterprise solutions (e.g., Databricks, Synapse)
· Knowledge of big data technologies (e.g., Spark, Kafka, Hadoop)
· Background in real-time data streaming and event-driven architectures
· Understanding of data governance, security, and compliance best practices
· Prior experience working in an agile development environment

Educational Qualifications:
Bachelor's degree in Computer Science, Information Technology, or a related field.
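The pipeline orchestration this posting refers to is commonly handled with a scheduler such as Apache Airflow. Here is a minimal, hypothetical DAG sketch (assuming Airflow 2.4+); the DAG id, schedule, and task bodies are illustrative placeholders only.

```python
# Minimal orchestration sketch with Apache Airflow; DAG id, schedule and task logic are illustrative.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    # In a real pipeline this would pull from an API or source database.
    print("extracting source data")

def transform():
    print("cleaning and modelling the extract")

def load():
    print("loading into the warehouse")

with DAG(
    dag_id="example_daily_elt",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",   # `schedule` assumes Airflow 2.4+
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    t_extract >> t_transform >> t_load
```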
Posted 1 week ago
7.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Position Overview
Job Title - Data Platform Engineer - Tech Lead
Location - Pune, India

Role Description
DB Technology is a global team of tech specialists, spread across multiple trading hubs and tech centers. We have a strong focus on promoting technical excellence – our engineers work at the forefront of financial services innovation using cutting-edge technologies. DB Pune location plays a prominent role in our global network of tech centers; it is well recognized for its engineering culture and strong drive to innovate. We are committed to building a diverse workforce and to creating excellent opportunities for talented engineers and technologists. Our tech teams and business units use agile ways of working to create the best solutions for the financial markets.

CB Data Services and Data Platform
We are seeking an experienced Software Engineer with strong leadership skills to join our dynamic tech team. In this role, you will lead a group of engineers working on cutting-edge technologies in Hadoop, Big Data, GCP, Terraform, BigQuery, Dataproc and data management. You will be responsible for overseeing the development of robust data pipelines, ensuring data quality, and implementing efficient data management solutions. Your leadership will be critical in driving innovation, ensuring high standards in data infrastructure, and mentoring team members. Your responsibilities will include working closely with data engineers, analysts, cross-functional teams, and other stakeholders to ensure that our data platform meets the needs of our organization and supports our data-driven initiatives. Join us in building and scaling our tech solutions, including a hybrid data platform, to unlock new insights and drive business growth. If you are passionate about data engineering, we want to hear from you!

Deutsche Bank’s Corporate Bank division is a leading provider of cash management, trade finance and securities finance. We complete green-field projects that deliver the best Corporate Bank - Securities Services products in the world. Our team is diverse, international, and driven by shared focus on clean code and valued delivery. At every level, agile minds are rewarded with competitive pay, support, and opportunities to excel. You will work as part of a cross-functional agile delivery team. You will bring an innovative approach to software development, focusing on using the latest technologies and practices, as part of a relentless focus on business value. You will be someone who sees engineering as team activity, with a predisposition to open code, open discussion and creating a supportive, collaborative environment. You will be ready to contribute to all stages of software delivery, from initial analysis right through to production support.

What We’ll Offer You
As part of our flexible scheme, here are just some of the benefits that you’ll enjoy:
Best in class leave policy
Gender neutral parental leaves
100% reimbursement under childcare assistance benefit (gender neutral)
Sponsorship for industry-relevant certifications and education
Employee Assistance Program for you and your family members
Comprehensive Hospitalization Insurance for you and your dependents
Accident and Term Life Insurance
Complementary health screening for 35 yrs. and above

Your Key Responsibilities
Technical Leadership:
Lead a cross-functional team of engineers in the design, development, and implementation of on-prem and cloud-based data solutions.
Provide hands-on technical guidance and mentorship to team members, fostering a culture of continuous learning and improvement.
Collaborate with product management and stakeholders to define technical requirements and establish delivery priorities.
Architectural and Design Capabilities:
Architect and implement scalable, efficient, and reliable data management solutions to support complex data workflows and analytics.
Evaluate and recommend tools, technologies, and best practices to enhance the data platform.
Drive the adoption of microservices, containerization, and serverless architectures within the team.
Quality Assurance:
Establish and enforce best practices in coding, testing, and deployment to maintain high-quality code standards.
Oversee code reviews and provide constructive feedback to promote code quality and team growth.

Your Skills And Experience
Technical Skills:
Bachelor's or Master’s degree in Computer Science, Engineering, or related field.
7+ years of experience in software engineering, with a focus on Big Data and GCP technologies such as Hadoop, PySpark, Terraform, BigQuery, Dataproc and data management.
Proven experience in leading software engineering teams, with a focus on mentorship, guidance, and team growth.
Strong expertise in designing and implementing data pipelines, including ETL processes and real-time data processing.
Hands-on experience with Hadoop ecosystem tools such as HDFS, MapReduce, Hive, Pig, and Spark.
Hands-on experience with a cloud platform, particularly Google Cloud Platform (GCP), and its data management services (e.g., Terraform, BigQuery, Cloud Dataflow, Cloud Dataproc, Cloud Storage).
Solid understanding of data quality management and best practices for ensuring data integrity.
Familiarity with containerization and orchestration tools such as Docker and Kubernetes is a plus.
Excellent problem-solving skills and the ability to troubleshoot complex systems.
Strong communication skills and the ability to collaborate with both technical and non-technical stakeholders.
Leadership Abilities:
Proven experience in leading technical teams, with a track record of delivering complex projects on time and within scope.
Ability to inspire and motivate team members, promoting a collaborative and innovative work environment.
Strong problem-solving skills and the ability to make data-driven decisions under pressure.
Excellent communication and collaboration skills.
Proactive mindset, attention to detail, and constant desire to improve and innovate.

How We’ll Support You
Training and development to help you excel in your career
Coaching and support from experts in your team
A culture of continuous learning to aid progression
A range of flexible benefits that you can tailor to suit your needs

About Us And Our Teams
Please visit our company website for further information: https://www.db.com/company/company.htm
We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.
Posted 1 week ago
7.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
Responsibilities include:
• Design and implement scalable, secure, and cost-effective data architectures using GCP.
• Lead the design and development of data pipelines with BigQuery, Dataflow, and Cloud Storage.
• Architect and implement data lakes, data warehouses, and real-time data processing solutions on GCP.
• Ensure data architecture aligns with business goals, governance, and compliance requirements.
• Collaborate with stakeholders to define data strategy and roadmap.
• Design and deploy BigQuery solutions for optimized performance and cost efficiency.
• Build and maintain ETL/ELT pipelines for large-scale data processing.
• Leverage Cloud Pub/Sub, Dataflow, and Cloud Functions for real-time data integration.

Requirements
• 7+ years of experience in data architecture, with at least 3 years in GCP environments.
• Expertise in BigQuery, Cloud Dataflow, Cloud Pub/Sub, Cloud Storage, and related GCP services.
• Strong experience in data warehousing, data lakes, and real-time data pipelines.
• Proficiency in SQL, Python, or other data processing languages.
• Experience with cloud security, data governance, and compliance frameworks.
• Strong problem-solving skills and ability to architect solutions for complex data environments.
• Google Cloud Certification (Professional Data Engineer, Professional Cloud Architect) preferred.
• Leadership experience and ability to mentor technical teams.
• Excellent communication and collaboration skills.
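For the real-time integration pattern this posting describes (Cloud Pub/Sub into BigQuery via Dataflow), here is a minimal Apache Beam sketch; the subscription, destination table, and event schema are hypothetical assumptions rather than details from the posting.

```python
# Minimal streaming sketch of the Pub/Sub -> Dataflow -> BigQuery pattern described above.
# The subscription, table and schema below are hypothetical assumptions.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

def parse_event(message: bytes) -> dict:
    # Each Pub/Sub message is assumed to be a small JSON event.
    event = json.loads(message.decode("utf-8"))
    return {"event_id": event["id"], "user_id": event["user"], "amount": float(event["amount"])}

options = PipelineOptions(streaming=True)  # add --runner=DataflowRunner, --project, --region to deploy

with beam.Pipeline(options=options) as pipeline:
    (
        pipeline
        | "ReadEvents" >> beam.io.ReadFromPubSub(subscription="projects/example-project/subscriptions/events-sub")
        | "Parse" >> beam.Map(parse_event)
        | "WriteToBQ" >> beam.io.WriteToBigQuery(
            table="example-project:analytics.events",
            schema="event_id:STRING,user_id:STRING,amount:FLOAT",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
        )
    )
```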
Posted 1 week ago
7.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Position Overview
Job Title: Production Support Analyst, AVP
Location: Bangalore, India

Role Description
You will be operating within the Production Services team of the Securities Services domain, which is a subdivision of Corporate Bank Production Services, as a Production Support Engineer. In this role, you will be accountable for the following:
Resolve user support requests, troubleshooting functional, application, and infrastructure incidents in the production environment.
Work on identified initiatives to automate manual work, application and infrastructure monitoring improvements and platform hygiene.
Eyes-on-glass monitoring of services and batch.
Preparing and fulfilling data requests.
Participation in incident, change and problem management meetings as required.

Deutsche Bank’s Corporate Bank division is a leading provider of cash management, trade finance and securities finance. We complete green-field projects that deliver the best Corporate Bank - Securities Services products in the world. Our team is diverse, international, and driven by shared focus on clean code and valued delivery. At every level, agile minds are rewarded with competitive pay, support, and opportunities to excel. You will work as part of a cross-functional agile delivery team. You will bring an innovative approach to software development, focusing on using the latest technologies and practices, as part of a relentless focus on business value. You will be someone who sees engineering as team activity, with a predisposition to open code, open discussion and creating a supportive, collaborative environment. You will be ready to contribute to all stages of software delivery, from initial analysis right through to production support.

What We’ll Offer You
As part of our flexible scheme, here are just some of the benefits that you’ll enjoy:
Best in class leave policy.
Gender neutral parental leaves
100% reimbursement under childcare assistance benefit (gender neutral)
Sponsorship for industry-relevant certifications and education
Employee Assistance Program for you and your family members
Comprehensive Hospitalization Insurance for you and your dependents
Accident and Term Life Insurance
Complementary health screening for 35 yrs. and above

Your Key Responsibilities
Provide hands-on technical support for a suite of applications/platforms within Deutsche Bank.
Build up technical subject matter expertise on the applications/platforms being supported, including business flows, the application architecture and the hardware configuration.
Resolve service requests submitted by the application end users to the best of L2 ability and escalate any issues that cannot be resolved to L3.
Conduct real-time monitoring to ensure application SLAs are achieved and maximum application availability (uptime).
Ensure all knowledge is documented and that support runbooks and knowledge articles are kept up to date.
Approach support with a proactive attitude, working to improve the environment before issues occur.
Update the Run Book and KEDB as and when required.
Participate in all BCP and component failure tests based on the run books.
Understand the flow of data through the application infrastructure. It is critical to understand the dataflow so as to best provide operational support.

Your Skills And Experience
Must Have:
Programming Language - Java
Operating systems - UNIX, Windows and the underlying infrastructure environments.
Middleware - (e.g. MQ, Kafka or similar), WebLogic
Webserver environment - Apache, Tomcat
Database - Oracle, MS-SQL, Sybase, NoSQL
Batch Monitoring - Control-M/Autosys
Scripting - UNIX shell and PowerShell, PERL, Python
Monitoring Tools – Geneos or AppDynamics or Dynatrace or Grafana
ITIL Service Management framework such as Incident, Problem, and Change processes.
Preferably knowledge and experience on GCP.
Nice to Have:
7+ years of experience in IT in large corporate environments, specifically in the area of controlled production environments or in Financial Services Technology in a client-facing function
Good analytical and problem-solving skills
ITIL / best practice service context. ITIL Foundation is a plus.
Ticketing tool experience – Service Desk, ServiceNow.
Understanding of SRE concepts (SLAs, SLOs, SLIs)
Knowledge and development experience in Ansible automation.
Working knowledge of one cloud platform (AWS or GCP).
Excellent communication skills, both written and verbal, with attention to detail.
Ability to work in virtual teams and in matrix structures.

How We’ll Support You
Training and development to help you excel in your career.
Coaching and support from experts in your team.
A culture of continuous learning to aid progression.
A range of flexible benefits that you can tailor to suit your needs.

About Us And Our Teams
Please visit our company website for further information: https://www.db.com/company/company.htm
We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.
Posted 1 week ago
15.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Position Overview
Job Title: Senior Engineer, VP
Location: Pune, India

Role Description
Engineer is responsible for managing or performing work across multiple areas of the bank's overall IT Platform/Infrastructure, including analysis, development, and administration. It may also involve taking functional oversight of engineering delivery for specific departments. Work includes:
Planning and developing entire engineering solutions to accomplish business goals
Building reliability and resiliency into solutions with appropriate testing and reviewing throughout the delivery lifecycle
Ensuring maintainability and reusability of engineering solutions
Ensuring solutions are well architected and can be integrated successfully into the end-to-end business process flow
Reviewing engineering plans and quality to drive re-use and improve engineering capability
Participating in industry forums to drive adoption of innovative technologies, tools and solutions in the Bank

Deutsche Bank’s Corporate Bank division is a leading provider of cash management, trade finance and securities finance. We complete green-field projects that deliver the best Corporate Bank - Securities Services products in the world. Our team is diverse, international, and driven by shared focus on clean code and valued delivery. At every level, agile minds are rewarded with competitive pay, support, and opportunities to excel. You will work as part of a cross-functional agile delivery team. You will bring an innovative approach to software development, focusing on using the latest technologies and practices, as part of a relentless focus on business value. You will be someone who sees engineering as team activity, with a predisposition to open code, open discussion and creating a supportive, collaborative environment. You will be ready to contribute to all stages of software delivery, from initial analysis right through to production support.

What We’ll Offer You
As part of our flexible scheme, here are just some of the benefits that you’ll enjoy:
Best in class leave policy.
Gender neutral parental leaves
100% reimbursement under childcare assistance benefit (gender neutral)
Sponsorship for industry-relevant certifications and education
Employee Assistance Program for you and your family members
Comprehensive Hospitalization Insurance for you and your dependents
Accident and Term Life Insurance
Complementary health screening for 35 yrs. and above

Your Key Responsibilities
The candidate is expected to:
Be a hands-on engineering lead involved in analysis, design, design/code reviews, coding and release activities
Champion engineering best practices and guide/mentor the team to achieve high performance.
Work closely with Business stakeholders, Tribe Lead, Product Owner, and Lead Architect to successfully deliver the business outcomes.
Acquire functional knowledge of the business capability being digitized/re-engineered.
Demonstrate ownership, inspire others, innovative thinking, growth mindset and collaborate for success.

Your Skills And Experience
Minimum 15 years of IT industry experience in full-stack development
Expert in Java, Spring Boot, NodeJS, ReactJS
Strong experience in Big Data processing – Apache Spark, Hadoop, BigQuery, Dataproc, Dataflow etc.
Strong experience in Kubernetes, OpenShift container platform
Experience in data streaming, i.e. Kafka, Pub/Sub etc.
Experience of working on public cloud – GCP preferred, AWS or Azure
Knowledge of various distributed/multi-tiered architecture styles – Microservices, Data mesh, Integration patterns etc.
Experience of modern software product delivery practices, processes and tooling, and BizDevOps skills such as CI/CD pipelines using Jenkins, Git Actions etc.
Experience in leading teams and mentoring developers

Key Skills:
Java
Spring Boot
NodeJS
SQL/PLSQL
ReactJS

Advantageous:
Having prior experience in the Banking/Finance domain
Having worked on hybrid cloud solutions, preferably using GCP
Having worked on product development

How We’ll Support You
Training and development to help you excel in your career.
Coaching and support from experts in your team.
A culture of continuous learning to aid progression.
A range of flexible benefits that you can tailor to suit your needs.

About Us And Our Teams
Please visit our company website for further information: https://www.db.com/company/company.htm
We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.
Posted 1 week ago
0 years
0 Lacs
Kolkata, West Bengal, India
On-site
Azure Data Engineer – Databricks

Required Technical Skill Set: Data Lake architecture, Azure Services – ADLS, ADF, Azure Databricks, Synapse

Build the solution for optimal extraction, transformation, and loading of data from a wide variety of data sources using Azure data ingestion and transformation components. The following technology skills are required:
Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL), as well as working familiarity with a variety of databases.
Experience with ADF, Dataflow
Experience with big data tools like Delta Lake, Azure Databricks
Experience with Synapse
Skills in designing an Azure Data Solution
Assemble large, complex data sets that meet functional / non-functional business requirements.
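As a sketch of the kind of Azure Databricks work this posting lists — reading raw files from ADLS and writing them to a Delta Lake table — the following minimal PySpark example may help; the storage paths, column names, and credentials setup are hypothetical assumptions.

```python
# Minimal Delta Lake ingestion sketch for Azure Databricks; paths and columns are illustrative.
# Assumes the cluster already has ADLS credentials configured (e.g. via a service principal).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # on Databricks, `spark` is already provided

raw_path = "abfss://raw@exampleaccount.dfs.core.windows.net/sales/2024-06-01/"    # assumed ADLS path
delta_path = "abfss://curated@exampleaccount.dfs.core.windows.net/delta/sales/"   # assumed Delta location

# Extract and lightly transform the raw files.
sales = (
    spark.read.option("header", True).csv(raw_path)
    .withColumn("amount", F.col("amount").cast("double"))
    .withColumn("ingest_date", F.current_date())
)

# Load into a Delta table, partitioned by ingestion date.
(
    sales.write.format("delta")
    .mode("append")
    .partitionBy("ingest_date")
    .save(delta_path)
)
```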
Posted 1 week ago
6.0 - 10.0 years
0 Lacs
Delhi, India
On-site
Description
Join GlobalLogic to be a vital part of the team working on a huge software project for a world-class company providing M2M / IoT 4G/5G modules, e.g. to the automotive, healthcare and logistics industries. Through our engagement, we contribute to our customer in developing the end-user modules’ firmware, implementing new features, maintaining compatibility with the newest telecommunication and industry standards, as well as performing analysis and estimations of the customer requirements.

Requirements
BA/BS degree in Computer Science, Mathematics or related technical field, or equivalent practical experience.
Experience in Cloud SQL and Cloud Bigtable
Experience in Dataflow, BigQuery, Dataproc, Datalab, Dataprep, Pub/Sub and Genomics
Experience in Google Transfer Appliance, Cloud Storage Transfer Service, BigQuery Data Transfer
Experience with data processing software (such as Hadoop, Kafka, Spark, Pig, Hive) and with data processing algorithms (MapReduce, Flume).
Experience working with technical customers.
Experience in writing software in one or more languages such as Java, Python
6-10 years of relevant consulting, industry or technology experience
Strong problem solving and troubleshooting skills
Strong communicator

Job responsibilities
Experience working with data warehouses, including data warehouse technical architectures, infrastructure components, ETL/ELT and reporting/analytic tools and environments.
Experience in technical consulting.
Experience architecting, developing software, or internet-scale production-grade Big Data solutions in virtualized environments such as Google Cloud Platform (mandatory) and AWS/Azure (good to have)
Experience working with big data, information retrieval, data mining or machine learning, as well as experience in building multi-tier high-availability applications with modern web technologies (such as NoSQL, Kafka, NLP, MongoDB, SparkML, TensorFlow).
Working knowledge of ITIL and/or agile methodologies
Google Data Engineer certified

What we offer
Culture of caring. At GlobalLogic, we prioritize a culture of caring. Across every region and department, at every level, we consistently put people first. From day one, you’ll experience an inclusive culture of acceptance and belonging, where you’ll have the chance to build meaningful connections with collaborative teammates, supportive managers, and compassionate leaders.
Learning and development. We are committed to your continuous learning and development. You’ll learn and grow daily in an environment with many opportunities to try new things, sharpen your skills, and advance your career at GlobalLogic. With our Career Navigator tool as just one example, GlobalLogic offers a rich array of programs, training curricula, and hands-on opportunities to grow personally and professionally.
Interesting & meaningful work. GlobalLogic is known for engineering impact for and with clients around the world. As part of our team, you’ll have the chance to work on projects that matter. Each is a unique opportunity to engage your curiosity and creative problem-solving skills as you help clients reimagine what’s possible and bring new solutions to market. In the process, you’ll have the privilege of working on some of the most cutting-edge and impactful solutions shaping the world today.
Balance and flexibility. We believe in the importance of balance and flexibility. With many functional career areas, roles, and work arrangements, you can explore ways of achieving the perfect balance between your work and life. Your life extends beyond the office, and we always do our best to help you integrate and balance the best of work and life, having fun along the way!
High-trust organization. We are a high-trust organization where integrity is key. By joining GlobalLogic, you’re placing your trust in a safe, reliable, and ethical global company. Integrity and trust are a cornerstone of our value proposition to our employees and clients. You will find truthfulness, candor, and integrity in everything we do.

About GlobalLogic
GlobalLogic, a Hitachi Group Company, is a trusted digital engineering partner to the world’s largest and most forward-thinking companies. Since 2000, we’ve been at the forefront of the digital revolution – helping create some of the most innovative and widely used digital products and experiences. Today we continue to collaborate with clients in transforming businesses and redefining industries through intelligent products, platforms, and services.
Posted 1 week ago
0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Where: Hyderabad/ Bengaluru, India (Hybrid Mode 3 Days/Week in Office) Job Description Collaborate with stakeholders to develop a data strategy that meets enterprise needs and industry requirements. Create an inventory of the data necessary to build and implement a data architecture. Envision data pipelines and how data will flow through the data landscape. Evaluate current data management technologies and what additional tools are needed. Determine upgrades and improvements to current data architectures. Design, document, build and implement database architectures and applications. Should have hands-on experience in building high scale OLAP systems. Build data models for database structures, analytics, and use cases. Develop and enforce database development standards with solid DB/ Query optimizations capabilities. Integrate new systems and functions like security, performance, scalability, governance, reliability, and data recovery. Research new opportunities and create methods to acquire data. Develop measures that ensure data accuracy, integrity, and accessibility. Continually monitor, refine, and report data management system performance. Required Qualifications And Skillset Extensive knowledge of Azure, GCP clouds, and DataOps Data Eco-System (super strong in one of the two clouds and satisfactory in the other one) Hands-on expertise in systems like Snowflake, Synapse, SQL DW, BigQuery, and Cosmos DB. (Expertise in any 3 is a must) Azure Data Factory, Dataiku, Fivetran, Google Cloud Dataflow (Any 2) Hands-on experience in working with services/technologies like - Apache Airflow, Cloud Composer, Oozie, Azure Data Factory, and Cloud Data Fusion (Expertise in any 2 is required) Well-versed with Data services, integration, ingestion, ELT/ETL, Data Governance, Security, and Meta-driven Development. Expertise in RDBMS (relational database management system) – writing complex SQL logic, DB/Query optimization, Data Modelling, and managing high data volume for mission-critical applications. Strong grip on programming using Python and PySpark. Clear understanding of data best practices prevailing in the industry. Preference to candidates having Azure or GCP architect certification. (Either of the two would suffice) Strong networking and data security experience. Awareness Of The Following Application development understanding (Full Stack) Experience on open-source tools like Kafka, Spark, Splunk, Superset, etc. Good understanding of Analytics Platform Landscape that includes AI/ML Experience in any Data Visualization tool like PowerBI / Tableau / Qlik /QuickSight etc. About Us Gramener is a design-led data science company. We build custom Data & AI solutions that help solve complex business problems with actionable insights and compelling data stories. We partner with enterprise data and digital transformation teams to improve the data-driven decision-making culture across the organization. Our open standard low-code platform, Gramex, rapidly builds engaging Data & AI solutions across multiple business verticals and use cases. Our solutions and technology have been recognized by analysts such as Gartner and Forrester and have won several awards. We Offer You a chance to try new things & take risks. meaningful problems you'll be proud to solve. people you will be comfortable working with. transparent and innovative work environment. To know more about us visit Gramener Website and Gramener Blog. If anyone looking for the same, kindly share below mentioned details. 
Total Experience: Relevant Experience: Current CTC: Notice Period: Expected CTC: Current Location: Skills: OLAP, Microsoft Azure, Architecting Show more Show less
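To illustrate the Airflow/Cloud Composer orchestration this posting asks about, below is a minimal sketch of a daily DAG that runs a BigQuery job; the DAG id, dataset, table, and schedule are assumptions for illustration only.

from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

# Dataset, table, and schedule below are illustrative placeholders.
DAILY_REVENUE_SQL = """
CREATE OR REPLACE TABLE analytics.revenue_yesterday AS
SELECT order_date, SUM(amount) AS revenue
FROM raw.orders
WHERE order_date = DATE_SUB(CURRENT_DATE(), INTERVAL 1 DAY)
GROUP BY order_date
"""

with DAG(
    dag_id="daily_revenue_load",
    start_date=datetime(2024, 1, 1),
    schedule_interval="0 2 * * *",  # every day at 02:00
    catchup=False,
) as dag:
    load_revenue = BigQueryInsertJobOperator(
        task_id="load_revenue",
        configuration={"query": {"query": DAILY_REVENUE_SQL, "useLegacySql": False}},
    )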
Posted 1 week ago
4.0 - 7.0 years
0 Lacs
Karnataka, India
On-site
Base Location: Bengaluru Preferred Industry: Retail, Textile and Apparel, FMCG, Logistics Minimum Qualification: Postgraduate Preferred Experience: 4 to 7 years of relevant experience Key Result Areas Enable seamless inventory actuals data visibility: automations enabled; integrate dataflow from SAP and other systems. Inventory planning modules for better tracking: inventory PPM modules and supporting dashboards built for reviewing deviations. Enable technology usage: NLP reports built to enhance system usage and drive easy, on-the-go knowledge accessibility. Integrate demand and supply planning: system built to ensure the demand-to-supply flow is captured at the most granular level. Drive system usage: build improvements are taken up, built, and tested to ensure 100% user adoption. Help reduce manual Excel effort: use available system resources to build dashboards that can be auto-triggered as alerts and mailers. Show more Show less
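The last key result area, dashboards that auto-trigger as alerts and mailers, can be sketched in a few lines of Python. Everything here (file name, threshold, addresses, SMTP host) is hypothetical and only meant to show the shape of such an automation.

import smtplib
from email.message import EmailMessage

import pandas as pd

# Illustrative values; a real job would read from the planning system, not a CSV.
DEVIATION_THRESHOLD = 0.10  # flag SKUs that are more than 10% off plan

inventory = pd.read_csv("inventory_actuals_vs_plan.csv")  # columns: sku, plan_qty, actual_qty
inventory["deviation"] = (inventory["actual_qty"] - inventory["plan_qty"]) / inventory["plan_qty"]
breaches = inventory[inventory["deviation"].abs() > DEVIATION_THRESHOLD]

if not breaches.empty:
    msg = EmailMessage()
    msg["Subject"] = f"Inventory deviation alert: {len(breaches)} SKUs off plan"
    msg["From"] = "planning-alerts@example.com"
    msg["To"] = "supply-planning@example.com"
    msg.set_content(breaches.to_string(index=False))
    with smtplib.SMTP("smtp.example.com") as server:
        server.send_message(msg)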
Posted 1 week ago
5.0 years
0 Lacs
Hyderabad, Telangana, India
Remote
#Dear Associates, #Hope you are doing well & safe! #Greetings from Rrootshell Technologiiss #We have URGENT & MULTIPLE requirements for a #Data Scientist position. #Work Mode: Remote work opportunity in INDIA only. #This is a FULL-TIME role. Job Description: Data Scientist / Data Analysis 5+ years of experience designing and implementing AI/ML solutions on Google Cloud Platform. Google Professional Machine Learning Engineer certification is preferred. Strong proficiency with GCP services such as Vertex AI, BigQuery, Cloud Dataflow, AI Platform, and AutoML. Hands-on experience with machine learning frameworks. Proficiency in programming languages such as Python. Experience with MLOps and CI/CD tools, including Cloud Build and Vertex AI Pipelines. Strong understanding of AI/ML algorithms, data structures, and model optimization techniques. Experience with containerization (Docker, Kubernetes) and orchestration using Google Kubernetes Engine (GKE). Hands-on experience with Generative AI applications and deploying them on any cloud. #Preference for IMMEDIATE JOINERS or a maximum 15 days' notice. If you are interested, kindly share your UPDATED resume with jobs@rrootshell.com LinkedIn ID: linkedin.com/in/ravi-kumar-rapalli-636878122 Regards Ravi Kumar Account Manager Show more Show less
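Since the role centres on Vertex AI, the sketch below shows what a basic AutoML tabular training run looks like with the google-cloud-aiplatform SDK. The project, bucket path, target column, and budget are invented for illustration and are not requirements from the posting.

from google.cloud import aiplatform

# Placeholder project, region, data source, and column names.
aiplatform.init(project="my-gcp-project", location="us-central1")

dataset = aiplatform.TabularDataset.create(
    display_name="churn-training-data",
    gcs_source=["gs://my-bucket/churn/train.csv"],
)

job = aiplatform.AutoMLTabularTrainingJob(
    display_name="churn-automl",
    optimization_prediction_type="classification",
)

model = job.run(
    dataset=dataset,
    target_column="churned",
    budget_milli_node_hours=1000,  # 1 node hour
)

endpoint = model.deploy(machine_type="n1-standard-4")
print("Deployed endpoint:", endpoint.resource_name)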
Posted 1 week ago
0 years
0 Lacs
Greater Kolkata Area
On-site
Line of Service Advisory Industry/Sector Not Applicable Specialism Data, Analytics & AI Management Level Senior Associate Job Description & Summary At PwC, our people in data and analytics focus on leveraging data to drive insights and make informed business decisions. They utilise advanced analytics techniques to help clients optimise their operations and achieve their strategic goals. In business intelligence at PwC, you will focus on leveraging data and analytics to provide strategic insights and drive informed decision-making for clients. You will develop and implement innovative solutions to optimise business performance and enhance competitive advantage. Why PWC At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us. At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations. Responsibilities Strong skills in Python and GCP services (including Google Composer, BigQuery, Google Storage). Strong expertise in writing SQL and PL/SQL in Oracle, MySQL, or any other relational database. Good to have skill: Data warehousing & ETL (any tool). Proven experience in using GCP services is preferred. Strong presentation and communication skills. Analytical & problem-solving skills. Mandatory Skill Sets GCP Data Engineer Preferred Skill Sets GCP Data Engineer Years Of Experience Required 4-8 Education Qualification BTech/MBA/MCA Education (if blank, degree and/or field of study not specified) Degrees/Field of Study required: Bachelor of Technology, Bachelor of Engineering, Master of Business Administration Degrees/Field Of Study Preferred Certifications (if blank, certifications not specified) Required Skills Data Engineering, GCP Dataflow Optional Skills Accepting Feedback, Active Listening, Analytical Thinking, Business Case Development, Business Data Analytics, Business Intelligence and Reporting Tools (BIRT), Business Intelligence Development Studio, Communication, Competitive Advantage, Continuous Process Improvement, Creativity, Data Analysis and Interpretation, Data Architecture, Database Management System (DBMS), Data Collection, Data Pipeline, Data Quality, Data Science, Data Visualization, Embracing Change, Emotional Regulation, Empathy, Inclusion, Industry Trend Analysis {+ 12 more} Desired Languages (If blank, desired languages not specified) Travel Requirements Not Specified Available for Work Visa Sponsorship? No Government Clearance Required? No Job Posting End Date Show more Show less
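As a minimal illustration of the Python-plus-BigQuery skills listed in the responsibilities, the snippet below runs a query through the official google-cloud-bigquery client with a simple cost guardrail. The project, dataset, and query are placeholders, not details from the posting.

from google.cloud import bigquery

client = bigquery.Client(project="my-gcp-project")  # placeholder project id

sql = """
SELECT customer_id, COUNT(*) AS orders
FROM `my-gcp-project.sales.orders`
WHERE order_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 30 DAY)
GROUP BY customer_id
ORDER BY orders DESC
LIMIT 10
"""

# Cap the bytes a query may scan as a basic cost guardrail.
job_config = bigquery.QueryJobConfig(maximum_bytes_billed=10 * 1024**3)  # 10 GiB
for row in client.query(sql, job_config=job_config).result():
    print(row.customer_id, row.orders)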
Posted 1 week ago
2.0 - 3.0 years
0 Lacs
India
On-site
Meridian Trade Links is urgently hiring experienced Nurses for Arab countries. Requirements: HRD authentication, Dataflow verification, and Prometric Exam cleared Bachelor’s/Diploma in Nursing with 2-3 years of experience Benefits: Competitive salary Apply Now! Send your CV to cv@meridiantradelinks.com OR hr@meridiantradelinks.com Job Type: Full-time Schedule: Day shift Work Location: In person
Posted 1 week ago
10.0 years
0 Lacs
Kerala
On-site
Job Role: Data Architect Experience: 10+ years Notice period: Immediate to 15 days Location: Trivandrum / Kochi Introduction We are looking for candidates with 10+ years of experience in a data architect role. Responsibilities include: Design and implement scalable, secure, and cost-effective data architectures using GCP. Lead the design and development of data pipelines with BigQuery, Dataflow, and Cloud Storage. Architect and implement data lakes, data warehouses, and real-time data processing solutions on GCP. Ensure data architecture aligns with business goals, governance, and compliance requirements. Collaborate with stakeholders to define data strategy and roadmap. Design and deploy BigQuery solutions for optimized performance and cost efficiency. Build and maintain ETL/ELT pipelines for large-scale data processing. Leverage Cloud Pub/Sub, Dataflow, and Cloud Functions for real-time data integration. Implement best practices for data security, privacy, and compliance in cloud environments. Integrate machine learning workflows with data pipelines and analytics tools. Define data governance frameworks and manage data lineage. Lead data modeling efforts to ensure consistency, accuracy, and performance across systems. Optimize cloud infrastructure for scalability, performance, and reliability. Mentor junior team members and ensure adherence to architectural standards. Collaborate with DevOps teams to implement Infrastructure as Code (Terraform, Cloud Deployment Manager). Ensure high availability and disaster recovery solutions are built into data systems. Conduct technical reviews, audits, and performance tuning for data solutions. Design solutions for multi-region and multi-cloud data architecture. Stay updated on emerging technologies and trends in data engineering and GCP. Drive innovation in data architecture, recommending new tools and services on GCP. Certifications: Google Cloud Certification is preferred. Primary Skills: 7+ years of experience in data architecture, with at least 3 years in GCP environments. Expertise in BigQuery, Cloud Dataflow, Cloud Pub/Sub, Cloud Storage, and related GCP services. Strong experience in data warehousing, data lakes, and real-time data pipelines. Proficiency in SQL, Python, or other data processing languages. Experience with cloud security, data governance, and compliance frameworks. Strong problem-solving skills and ability to architect solutions for complex data environments. Google Cloud Certification (Professional Data Engineer, Professional Cloud Architect) preferred. Leadership experience and ability to mentor technical teams. Excellent communication and collaboration skills.
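For the real-time integration responsibilities above (Cloud Pub/Sub feeding Dataflow and BigQuery), here is a small, hypothetical Pub/Sub publishing sketch; the project, topic, and payload fields are assumptions rather than details from the role.

import json

from google.cloud import pubsub_v1

# Project and topic names are placeholders.
publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("my-gcp-project", "order-events")

event = {"order_id": "A-1001", "status": "SHIPPED", "ts": "2024-05-01T10:15:00Z"}
future = publisher.publish(topic_path, json.dumps(event).encode("utf-8"), source="oms")
print("Published message id:", future.result())

Downstream, a Dataflow streaming job or a push subscription would typically land these events in BigQuery.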
Posted 1 week ago
2.0 years
0 Lacs
Thiruvananthapuram
On-site
What you’ll do: As a Senior Developer at Equifax, you will oversee and steer the delivery of innovative batch and data products, primarily leveraging Java and the Google Cloud Platform (GCP). Your expertise will ensure the efficient and timely deployment of high-quality data solutions that support our business objectives and client needs. Project Development: Development and deployment of batch and data products, ensuring alignment with business goals and technical requirements. Technical Oversight: Provide technical direction and oversee the implementation of solutions using Java and GCP, ensuring best practices in coding, performance, and security. Team Management: Mentor and guide junior developers and engineers, fostering a collaborative and high-performance environment. Stakeholder Collaboration: Work closely with cross-functional teams including product managers, business analysts, and other stakeholders to gather requirements and translate them into technical solutions. Documentation and Compliance: Maintain comprehensive documentation for all technical processes, ensuring compliance with internal and external standards. Continuous Improvement: Advocate for and implement continuous improvement practices within the team, staying abreast of emerging technologies and methodologies. What experience you need: Education: Bachelor's degree in Computer Science, Information Technology, or a related field. Experience: Minimum of 2-5 years of relevant experience in software development, with a focus on batch processing and data solutions. Technical Skills: Proficiency in Java, with a strong understanding of its ecosystems. 2+ years of relevant Java, Spring, Spring Boot, REST, Microservices, Hibernate, JPA, RDBMS. Minimum 2 years with Git, CI/CD Pipelines, Jenkins. Experience with Google Cloud Platform, including BigQuery, Cloud Storage, Dataflow, Bigtable, and other GCP tools. Familiarity with ETL processes, data modeling, and SQL. Soft Skills: Strong problem-solving abilities and a proactive approach to project management. Effective communication and interpersonal skills, with the ability to convey technical concepts to non-technical stakeholders. What could set you apart: Certification in Google Cloud (e.g., Associate Cloud Engineer). Experience with other cloud platforms (AWS, Azure) is a plus. Understanding of data privacy regulations and compliance frameworks.
Posted 1 week ago
8.0 years
0 Lacs
Bhubaneswar, Odisha, India
Remote
Experience: 8+ years Salary: Confidential (based on experience) Shift: (GMT+05:30) Asia/Kolkata (IST) Opportunity Type: Remote Placement Type: Full-time Permanent Position (*Note: This is a requirement for one of Uplers' clients, Forbes Advisor) What do you need for this opportunity? Must-have skills required: Terraform, Cloud Composer, Dataproc, Dataflow, AWS, GCP, BigQuery, SRE, GKE, GCP certification Forbes Advisor is looking for: Senior GCP Cloud Administrator Forbes Advisor is a new initiative for consumers under the Forbes Marketplace umbrella that provides journalist- and expert-written insights, news and reviews on all things personal finance, health, business, and everyday life decisions. We do this by providing consumers with the knowledge and research they need to make informed decisions they can feel confident in, so they can get back to doing the things they care about most. We are looking for an experienced GCP Administrator to join our team. The ideal candidate will have strong hands-on experience with IAM administration, multi-account management, BigQuery administration, performance optimization, monitoring, and cost management within Google Cloud Platform (GCP). Responsibilities: Manages and configures roles/permissions in GCP IAM following the principle of least-privilege access Manages the BigQuery service by optimizing slot assignments and SQL queries, adopting FinOps practices for cost control, troubleshooting and resolving critical data queries, etc. Collaborates with teams such as Data Engineering, Data Warehousing, Cloud Platform Engineering, and SRE for efficient data management and operational practices in GCP Creates automations and monitoring mechanisms for GCP data-related services, processes, and tasks Works with development teams to design the GCP-specific cloud architecture Provisioning and de-provisioning GCP accounts and resources for internal projects Managing and operating multiple GCP subscriptions Keep technical documentation up to date Proactively stay up to date on GCP announcements, services, and developments Requirements: Must have 5+ years of work experience in provisioning, operating, and maintaining systems in GCP Must have a valid certification of either GCP Associate Cloud Engineer or GCP Professional Cloud Architect Must have hands-on experience with GCP services such as Identity and Access Management (IAM), BigQuery, Google Kubernetes Engine (GKE), etc. Must be able to provide support and guidance on GCP operations and services depending upon enterprise needs Must have a working knowledge of Docker containers and Kubernetes Must have strong communication skills and the ability to work both independently and in a collaborative environment Fast learner, achiever, sets high personal goals Must be able to work on multiple projects and consistently meet project deadlines Must be willing to work on a shift basis based on project requirements Good to Have: Experience with Terraform automation for GCP infrastructure provisioning Experience in Cloud Composer, Dataproc, Dataflow, Storage, and Monitoring services Experience in building and supporting any form of data pipeline Multi-cloud experience with AWS New Relic monitoring Perks: Day off on the 3rd Friday of every month (one long weekend each month) Monthly Wellness Reimbursement Program to promote health and well-being Paid paternity and maternity leaves How to apply for this opportunity? Step 1: Click On Apply! And Register or Login on our portal.
Step 2: Complete the Screening Form & Upload updated Resume Step 3: Increase your chances to get shortlisted & meet the client for the Interview! About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their career. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal. Depending on the assessments you clear, you can apply for them as well). So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you! Show more Show less
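The IAM and BigQuery administration duties in the posting above can be illustrated with a short, hypothetical example: granting one analyst read-only access at the dataset level instead of a broad project-wide role, using the google-cloud-bigquery client. Project, dataset, and email values are placeholders.

from google.cloud import bigquery

client = bigquery.Client(project="my-gcp-project")  # placeholder project id
dataset = client.get_dataset("my-gcp-project.analytics")

# Append a dataset-level READER grant rather than assigning a project-wide role.
entries = list(dataset.access_entries)
entries.append(
    bigquery.AccessEntry(
        role="READER",
        entity_type="userByEmail",
        entity_id="analyst@example.com",
    )
)
dataset.access_entries = entries
client.update_dataset(dataset, ["access_entries"])
print("Granted dataset-level READER to analyst@example.com")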
Posted 1 week ago
The dataflow job market in India is currently experiencing a surge in demand for skilled professionals. With the increasing reliance on data-driven decision-making in various industries, the need for individuals proficient in managing and analyzing dataflow is on the rise. This article aims to provide job seekers with valuable insights into the dataflow job landscape in India.
Major tech hubs, including the cities that appear throughout the postings above, are known for their thriving tech ecosystems and are home to numerous companies actively hiring for dataflow roles.
The average salary range for dataflow professionals in India varies based on experience levels. Entry-level positions can expect to earn between INR 4-6 lakhs per annum, while experienced professionals can command salaries upwards of INR 12-15 lakhs per annum.
In the dataflow domain, a typical career path may involve starting as a Junior Data Analyst or Data Engineer, progressing to roles such as Senior Data Scientist or Data Architect, and eventually reaching positions like Tech Lead or Data Science Manager.
In addition to expertise in dataflow tools and technologies, dataflow professionals are often expected to have proficiency in programming languages such as Python or R, knowledge of databases like SQL, and familiarity with data visualization tools like Tableau or Power BI.
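As a concrete, if simplified, example of the Python and SQL fluency mentioned above, the snippet below expresses the same aggregation in pandas and in SQL; the data is invented purely for illustration.

import pandas as pd

# Toy data standing in for an orders table.
orders = pd.DataFrame({
    "city": ["Chennai", "Hyderabad", "Chennai", "Bengaluru"],
    "amount": [1200, 800, 450, 2300],
})

# pandas version of the aggregation
by_city = orders.groupby("city", as_index=False)["amount"].sum()
print(by_city)

# Equivalent SQL, the kind of query interviewers often ask candidates to write.
EQUIVALENT_SQL = """
SELECT city, SUM(amount) AS amount
FROM orders
GROUP BY city
"""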
As you navigate the dataflow job market in India, remember to showcase your skills and experiences confidently during interviews. Stay updated with the latest trends in dataflow and continuously upskill to stand out in a competitive job market. Best of luck in your job search journey!