8 - 13 years
14 - 24 Lacs
Chennai
Hybrid
Greetings from Getronics! We have permanent opportunities for GCP Data Engineers in Chennai. Hope you are doing well! This is Abirami from the Getronics Talent Acquisition team. We have multiple opportunities for Senior GCP Data Engineers for our automotive client at the Chennai Sholinganallur location. Please find the company profile and job description below. If interested, please share your updated resume, a recent professional photograph, and Aadhaar proof at the earliest to abirami.rsk@getronics.com.
Company: Getronics (permanent role)
Client: Automobile industry
Experience Required: 8+ years in IT and a minimum of 4+ years in GCP Data Engineering
Location: Chennai (Elcot - Sholinganallur)
Work Mode: Hybrid
Position Description: We are currently seeking a seasoned GCP Cloud Data Engineer with 4+ years of experience leading and implementing GCP data projects, preferably including a complete data-centric model. This position is to design and deploy a data-centric architecture in GCP for a Materials Management platform that would exchange data with multiple applications, modern and legacy, across Product Development, Manufacturing, Finance, Purchasing, N-tier Supply Chain, and Supplier Collaboration. Design and implement data-centric solutions on Google Cloud Platform (GCP) using various GCP tools like Storage Transfer Service, Cloud Data Fusion, Pub/Sub, Dataflow, Cloud compression, Cloud Scheduler, gsutil, FTP/SFTP, Dataproc, Bigtable, etc.
• Build ETL pipelines to ingest data from heterogeneous sources into our system
• Develop data processing pipelines using programming languages like Java and Python to extract, transform, and load (ETL) data
• Create and maintain data models, ensuring efficient storage, retrieval, and analysis of large datasets
• Deploy and manage databases, both SQL and NoSQL, such as Bigtable, Firestore, or Cloud SQL, based on project requirements
• Collaborate with cross-functional teams to understand data requirements and design scalable solutions that meet business needs
• Implement security measures and data governance policies to ensure the integrity and confidentiality of data
• Optimize data workflows for performance, reliability, and cost-effectiveness on the GCP infrastructure
Skills Required:
- GCP Data Engineer, Hadoop, Spark/PySpark, Google Cloud Platform services: BigQuery, Dataflow, Pub/Sub, Bigtable, Data Fusion, Dataproc, Cloud Composer, Cloud SQL, Compute Engine, Cloud Functions, and App Engine
- 8+ years of professional experience in data engineering, data product development, and software product launches
- 4+ years of cloud data engineering experience building scalable, reliable, and cost-effective production batch and streaming data pipelines using: data warehouses like Google BigQuery; workflow orchestration tools like Airflow; relational database management systems like MySQL, PostgreSQL, and SQL Server; real-time data streaming platforms like Apache Kafka and GCP Pub/Sub
Education Required: Any Bachelor's degree
LOOKING FOR IMMEDIATE TO 30 DAYS NOTICE CANDIDATES ONLY.
Regards,
Abirami
Getronics Recruitment team
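The ETL responsibilities above can be sketched in miniature. The pipeline below is an illustrative pure-Python extract-transform-load pass; the field names and filter rule are hypothetical, and a production version would run on Dataflow/Apache Beam with BigQuery as the sink rather than in-memory lists:

```python
import csv
import io

# Hypothetical raw feed: a CSV export from a legacy source system.
RAW = """part_id,plant,qty
P-100,chennai,5
P-101,chennai,
P-102,pune,12
"""

def extract(text):
    """Parse the raw CSV into dicts (the 'extract' step)."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Drop rows with missing quantities and normalize types/casing."""
    out = []
    for r in rows:
        if not r["qty"]:
            continue  # skip incomplete records instead of failing the load
        out.append({"part_id": r["part_id"],
                    "plant": r["plant"].upper(),
                    "qty": int(r["qty"])})
    return out

def load(rows, sink):
    """Append cleaned rows to a sink (a list here; BigQuery in production)."""
    sink.extend(rows)
    return len(rows)

warehouse = []
loaded = load(transform(extract(RAW)), warehouse)
print(loaded, warehouse[0]["plant"])  # 2 CHENNAI
```

The same three-stage shape carries over when each stage becomes a Beam transform in a Dataflow job.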
Posted 1 month ago
3 - 6 years
4 - 8 Lacs
Bengaluru
Work from Office
Location: IN - Bangalore | Posted Today | End date to apply: May 22, 2025 (5 days left to apply) | Job requisition ID: R140300
Company Overview
A.P. Moller - Maersk is an integrated container logistics company and member of the A.P. Moller Group, connecting and simplifying trade to help our customers grow and thrive. With a dedicated team of over 95,000 employees operating in 130 countries, we go all the way to enable global trade for a growing world. From the farm to your refrigerator, or the factory to your wardrobe, A.P. Moller - Maersk is developing solutions that meet customer needs from one end of the supply chain to the other.
About the Team
At Maersk, the Global Ocean Manifest team is at the heart of global trade compliance and automation. We build intelligent, high-scale systems that seamlessly integrate customs regulations across 100+ countries, ensuring smooth cross-border movement of cargo by ocean, rail, and other transport modes. Our mission is to digitally transform customs documentation, reducing friction, optimizing workflows, and automating compliance for a complex web of regulatory bodies, ports, and customs authorities. We deal with real-time data ingestion, document generation, regulatory rule engines, and multi-format data exchange while ensuring resilience and security at scale.
Key Responsibilities
Work with large, complex datasets and ensure efficient data processing and transformation. Collaborate with cross-functional teams to gather and understand data requirements. Ensure data quality, integrity, and security across all processes. Implement data validation, lineage, and governance strategies to ensure data accuracy and reliability. Build, optimize, and maintain ETL pipelines for structured and unstructured data, ensuring high throughput, low latency, and cost efficiency. Experience in building scalable, distributed data pipelines for processing real-time and historical data.
Contribute to the architecture and design of data systems and solutions. Write and optimize SQL queries for data extraction, transformation, and loading (ETL). Advise Product Owners to identify and manage risks, debt, issues, and opportunities for technical improvement. Provide continuous improvement suggestions for internal code frameworks, best practices, and guidelines. Contribute to engineering innovations that fuel Maersk's vision and mission.
Required Skills & Qualifications
4+ years of experience in data engineering or a related field. Strong problem-solving and analytical skills. Experience with Java and the Spring framework. Experience in building data processing pipelines using Apache Flink and Spark. Experience in distributed data lake environments (Dremio, Databricks, Google BigQuery, etc.). Experience with Apache Kafka and Kafka Streams. Experience working with databases, PostgreSQL preferred, with solid experience in writing and optimizing SQL queries. Hands-on experience in cloud environments such as Azure Cloud (preferred), AWS, Google Cloud, etc. Experience with data warehousing and ETL processes. Experience in designing and integrating data APIs (REST/GraphQL) for real-time and batch processing. Knowledge of Great Expectations, Apache Atlas, or DataHub would be a plus. Knowledge of RBAC, encryption, and GDPR compliance would be a plus.
Business skills
Excellent communication and collaboration skills. Ability to translate between technical language and business language, and communicate to different target groups. Ability to understand complex designs. Ability to balance competing forces and opinions within the development team.
Personal profile
Fact-based and result-oriented. Ability to work independently and guide the team. Excellent verbal and written communication.
Maersk is committed to a diverse and inclusive workplace, and we embrace different styles of thinking.
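The Flink/Kafka-style streaming work described above usually comes down to keyed, windowed aggregation over event time. The sketch below imitates a one-minute tumbling-window count in pure Python; the event names and timestamps are made up, and a real job would use Flink's or Kafka Streams' windowing operators instead:

```python
from collections import defaultdict

def tumbling_window_counts(events, window_ms=60_000):
    """Group (key, timestamp_ms) events into fixed, non-overlapping
    windows and count events per key per window, the way a tumbling
    window in Flink or Kafka Streams would."""
    counts = defaultdict(int)
    for key, ts in events:
        window_start = (ts // window_ms) * window_ms  # floor to window boundary
        counts[(key, window_start)] += 1
    return dict(counts)

# Hypothetical manifest events: (transport mode, event time in ms).
events = [("ocean", 10_000), ("ocean", 59_000),
          ("rail", 61_000), ("ocean", 120_500)]
print(tumbling_window_counts(events))
```

The window-assignment line is the whole trick: flooring the timestamp to a multiple of the window size gives every event a deterministic window key, which is what lets the aggregation be distributed by key.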
Maersk is an equal opportunities employer and welcomes applicants without regard to race, colour, gender, sex, age, religion, creed, national origin, ancestry, citizenship, marital status, sexual orientation, physical or mental disability, medical condition, pregnancy or parental leave, veteran status, gender identity, genetic information, or any other characteristic protected by applicable law. We will consider qualified applicants with criminal histories in a manner consistent with all legal requirements. We are happy to support your need for any adjustments during the application and hiring process. If you need special assistance or an accommodation to use our website, apply for a position, or to perform a job, please contact us by email.
Posted 1 month ago
10 - 18 years
25 - 35 Lacs
Hyderabad
Work from Office
Roles and Responsibilities
• 10+ years of relevant work experience, including previous experience leading data-related projects in the field of reporting and analytics
• Design, build, and maintain scalable data lakes and data warehouses in the cloud (GCP)
• Expertise in gathering business requirements, analyzing business needs, and defining the BI/DW architecture to support and help deliver technical solutions to complex business and technical requirements
• Creating solution prototypes and participating in technology selection; performing POCs and technical presentations
• Architect, develop, and test scalable data warehouse and data pipeline architectures in cloud technologies (GCP)
• Experience in SQL and NoSQL DBMSs like MS SQL Server, MySQL, PostgreSQL, DynamoDB, Cassandra, and MongoDB
• Design and develop scalable ETL processes, including error handling
• Expert in query and programming languages: MS SQL Server, T-SQL, PostgreSQL, MySQL, Python, R
• Preparing data structures for advanced analytics and self-service reporting using MS SQL, SSIS, SSRS
• Write scripts for stored procedures, database snapshot backups, and data archiving
• Experience with any of these cloud-based technologies:
o Power BI/Tableau, Azure Data Factory, Azure Synapse, Azure Data Lake
o AWS Redshift, Glue, Athena, AWS QuickSight
o Google Cloud Platform
Good to have:
• Agile development environment pairing DevOps with CI/CD pipelines
• AI/ML background
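As an illustration of the "scalable ETL processes, including error handling" bullet, the sketch below loads staged rows into a warehouse table inside a single transaction, diverting bad records to a reject table instead of failing the whole batch. Table and column names are hypothetical, and sqlite3 stands in for the cloud warehouse:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE sales (order_id INTEGER PRIMARY KEY, amount REAL NOT NULL);
CREATE TABLE rejects (order_id TEXT, amount TEXT, reason TEXT);
""")

# Staged batch: a NULL amount and a duplicate key are deliberately bad.
staged = [(1, 100.0), (2, None), (3, 25.5), (1, 99.0)]

with conn:  # one transaction: commit on success, roll back on a crash
    for order_id, amount in staged:
        try:
            conn.execute("INSERT INTO sales VALUES (?, ?)", (order_id, amount))
        except sqlite3.IntegrityError as exc:
            # Constraint violations go to the reject table for later triage.
            conn.execute("INSERT INTO rejects VALUES (?, ?, ?)",
                         (str(order_id), str(amount), str(exc)))

loaded = conn.execute("SELECT COUNT(*) FROM sales").fetchone()[0]
rejected = conn.execute("SELECT COUNT(*) FROM rejects").fetchone()[0]
print(loaded, rejected)  # 2 2
```

The reject-table pattern keeps the pipeline rerunnable: bad records are preserved with a reason rather than silently dropped or allowed to abort the load.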
Posted 1 month ago
2 - 5 years
5 - 9 Lacs
Hyderabad
Work from Office
Wipro Limited (NYSE: WIT, BSE: 507685, NSE: WIPRO) is a leading technology services and consulting company focused on building innovative solutions that address clients' most complex digital transformation needs. Leveraging our holistic portfolio of capabilities in consulting, design, engineering, and operations, we help clients realize their boldest ambitions and build future-ready, sustainable businesses. With over 230,000 employees and business partners across 65 countries, we deliver on the promise of helping our customers, colleagues, and communities thrive in an ever-changing world. For additional information, visit us at www.wipro.com. Reinvent your world. We are building a modern Wipro. We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention. Of yourself, your career, and your skills. We want to see the constant evolution of our business and our industry. It has always been in our DNA - as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention. Come to Wipro. Realize your ambitions. Applications from people with disabilities are explicitly welcome.
Posted 1 month ago
5 - 8 years
10 - 14 Lacs
Chennai
Work from Office
Wipro Limited (NYSE: WIT, BSE: 507685, NSE: WIPRO) is a leading technology services and consulting company focused on building innovative solutions that address clients' most complex digital transformation needs. Leveraging our holistic portfolio of capabilities in consulting, design, engineering, and operations, we help clients realize their boldest ambitions and build future-ready, sustainable businesses. With over 230,000 employees and business partners across 65 countries, we deliver on the promise of helping our customers, colleagues, and communities thrive in an ever-changing world. For additional information, visit us at www.wipro.com.
About The Role
Role Purpose: The purpose of this role is to provide solutions and bridge the gap between technology and business know-how to deliver any client solution.
Please find the JD below:
Experience: 5-8 years
Good understanding of DWH
GCP (Google Cloud Platform) BigQuery knowledge
Knowledge of GCP Storage, GCP Workflows and Functions
Python
CDC extractor tools (like Qlik/NiFi)
BI knowledge (like Power BI or Looker)
2. Skill upgradation and competency building: Clear Wipro exams and internal certifications from time to time to upgrade skills. Attend trainings and seminars to sharpen knowledge in the functional/technical domain. Write papers, articles, and case studies and publish them on the intranet.
Deliver
No. | Performance Parameter | Measure
1. | Contribution to customer projects | Quality, SLA, ETA, no. of tickets resolved, problems solved, no. of change requests implemented, zero customer escalations, CSAT
2. | Automation | Process optimization, reduction in process steps, reduction in no. of tickets raised
3. | Skill upgradation | No. of trainings & certifications completed, no. of papers and articles written in a quarter
Mandatory Skills: Cloud-PaaS-GCP-Google Cloud Platform. Experience: 5-8 years.
Reinvent your world. We are building a modern Wipro. We are an end-to-end digital transformation partner with the boldest ambitions.
To realize them, we need people inspired by reinvention. Of yourself, your career, and your skills. We want to see the constant evolution of our business and our industry. It has always been in our DNA - as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention. Come to Wipro. Realize your ambitions. Applications from people with disabilities are explicitly welcome.
Posted 1 month ago
10 - 15 years
10 - 20 Lacs
Mumbai, Gurugram, Bengaluru
Work from Office
Job Title: Data Scientist and Analytics, Level 7 - Manager (Ind & Func AI Decision Science Manager, S&C)
Management Level: 07 - Manager
Location: Bangalore/Gurgaon/Hyderabad/Mumbai
Must-have skills: Technical (Python, SQL, ML and AI); Functional (Data Science and B2B analytics, preferably in the Telco and S&P industries)
Good-to-have skills: Gen AI, Agentic AI, cloud (AWS/Azure, GCP)
Job Summary:
About Global Network Data & AI: The Accenture Strategy & Consulting Global Network - Data & AI practice helps our clients grow their business in entirely new ways. Analytics enables our clients to achieve high performance through insights from data - insights that inform better decisions and strengthen customer relationships. From strategy to execution, Accenture works with organizations to develop analytic capabilities - from accessing and reporting on data to predictive modelling - to outperform the competition.
About the Comms & Media practice: The Accenture Center for Data and Insights (CDI) team helps businesses integrate data and AI into their operations to drive innovation and business growth by designing and implementing data strategies, generating actionable insights from data, and enabling clients to make informed decisions. In CDI, we leverage AI (predictive + generative), analytics, and automation to build innovative and practical solutions, tools, and capabilities. The team is also working on building and socializing a Marketplace to democratize data and AI solutions within Accenture and for clients. Globally, the CDI practice works across industries to develop value growth strategies for its clients and infuse AI & GenAI to help deliver their top business imperatives, i.e., revenue growth and cost reduction. From multi-year Data & AI transformation projects to shorter, more agile engagements, we have a rapidly expanding portfolio of hyper-growth clients and an increasing footprint with next-gen solutions and industry practices.
Roles & Responsibilities: Experienced in analytics in the B2B domain. Responsible for helping clients design and deliver AI/ML solutions. Should be strong in the Telco and S&P domains and AI fundamentals, with good hands-on experience in the following: ability to work with large data sets and present conclusions to key stakeholders; data management using SQL; data manipulation and aggregation using Python; propensity modeling using various ML algorithms; text mining using NLP/AI techniques. Propose solutions to the client, based on gap analysis of the existing Telco platforms, that can generate long-term and sustainable value for the client. Gather business requirements from client stakeholders via interactions like interviews and workshops with all stakeholders. Track down and read all previous information on the problem or issue in question; explore obvious and known avenues thoroughly; ask a series of probing questions to get to the root of a problem. Understand the as-is process, identify issues with the processes that can be resolved either through Data & AI or process solutions, and design the detailed to-be state. Understand customer needs and identify/translate them into business requirements (business requirement definition), business process flows, and functional requirements, and be able to inform the best approach to the problem. Adopt a clear and systematic approach to complex issues (i.e., A leads to B leads to C). Analyze relationships between several parts of a problem or situation. Anticipate obstacles and identify a critical path for a project. Independently deliver products and services that empower clients to implement effective solutions. Make specific changes and improvements to processes or own work to achieve more. Work with other team members and make deliberate efforts to keep others up to date.
Establish a consistent and collaborative presence with clients and act as the primary point of contact for assigned clients; escalate, track, and solve client issues. Partner with clients to understand end clients' business goals, marketing objectives, and competitive constraints. Storytelling: crunch the data and numbers to craft a story to be presented to senior client stakeholders.
Professional & Technical Skills: Overall 10+ years of experience in data science. B.Tech in Engineering from a Tier 1 school or MSc in Statistics/Data Science from a Tier 1/Tier 2 institution. Demonstrated experience in solving real-world data problems through Data & AI. Direct onsite experience (i.e., experience facing clients inside client offices in India or abroad) is mandatory; please note we are looking for client-facing roles. Proficiency with data mining, mathematics, and statistical analysis. Advanced pattern recognition and predictive modeling experience; knowledge of advanced analytical fields such as text mining, image recognition, video analytics, IoT, etc. Execution-level understanding of econometric/statistical modeling packages. Traditional techniques like linear/logistic regression, multivariate statistical analysis, time series techniques, and fixed/random effect modelling. Machine learning techniques like Random Forest, Gradient Boosting, XGBoost, decision trees, clustering, etc. Knowledge of deep learning modeling techniques like RNNs, CNNs, etc. Experience using digital and statistical modeling software: Python (must), R, PySpark, SQL (must), BigQuery, Vertex AI. Proficient in Excel, MS Word, PowerPoint, and corporate soft skills. Knowledge of dashboard creation platforms: Excel, Tableau, Power BI, etc. Excellent written and oral communication skills with the ability to clearly communicate ideas and results to non-technical stakeholders.
Strong analytical and problem-solving skills and good communication skills. Self-starter with the ability to work independently across multiple projects and set priorities. Strong team player. Proactive and solution-oriented, able to guide junior team members. Execution knowledge of optimization techniques is a good-to-have: exact optimization (linear and non-linear optimization techniques); evolutionary optimization (both population- and search-based algorithms). Cloud platform certification and experience in computer vision are good-to-haves.
Qualifications
Experience: B.Tech in Engineering from a Tier 1 school or MSc in Statistics/Data Science from a Tier 1/Tier 2 institution
Educational Qualification: B.Tech or MSc in Statistics and Data Science
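The propensity modeling this role lists usually reduces to scoring each account with a probability of conversion. The sketch below fits a tiny logistic-regression-style scorer by plain gradient descent in pure Python; the features and data are invented for illustration, and in practice one would reach for scikit-learn or XGBoost:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_logistic(X, y, lr=0.5, epochs=2000):
    """Stochastic gradient descent on log loss; returns (weights, bias)."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            p = sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b)
            err = p - yi  # gradient of log loss w.r.t. the logit
            w = [wj - lr * err * xj for wj, xj in zip(w, xi)]
            b -= lr * err
    return w, b

# Toy B2B features: (normalized spend, recent-engagement flag) -> converted?
X = [(0.1, 0), (0.9, 1), (0.8, 1), (0.2, 0)]
y = [0, 1, 1, 0]
w, b = fit_logistic(X, y)

# Score a new account: high spend + recent engagement -> high propensity.
score = sigmoid(sum(wj * xj for wj, xj in zip(w, (0.85, 1))) + b)
print(round(score, 2))  # close to 1.0
```

The output of such a scorer is what feeds the "story" for stakeholders: ranked accounts by conversion propensity rather than raw model coefficients.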
Posted 1 month ago
7 - 11 years
6 - 10 Lacs
Mumbai
Work from Office
Skill required: Procure to Pay Processing - Invoice Processing Operations
Designation: Management Level - Team Lead/Consultant
Job Location: Mumbai
Qualifications: Any Graduation
Years of Experience: 7 to 11 years
What would you do? The incumbent should be an expert in the accounts payable lifecycle and will be responsible for the following. Must be flexible in working hours, UK/US (EST hours in US shift if required). Managing a team of 30-35 FTEs for the end-to-end process. Efficiently delivering the service for the end-to-end PTP process, which includes invoice processing, payments, AP helpdesk, AP account reconciliation, vendor statement reconciliation, and T&E. The role is also expected to perform smooth transitions for PTP sub-processes. He/she must have independently managed the accounts payable process for an international client and worked in a BPO organization in prior assignments for at least 7-8 years out of 10-12 years. The Procure to Pay Processing team helps clients and organizations by boosting vendor compliance, cutting savings erosion, improving discount capture using preferred suppliers, and confirming pricing and terms prior to payment. The team is responsible for accounting of goods and services, through requisitioning, purchasing, and receiving; they also look after the order sequence of procurement and the financial process end to end. In Invoice Processing Operations, you will ensure efficient and accurate processing of expense invoices/claims in adherence with client policy and procedures. You will be working on audit claims in accordance with client policies and procedures. You will save/post invoices in the ERP, verify WHT, and resolve VAT/WHT discrepancies. You will also be required to post invoices for payment and work on the PO process, non-PO invoices, credit notes, 2-way and 3-way match, email management, and ERP knowledge. What are we looking for?
Adaptable and flexible. Ability to perform under pressure. Problem-solving skills. Detail orientation. Ability to establish strong client relationships. Minimum 10-12 years of AP experience in BPO, out of which a minimum of 7-8 years in lead roles in different capacities. Minimum Bachelor's degree in Finance, Accounting, or a related field. Advanced knowledge of AP concepts and applications. Strong understanding of AP metrics and SLAs and the factors that influence them. Systems & applications: experience working in SAP/Oracle ERP would be an added advantage. Intermediate knowledge of MS Office tools (Excel/Word/PPT) is a must; advanced Excel knowledge would be an added advantage. Ability to run/support automation/RPA/process improvement initiatives parallel to the core job. Ability to interact with client finance leads and understand the business and process. Excellent communication skills, both oral and written, as the role needs to interact with client leadership; should be able to articulate things clearly. Good understanding of risks and issues, with the thought process to anticipate potential risks in a process and set mitigation plans/controls to eliminate or minimize them.
Roles and Responsibilities
The Role: The incumbent should be an expert in the accounts payable lifecycle and will be responsible for the following. Must be flexible in working hours, UK/US (EST hours in US shift if required). Managing a team of 30-35 FTEs for the end-to-end process. Efficiently delivering the service for the end-to-end PTP process, which includes invoice processing, payments, AP helpdesk, AP account reconciliation, vendor statement reconciliation, and T&E. The role is also expected to perform smooth transitions for PTP sub-processes.
He/she must have independently managed the accounts payable process for an international client and worked in a BPO organization in prior assignments for at least 7-8 years out of 10-12 years.
Functional Responsibilities: Complete understanding of the accounts payable life cycle, with in-depth knowledge of processing all categories of invoices (PO, non-PO, OTP invoices, utility invoices, statutory payments, payments, vendor master, AP helpdesk). Should be an expert in managing all PTP sub-processes. Should have experience handling international clients in a BPM organization. Must possess great interpersonal skills, have experience speaking to client leads, and hold regular governance. Manage AP teams and processes in accordance with documented procedures and policies. Participate in the weekly and monthly governance calls and manage the status call. Lead the resolution of complex or sensitive issues from client, senior management, or vendor queries on a timely basis. Track the progress of knowledge transfer and transition, and proactively work to fix any deviations. Monitor process and operational KPIs to ensure effective delivery against targets and benchmarks. Manage and oversee control procedures and practices to ensure no significant SOX control deficiencies in the AP delivery sub-function. Drive controls and compliance in the process and ensure 100% noiseless operations. Identify and support AP improvement initiatives to drive operational efficiencies and improved controls. Manage required and appropriate reporting to facilitate informed decision making (e.g., aging, forecasted payables). Support regional leadership through business partnering by providing metrics, problem resolution, and reporting on process performance. Maintain files and documentation thoroughly and accurately, in accordance with company policy.
People Management Responsibilities: Supervise and manage a PTP team with multiple sub-processes, approximately 30-35 team members, ensuring communication and coordination across teams. Work closely with team leads and SMEs to drive the business transformation.
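The 2-way/3-way match this posting mentions is a mechanical check that an invoice agrees with its purchase order (2-way) and additionally with the goods receipt (3-way). A minimal sketch, with hypothetical document fields and tolerance:

```python
def three_way_match(po, receipt, invoice, tol=0.01):
    """Return (ok, reasons). The invoice must not bill more than was
    received or ordered, and its unit price must match the PO within a
    small tolerance. A 2-way match is the same check minus the receipt."""
    reasons = []
    if invoice["qty"] > receipt["qty"]:
        reasons.append("billed qty exceeds received qty")
    if invoice["qty"] > po["qty"]:
        reasons.append("billed qty exceeds ordered qty")
    if abs(invoice["unit_price"] - po["unit_price"]) > tol:
        reasons.append("unit price differs from PO")
    return (not reasons, reasons)

po = {"qty": 100, "unit_price": 9.50}
receipt = {"qty": 98}

ok, why = three_way_match(po, receipt, {"qty": 98, "unit_price": 9.50})
print(ok)       # True: invoice matches PO price and received quantity

ok, why = three_way_match(po, receipt, {"qty": 100, "unit_price": 9.75})
print(ok, why)  # False: over-billed vs receipt and price mismatch
```

Invoices that fail the match are held (not posted for payment) until the discrepancy is resolved, which is exactly the helpdesk/discrepancy-resolution work described above.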
Posted 1 month ago
5 - 10 years
7 - 11 Lacs
Mumbai, Hyderabad
Work from Office
EDS Specialist - NAV02KL
Company: Worley
Primary Location: IND-MM-Navi Mumbai
Job: Engineering Design Systems (EDS)
Schedule: Full-time
Employment Type: Employee
Job Level: Experienced
Job Posting: Apr 7, 2025
Unposting Date: May 30, 2025
Reporting Manager Title: Manager
We deliver the world's most complex projects. Work as part of a collaborative and inclusive team. Enjoy a varied and challenging role. Building on our past. Ready for the future.
Worley is a global professional services company of energy, chemicals and resources experts headquartered in Australia. Right now, we're bridging two worlds as we accelerate to more sustainable energy sources, while helping our customers provide the energy, chemicals and resources that society needs now. We partner with our customers to deliver projects and create value over the life of their portfolio of assets. We solve complex problems by finding integrated data-centric solutions from the first stages of consulting and engineering to installation and commissioning, to the last stages of decommissioning and remediation. Join us and help drive innovation and sustainability in our projects.
The Role
As an EDS Specialist with Worley, you will work closely with our existing team to deliver projects for our clients while continuing to develop your skills and experience.
Duties and responsibilities
The AVEVA Engineering Senior Administrator is responsible for project set-up, maintenance, and support of the system. The Senior Administrator shall ensure the set-up, configuration, and deliverables are in line with organization/project/client standards. Gain a full understanding of the scope, overall schedule, deliverables, milestones, and coordination procedure. Understanding, documenting, and managing the functional requirements (business scope) for an AVEVA Engineering implementation. Performing AVEVA Engineering support tasks. Performing project implementations including configurations, reports, and gateway.
Suggesting how to improve AVEVA Engineering or optimize the implementation. Providing advanced support and troubleshooting. Continually seeking opportunities to increase end-user satisfaction. Promoting the use of AVEVA Engineering and the value it brings to projects within the organization.
Qualifications: Bachelor's degree in Engineering with at least 10 years of experience. 5+ years of relevant experience in AVEVA Engineering. 5+ years of relevant experience in AVEVA PDMS/E3D administration. In-depth working knowledge of configuration and management of AVEVA Engineering, including project administration. Fully proficient with the management of Dabacon databases. Knowledge of engineering workflow in an EPC environment. Strong analytical and problem-solving skills. Ability to work in a fast-paced environment. Effective oral and written communication skills required. Experience with setting up integration between AVEVA Engineering and other AVEVA and Hexagon design applications. Good understanding of the engineering data flow between various engineering applications will be a plus. Proficient in PML programming.
Good to have: Knowledge of writing PML1/2/.NET and C# programs, and Visual Basic .NET. Previous experience of AVEVA NET. Previous experience of AVEVA CAT/SPEC, ERM.
Moving forward together
We're committed to building a diverse, inclusive and respectful workplace where everyone feels they belong, can bring themselves, and are heard. We provide equal employment opportunities to all qualified applicants and employees without regard to age, race, creed, color, religion, sex, national origin, ancestry, disability status, veteran status, sexual orientation, gender identity or expression, genetic information, marital status, citizenship status or any other basis as protected by law. We want our people to be energized and empowered to drive sustainable impact. So, our focus is on a values-inspired culture that unlocks brilliance through belonging, connection and innovation.
And we're not just talking about it; we're doing it. We're reskilling our people, leveraging transferable skills, and supporting the transition of our workforce to become experts in today's low-carbon energy infrastructure and technology. Whatever your ambition, there's a path for you here. And there's no barrier to your potential career success. Join us to broaden your horizons, explore diverse opportunities, and be part of delivering sustainable change.
Posted 1 month ago
3 - 5 years
9 - 13 Lacs
Bengaluru
Work from Office
About The Role Role Purpose The purpose of this role is to design, test and maintain software programs for operating systems or applications which needs to be deployed at a client end and ensure its meet 100% quality assurance parameters ? Do 1. Instrumental in understanding the requirements and design of the product/ software Develop software solutions by studying information needs, studying systems flow, data usage and work processes Investigating problem areas followed by the software development life cycle Facilitate root cause analysis of the system issues and problem statement Identify ideas to improve system performance and impact availability Analyze client requirements and convert requirements to feasible design Collaborate with functional teams or systems analysts who carry out the detailed investigation into software requirements Conferring with project managers to obtain information on software capabilities ? 2. Perform coding and ensure optimal software/ module development Determine operational feasibility by evaluating analysis, problem definition, requirements, software development and proposed software Develop and automate processes for software validation by setting up and designing test cases/scenarios/usage cases, and executing these cases Modifying software to fix errors, adapt it to new hardware, improve its performance, or upgrade interfaces. 
Analyze information to recommend and plan the installation of new systems or modifications of existing systems. Ensure that code is error-free, with no bugs or test failures. Prepare reports on programming project specifications, activities and status. Ensure all issues are raised as per the norms defined for the project/program/account, with clear descriptions and replication patterns. Compile timely, comprehensive and accurate documentation and reports as requested. Coordinate with the team on daily project status and progress, and document it. Provide feedback on usability and serviceability, trace results to quality risk and report them to the concerned stakeholders. 3. Status reporting and customer focus on an ongoing basis with respect to the project and its execution. Capture all requirements and clarifications from the client for better-quality work. Take feedback on a regular basis to ensure smooth and on-time delivery. Participate in continuing education and training to remain current on best practices, learn new programming languages, and better assist other team members. Consult with engineering staff to evaluate software-hardware interfaces and develop specifications and performance requirements. Document and demonstrate solutions by developing documentation, flowcharts, layouts, diagrams, charts, code comments and clear code. Document all necessary details and reports formally for proper understanding of the software, from client proposal to implementation. Ensure good quality of interaction with the customer w.r.t. e-mail content, fault report tracking, voice calls, business etiquette etc. Respond to customer requests in a timely manner, with no instances of complaints either internally or externally. Deliver: 1. Continuous Integration, Deployment & Monitoring of Software - 100% error-free onboarding & implementation, throughput %, adherence to the schedule/release plan. 2. Quality & CSAT - On-time delivery, manage software, troubleshoot queries, customer experience, completion of assigned certifications for skill upgradation. 3. MIS & Reporting - 100% on-time MIS & report generation. Mandatory Skills: Google BigQuery. Experience: 3-5 years. Reinvent your world. We are building a modern Wipro. We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention. Of yourself, your career, and your skills. We want to see the constant evolution of our business and our industry. It has always been in our DNA - as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention. Come to Wipro. Realize your ambitions. Applications from people with disabilities are explicitly welcome.
Posted 1 month ago
3 - 6 years
4 - 8 Lacs
Bengaluru
Work from Office
About The Role Data engineers are responsible for building reliable and scalable data infrastructure that enables organizations to derive meaningful insights, make data-driven decisions, and unlock the value of their data assets. About The Role - Grade Specific The primary focus is to help organizations design, develop, and optimize their data infrastructure and systems. They help organizations enhance data processes and leverage data effectively to drive business outcomes. Skills (competencies): Ab Initio, Agile (Software Development Framework), Airflow, Apache Hadoop, AWS, AWS Athena, AWS Code Pipeline, AWS EFS, AWS EMR, AWS Redshift, AWS S3, Azure ADLS Gen2, Azure Data Factory, Azure Data Lake Storage, Azure Databricks, Azure Event Hub, Azure Stream Analytics, Azure Synapse, Bitbucket, CentOS, Change Management, Client Centricity, Collaboration, Continuous Integration and Continuous Delivery (CI/CD), Data Architecture Patterns, Data Format Analysis, Data Governance, Data Modeling, Data Validation, Data Vault Modeling, Database Schema Design, Decision-Making, DevOps, Dimensional Modeling, GCP BigQuery, GCP BigTable, GCP Cloud Storage, GCP DataFlow, GCP DataProc, Git, Greenplum, HQL, IBM DataStage, IBM DB2, Industry Standard Data Modeling (FSLDM), Industry Standard Data Modeling (IBM FSDM), Influencing, Informatica IICS, Inmon methodology, JavaScript, Jenkins, Kimball, Linux - Red Hat, Negotiation, Netezza, NewSQL, Oracle Exadata, Performance Tuning, Perl, Platform Update Management, Project Management, PySpark, Python, R, RDD Optimization, SAS, Scala, Shell Script, Snowflake, Spark, Spark Code Optimization, SQL, Stakeholder Management, Sun Solaris, Synapse, Talend, Teradata, Time Management, Ubuntu, Vendor Management.
Posted 1 month ago
- 2 years
2 - 4 Lacs
Gurugram
Remote
What does the team do? The Ad Operations team is responsible for setting up, managing, analysing and optimising digital advertising campaigns. They ensure ads are delivered correctly, track performance, and troubleshoot issues to maximise campaign effectiveness and revenue. What You'll Do? Data Management: Gather, organize, and maintain data related to advertising campaigns and their revenue, ensuring accuracy and consistency. Querying the Database: Use SQL/BigQuery to run queries on ShareChat's analytical engine. Scripting: Write scalable scripts to fetch or modify data from API endpoints. Collaborate with data teams to ensure proper integration and flow of data between different systems and platforms. Reporting and Insights: Create reports and dashboards to visualize key performance metrics. Generate regular and ad-hoc reports that provide insights into monthly/quarterly/annual revenue, campaign performance and key metrics. Communicate findings and insights to cross-functional teams, including AdOps, sales and management, to drive data-informed decision-making. Ad Strategy: Work with the Strategy team, using data insights, to develop go-to-market strategies for key clients. Monitor ad inventory levels and work with Strategy teams to ensure ad space is efficiently utilized. Assist in forecasting future ad inventory needs based on historical data. Identify opportunities for process improvements and automation in ad operations workflows. Contribute to the development and implementation of best practices and standard operating procedures for ad operations. Salesforce Administration, Integration and Automation: Configure, customize and maintain the Salesforce CRM system to meet the specific needs of the advertising team. Create and manage custom objects, fields, and workflows to support advertising operations. Integrate Salesforce with other advertising and marketing tools and platforms for seamless data flow.
Automate routine tasks and processes to improve efficiency and reduce manual work. Who are you? A BS in Mathematics, Economics, Computer Science, Information Management or Statistics is preferred. Proven working experience as a data analyst or business data analyst. Strong knowledge of and experience with SQL/BigQuery and Excel. Strong analytical skills with the ability to collect, organize, analyze, and disseminate significant amounts of information with attention to detail and accuracy. Experience with Salesforce would be an advantage.
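The scripting duty above - pulling campaign data from paginated API endpoints - can be sketched in Python. This is an illustrative sketch only: `fetch_all_rows`, the page/row structure, and the fake endpoint are hypothetical names invented for the example, not any actual partner API.

```python
from typing import Callable, Dict, Iterator, List

def fetch_all_rows(fetch_page: Callable[[int], Dict], page_size: int = 100) -> Iterator[Dict]:
    """Yield rows from a paginated endpoint until a short (or empty) page signals the end."""
    page = 0
    while True:
        payload = fetch_page(page)
        page_rows = payload.get("rows", [])
        for row in page_rows:
            yield row
        if len(page_rows) < page_size:  # last page reached
            break
        page += 1

# Simulated endpoint: 250 campaign records served 100 per page (hypothetical data).
DATA: List[Dict] = [{"campaign_id": i, "revenue": i * 1.5} for i in range(250)]

def fake_endpoint(page: int, size: int = 100) -> Dict:
    return {"rows": DATA[page * size:(page + 1) * size]}

rows = list(fetch_all_rows(fake_endpoint))
print(len(rows))  # 250
```

In a real script, `fetch_page` would wrap an HTTP call with authentication and retry/rate-limit handling; injecting it as a callable keeps the pagination logic testable without network access.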
Posted 1 month ago
3 - 6 years
10 - 16 Lacs
Hyderabad
Hybrid
Requirements: Must have worked in a QA role for ETL/data transformation. Minimum 3+ years of QA experience. Strong SQL skills. Proficiency in using BigQuery functions and operators for data comparison, aggregation, and validation across different stages of the transformation. Strong analytical skills to understand complex requirements and perform in-depth data validation. Able to work independently to create artifacts such as the test strategy, test plan, etc. Good understanding of data mapping and data requirements. Inclined to do rigorous and repeatable testing, with enthusiasm for finding bugs. Willing to do manual SQL QA work with complex queries. Intuitive and able to work independently. Strong communication skills are a must-have. Experience with, and a desire to work in, a global delivery environment. Location: Hyderabad. Shift: 1:00 PM to 10:00 PM. Notice Period: Immediate to 15 days.
Posted 1 month ago
5 - 10 years
30 - 45 Lacs
Bengaluru
Work from Office
Company Overview Lifesight is a fast-growing SaaS company focused on helping businesses leverage data & AI to improve customer acquisition and retention. We have a team of 130 serving 300+ customers across 5 offices in the US, Singapore, India, Australia and the UK. Our mission is to make it easy for non-technical marketers to leverage advanced data activation and marketing measurement tools that are powered by AI, to improve their performance and achieve their KPIs. Our product is being adopted rapidly globally and we need the best people onboard the team to accelerate our growth. Position Overview The ideal candidate is a self-motivated, self-managed multi-tasker and a demonstrated team player. You will be a lead developer responsible for the development of new software products, as well as working to improve numerous non-functional requirements of those products. If you're looking to be part of a dynamic, highly analytical team and an opportunity to hone your Java, cloud engineering and distributed-systems skills, look no further. As our Senior Software Engineer for the platform team, you will be handed the reins to build the core microservices. Along with building services to query billions of rows in Google's BigQuery, you will be in charge of building scalable APIs to build user segments, and evolving the architecture to send millions of notifications per day through varied streams like email, SMS and in-app notifications. What you'll do: Be responsible for the overall development of the modules/services that you will be working on. Code, design, prototype, perform reviews and consult in the process of building highly scalable, reliable, and fault-tolerant systems.
As our senior software engineer, continuously refactor applications and architectures to maintain high quality levels. Stay abreast of the latest technologies in distributed systems and caching, and research new technologies and tools that enable building next-generation systems. Be an engineer who enjoys writing readable, concise, reusable, extensible code every day. Discuss and articulate requirements with product management, and scope and execute the feature roadmap. Participate in the team's hiring process by being a panelist in interviews. Requirements What you'll need: Ideally 7+ years of hands-on experience in designing, developing, testing, and deploying large-scale applications and microservices in any language or stack (preferably Java, Spring Boot). Good knowledge in one or more of these areas: cloud, NoSQL stores; we use Google Cloud, Kubernetes, BigQuery and messaging systems. Excellent attitude and passion for working in a team, with a willingness to learn. Experience in building low-latency, high-volume REST API request handling. Experience working with distributed caches like Redis. Ability to Get Stuff Done! Bonus points if you have: Experience in containerization technologies like Docker and Kubernetes. Experience working in any cloud platform (preferably GCP). Experience with NoSQL stores (like Cassandra, ClickHouse, BigQuery). Benefits What is in it for the candidate: As a team, we are concerned with not only the growth of the company, but each other's personal growth and well-being too. Along with our desire to utilize smart technology and innovative engineering strategies to make people's lives easier, our team also bonds over our shared love for all kinds of tea, movies and fun-filled Friday events, while prioritizing a healthy work-life balance. 1. Working for one of the fastest-growing and most successful MarTech companies of our time. 2.
Opportunity to be an early member of the core team building a product from scratch - from making tech stack choices to driving and influencing the way we simplify building complex products. 3. Enjoy working in small teams and a non-bureaucratic environment. 4. Enjoy an environment that provides high levels of empowerment and space to achieve your objectives and grow with the organization. 5. Work in a highly profitable and growing organization, with opportunities to accelerate and shape your career. 6. Great benefits - apart from competitive compensation & benefits. 7. Above all - a “fun” working environment.
Posted 1 month ago
4 - 9 years
15 - 30 Lacs
Pune, Gurugram, Bengaluru
Hybrid
5+ years of experience in software development using C#, MSSQL and GCP/BigQuery. Python experience is good to have. Contribute to the design and development of innovative software solutions that meet business requirements. Develop and maintain applications using the specified technologies. Participate in code reviews to ensure high-quality code and adherence to best practices. Strong problem-solving skills and attention to detail. Excellent communication and teamwork abilities. Experience in code reviews and maintaining code quality. Ability to mentor and guide junior developers. Bachelor's degree in Computer Science, Engineering, or a related field.
Posted 1 month ago
3 - 8 years
12 - 15 Lacs
Mumbai
Work from Office
Responsibilities: Develop and maintain data pipelines using GCP. Write and optimize queries in BigQuery. Utilize Python for data processing tasks. Manage and maintain SQL Server databases. Must-Have Skills: Experience with Google Cloud Platform (GCP). Proficiency in BigQuery query writing. Strong Python programming skills. Expertise in SQL Server. Good to Have: Knowledge of MLOps practices. Experience with Vertex AI. Background in data science. Familiarity with any data visualization tool.
Posted 1 month ago
2 - 4 years
3 - 8 Lacs
Kolkata
Remote
Data Quality Analyst Experience: 2-4 years. Salary: Competitive. Preferred Notice Period: Within 30 days. Shift: 10:00 AM to 7:00 PM IST. Opportunity Type: Remote. Placement Type: Permanent. (*Note: This is a requirement for one of Uplers' clients.) Must-have skills: Data Validation, BigQuery, SQL, Communication Skills. Good-to-have skills: Data Visualisation, PowerBI, Tableau. Forbes Advisor (one of Uplers' clients) is looking for a Data Quality Analyst who is passionate about their work, eager to learn and grow, and committed to delivering exceptional results. If you are a team player with a positive attitude and a desire to make a difference, then we want to hear from you. Role Overview Description Short-term objectives: We know the importance data validation can play in creating better reporting for our business - we have identified areas where we want you to make an impact within the first 3 months: Push 40% of partners through the ingestion validation process. Push 40% of partners through the mapping validation process. Data Team Culture Our team requires four areas of focus from every team member (see below). We use these focus areas to guide our decision-making and career growth.
To give you an idea of these requirements, the top three from each area are: Mastery: • Demonstrate skills expertise in a relevant tool (e.g., GA, Tableau) or code language (e.g., SQL) • Think about the wider impact & value of decisions • Understand and anticipate the need for scalability, stability, and security Communication: • Provide clear, actionable feedback from peer reviews • Communicate effectively to wider teams and stakeholders • Proactively share knowledge every day Ownership: • Lead complex initiatives that drive challenging goals • Create and push forward cross-cutting concerns between teams • Demonstrate consistently sound judgement Behaviours: • Challenge yourself and others through questioning, assessing business benefits, and understanding the cost of delay • Own your workload and decisions - show leadership to others • Innovate to find new solutions, or improve existing ways of working - push yourself to learn every day Responsibilities: Reports directly to the Senior Business Analyst and works closely with the Data & Revenue Operations functions to support key deliverables. Reconciliation of affiliate network revenue by vertical and publisher brand at the monthly level. Where discrepancies exist, investigate to isolate whether specific days, products, providers, or commission values are responsible. Validate new tickets going onto the Data Engineering JIRA board to ensure requests going into Data Engineering are complete, accurate and as descriptive as possible. Investigation results to be updated in JIRA tickets and all outputs saved in the mapping Google sheet. Use Postman API and webhooks to pull revenue data from partner portals and verify against partner portals and BQ. Monitor API failures, rate limits, and response inconsistencies impacting revenue ingestion.
As necessary, seek revenue clarifications from the vertical's RevOps team member. As necessary, clarify JIRA commentary for data engineers. Understand requirements, goals and priorities, and communicate progress towards data goals to stakeholders. Ensure outputs are on time and on target. Required competencies: At least two (2) years of data quality analysis experience. A strong understanding of SQL and how it can be used to validate data (experience with BigQuery is a plus). An understanding of large relational databases and how to navigate these datasets to find the data required. Ability to communicate data to non-technical audiences through the use of reports and visualisations. Strong interpersonal and communication skills. Comfortable working remotely and collaboratively with teammates across multiple geographies and time zones. Perks: Day off on the 3rd Friday of every month (one long weekend each month). Monthly Wellness Reimbursement Program to promote health and well-being. Monthly Office Commutation Reimbursement Program. Paid paternity and maternity leaves. How to apply for this opportunity - Easy 3-Step Process: 1. Click on Apply and register or log in on our portal. 2. Upload your updated resume & complete the screening form. 3. Increase your chances of getting shortlisted & meeting the client for the interview! About Our Client: Forbes Advisor is a global platform dedicated to helping consumers make the best financial choices for their individual lives. About Uplers: Our goal is to make hiring and getting hired reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant product and engineering job opportunities and progress in their career. (Note: There are many more opportunities apart from this on the portal.) So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
Posted 1 month ago
7 - 10 years
8 - 14 Lacs
Patna
Work from Office
Role: Data Engineer. We are looking for a highly skilled and experienced Senior Data Engineer to join our dynamic team. The ideal candidate will have a strong background in data engineering, with specific expertise in Oracle to BigQuery data warehouse migration and modernization. This role requires proficiency in various data engineering tools and technologies, including BigQuery, DataProc, GCS, PySpark, Airflow, and the Hadoop ecosystem. Key Responsibilities: - Oracle to BigQuery Migration: Lead the migration and modernization of data warehouses from Oracle to BigQuery, ensuring seamless data transfer and integration. - Data Engineering: Utilize BigQuery, DataProc, GCS, PySpark, Airflow, and the Hadoop ecosystem to design, develop, and maintain scalable data pipelines and workflows. - Data Management: Ensure data integrity, accuracy, and consistency across various systems and platforms. - SQL Writing: Write and optimize complex SQL queries to extract, transform, and load data efficiently. - Collaboration: Work closely with cross-functional teams, including data scientists, analysts, and business stakeholders, to understand data requirements and deliver solutions that meet business needs. - Performance Optimization: Monitor and optimize data processing performance to ensure efficient and reliable data operations. Skills and Qualifications: - Proven experience as a Data Engineer or similar role. - Strong knowledge of Oracle to BigQuery data warehouse migration and modernization. - Proficiency in BigQuery, DataProc, GCS, PySpark, Airflow, and the Hadoop ecosystem. - In-depth knowledge of Oracle DB and PL/SQL. - Excellent SQL writing skills. - Strong analytical and problem-solving abilities. - Ability to work collaboratively with cross-functional teams. - Excellent communication and interpersonal skills.
Preferred Qualifications :- Experience with other data management tools and technologies.- Knowledge of cloud-based data solutions.- Certification in data engineering or related fields.
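One recurring step in the Oracle-to-BigQuery migrations described above is translating column types when regenerating table DDL. The sketch below is a deliberately simplified illustration, not a complete or authoritative mapping (real migrations must handle NUMBER precision/scale, time zones, LOBs, and more, and dedicated migration tooling automates this); the mapping table and helper names are assumptions for the example:

```python
# Illustrative (not exhaustive) Oracle -> BigQuery type mapping.
ORACLE_TO_BQ = {
    "VARCHAR2": "STRING",
    "CHAR": "STRING",
    "CLOB": "STRING",
    "NUMBER": "NUMERIC",
    "DATE": "DATETIME",
    "TIMESTAMP": "TIMESTAMP",
    "BLOB": "BYTES",
}

def translate_column(name: str, oracle_type: str) -> str:
    """Map a single Oracle column declaration to a BigQuery one."""
    base = oracle_type.split("(")[0].upper()    # NUMBER(10,2) -> NUMBER
    bq_type = ORACLE_TO_BQ.get(base, "STRING")  # conservatively default to STRING
    return f"{name} {bq_type}"

cols = [("order_id", "NUMBER(10)"), ("created_at", "DATE"), ("note", "VARCHAR2(200)")]
ddl = "CREATE TABLE orders (" + ", ".join(translate_column(n, t) for n, t in cols) + ")"
print(ddl)
# CREATE TABLE orders (order_id NUMERIC, created_at DATETIME, note STRING)
```

Generating DDL this way (rather than hand-editing each table) keeps large schema migrations repeatable and reviewable.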
Posted 1 month ago
6 - 11 years
10 - 18 Lacs
Noida, Indore
Work from Office
Role & responsibilities Job Description: We are looking for a GCP Data Engineer and SQL Programmer with good working experience in PostgreSQL and PL/SQL programming, and the following technical skills: PL/SQL and PostgreSQL programming - ability to write complex SQL queries and stored procedures. Migration - working experience migrating database structures and data from Oracle to PostgreSQL, preferably on GCP AlloyDB or Cloud SQL. Working experience on Cloud SQL/AlloyDB. Working experience tuning autovacuum in PostgreSQL. Working experience tuning AlloyDB/PostgreSQL for better performance. Working experience on BigQuery, Firestore, Memorystore, Spanner and bare-metal setup for PostgreSQL. Ability to tune the AlloyDB/Cloud SQL database for better performance. Experience with the GCP Database Migration Service. Working experience on MongoDB. Working experience on Cloud Dataflow. Working experience on database disaster recovery. Working experience on database job scheduling. Working experience on database logging techniques. Knowledge of OLTP and OLAP. Desirable: GCP Database Engineer Certification. Other Skills: Out-of-the-box thinking. Problem-solving skills. Ability to make tech choices (build vs. buy). Performance management (profiling, benchmarking, testing, fixing). Enterprise architecture. Project management/delivery capability/quality mindset. Scope management. Planning (phasing, critical path, risk identification). Schedule management/estimations. Leadership skills. Other soft skills: learning ability, innovation/initiative. Preferred candidate profile Roles & Responsibilities: Develop, construct, test, and maintain data architectures. Migrate enterprise Oracle databases from on-prem to GCP cloud. Tune autovacuum in PostgreSQL. Tune AlloyDB/PostgreSQL for better performance.
Performance tuning of PostgreSQL stored procedure code and queries. Converting Oracle stored procedures & queries to PostgreSQL stored procedures & queries. Creating a hybrid data store with data warehouse and NoSQL GCP solutions alongside PostgreSQL. Migrate Oracle table data from Oracle to AlloyDB. Leading the database team. Mandatory Skills: PostgreSQL, PL/SQL, BigQuery.
Posted 1 month ago
5 - 10 years
0 - 3 Lacs
Hyderabad
Hybrid
Job Profile We are seeking a Senior Data Engineer with proven expertise in designing and maintaining scalable, efficient, and reliable data pipelines. The ideal candidate should have strong proficiency in SQL, DBT, BigQuery, Python, and Airflow, along with a solid foundation in data warehousing principles. In this role, you will be instrumental in managing and optimizing data workflows, ensuring high data quality, and supporting data-driven decision-making across the organization. Experience with Oracle ERP systems and knowledge of data migration to a data warehouse environment will be considered a valuable advantage. Years of Experience: 5 to 10 years. Shift Timings: 1 PM to 10 PM IST. Skill Set • SQL: Advanced proficiency in writing optimized queries, working with complex joins, CTEs, window functions, etc. • DBT (Data Build Tool): Experience in modelling data with dbt, managing data transformations, and maintaining project structure. • Python: Proficient in writing data processing scripts and building Airflow DAGs using Python. • BigQuery: Strong experience with GCP's BigQuery, including dataset optimization, partitioning, and query cost management. • Apache Airflow: Experience building and managing DAGs, handling dependencies, scheduling jobs, and error handling. • Data Warehousing Concepts: Strong grasp of ETL/ELT, dimensional modelling (star/snowflake), fact/dimension tables, slowly changing dimensions, etc. • Version Control: Familiarity with Git/GitHub for code collaboration and deployment. • Cloud Platforms: Working knowledge of Google Cloud Platform (GCP). Job Description Roles & Responsibilities: Design, build, and maintain robust ETL/ELT data pipelines using Python, Airflow, and DBT. Develop and manage dbt models to enable efficient, reusable, and well-documented data transformations. Collaborate with stakeholders to gather data requirements and design data marts comprising fact and dimension tables in a well-structured star schema.
Manage and optimize data models and transformation logic in BigQuery, ensuring high performance and cost-efficiency. Implement and uphold robust data quality checks, logging, and alerting mechanisms to ensure reliable data delivery. Maintain the BigQuery data warehouse, including routine optimizations and updates. Enhance and support the data warehouse architecture, including the use of star/snowflake schemas, partitioning strategies, and data mart structures. Proactively monitor and troubleshoot production pipelines to minimize downtime and ensure data accuracy.
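The "complex joins, CTEs, window functions" proficiency the posting above calls for transfers directly to BigQuery Standard SQL. A small self-contained illustration of a CTE plus a window function (running total per region), using Python's built-in sqlite3 so it runs without a warehouse; the sales table and figures are invented:

```python
import sqlite3

con = sqlite3.connect(":memory:")  # SQLite >= 3.25 supports window functions
con.executescript("""
CREATE TABLE sales (region TEXT, month TEXT, amount REAL);
INSERT INTO sales VALUES
 ('east', '2024-01', 100), ('east', '2024-02', 150),
 ('west', '2024-01', 200), ('west', '2024-02', 120);
""")

# CTE + window function: running total per region.
# The same SELECT works unchanged in BigQuery Standard SQL.
result = con.execute("""
WITH ordered AS (
  SELECT region, month, amount FROM sales
)
SELECT region, month,
       SUM(amount) OVER (PARTITION BY region ORDER BY month) AS running_total
FROM ordered
ORDER BY region, month
""").fetchall()

for row in result:
    print(row)
# ('east', '2024-01', 100.0)
# ('east', '2024-02', 250.0)
# ('west', '2024-01', 200.0)
# ('west', '2024-02', 320.0)
```

`PARTITION BY` restarts the accumulation per region, which is exactly the behaviour used for cumulative revenue or month-over-month metrics in warehouse reporting.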
Posted 1 month ago
3 - 8 years
15 - 30 Lacs
Pune, Gurugram, Bengaluru
Hybrid
Salary: 15 to 30 LPA. Experience: 3 to 8 years. Location: Gurgaon/Bangalore/Pune/Chennai. Notice: Immediate to 30 days. Key Responsibilities & Skillsets: Common Skillsets: 3+ years of experience in analytics, SAS, PySpark, Python, Spark, SQL and associated data engineering jobs. Must have experience with managing and transforming big data sets using PySpark, Spark-Scala, NumPy and pandas. Excellent communication & presentation skills. Experience in managing Python code and collaborating with customers on model evolution. Good knowledge of database management and Hadoop/Spark, SQL, HIVE, Python (expertise). Superior analytical and problem-solving skills. Should be able to work on a problem independently and prepare client-ready deliverables with minimal or no supervision. Good communication skills for client interaction. Data Management Skillsets: Ability to understand data models and identify ETL optimization opportunities. Exposure to ETL tools is preferred. Should have a strong grasp of advanced SQL functionality (joins, nested queries, and procedures). Strong ability to translate functional specifications/requirements into technical requirements.
Posted 1 month ago
7 - 10 years
8 - 14 Lacs
Pune
Work from Office
Role: Data Engineer. We are looking for a highly skilled and experienced Senior Data Engineer to join our dynamic team. The ideal candidate will have a strong background in data engineering, with specific expertise in Oracle to BigQuery data warehouse migration and modernization. This role requires proficiency in various data engineering tools and technologies, including BigQuery, DataProc, GCS, PySpark, Airflow, and the Hadoop ecosystem. Key Responsibilities: - Oracle to BigQuery Migration: Lead the migration and modernization of data warehouses from Oracle to BigQuery, ensuring seamless data transfer and integration. - Data Engineering: Utilize BigQuery, DataProc, GCS, PySpark, Airflow, and the Hadoop ecosystem to design, develop, and maintain scalable data pipelines and workflows. - Data Management: Ensure data integrity, accuracy, and consistency across various systems and platforms. - SQL Writing: Write and optimize complex SQL queries to extract, transform, and load data efficiently. - Collaboration: Work closely with cross-functional teams, including data scientists, analysts, and business stakeholders, to understand data requirements and deliver solutions that meet business needs. - Performance Optimization: Monitor and optimize data processing performance to ensure efficient and reliable data operations. Skills and Qualifications: - Proven experience as a Data Engineer or similar role. - Strong knowledge of Oracle to BigQuery data warehouse migration and modernization. - Proficiency in BigQuery, DataProc, GCS, PySpark, Airflow, and the Hadoop ecosystem. - In-depth knowledge of Oracle DB and PL/SQL. - Excellent SQL writing skills. - Strong analytical and problem-solving abilities. - Ability to work collaboratively with cross-functional teams. - Excellent communication and interpersonal skills.
Preferred Qualifications :- Experience with other data management tools and technologies.- Knowledge of cloud-based data solutions.- Certification in data engineering or related fields.
Posted 1 month ago
7 - 10 years
8 - 14 Lacs
Lucknow
Work from Office
We are looking for a highly skilled and experienced Senior Data Engineer to join our dynamic team. The ideal candidate will have a strong background in data engineering, with specific expertise in Oracle to BigQuery data warehouse migration and modernization. This role requires proficiency in various data engineering tools and technologies, including BigQuery, DataProc, GCS, PySpark, Airflow, and the Hadoop ecosystem. Key Responsibilities : - Oracle to BigQuery Migration: Lead the migration and modernization of data warehouses from Oracle to BigQuery, ensuring seamless data transfer and integration. - Data Engineering: Utilize BigQuery, DataProc, GCS, PySpark, Airflow, and Hadoop ecosystem to design, develop, and maintain scalable data pipelines and workflows. - Data Management: Ensure data integrity, accuracy, and consistency across various systems and platforms. - SQL Writing: Write and optimize complex SQL queries to extract, transform, and load data efficiently. - Collaboration: Work closely with cross-functional teams, including data scientists, analysts, and business stakeholders, to understand data requirements and deliver solutions that meet business needs. - Performance Optimization: Monitor and optimize data processing performance to ensure efficient and reliable data operations. Skills and Qualifications : - Proven experience as a Data Engineer or similar role. - Strong knowledge of Oracle to BigQuery data warehouse migration and modernization. - Proficiency in BigQuery, DataProc, GCS, PySpark, Airflow, and the Hadoop ecosystem. - In-depth knowledge of Oracle DB and PL/SQL. - Excellent SQL writing skills. - Strong analytical and problem-solving abilities. - Ability to work collaboratively with cross-functional teams. - Excellent communication and interpersonal skills. Preferred Qualifications : - Experience with other data management tools and technologies. - Knowledge of cloud-based data solutions. - Certification in data engineering or related fields.
Posted 1 month ago
7 - 10 years
8 - 14 Lacs
Bengaluru
Work from Office
We are looking for a highly skilled and experienced Senior Data Engineer to join our dynamic team. The ideal candidate will have a strong background in data engineering, with specific expertise in Oracle to BigQuery data warehouse migration and modernization. This role requires proficiency in various data engineering tools and technologies, including BigQuery, DataProc, GCS, PySpark, Airflow, and the Hadoop ecosystem. Key Responsibilities : - Oracle to BigQuery Migration: Lead the migration and modernization of data warehouses from Oracle to BigQuery, ensuring seamless data transfer and integration. - Data Engineering: Utilize BigQuery, DataProc, GCS, PySpark, Airflow, and Hadoop ecosystem to design, develop, and maintain scalable data pipelines and workflows. - Data Management: Ensure data integrity, accuracy, and consistency across various systems and platforms. - SQL Writing: Write and optimize complex SQL queries to extract, transform, and load data efficiently. - Collaboration: Work closely with cross-functional teams, including data scientists, analysts, and business stakeholders, to understand data requirements and deliver solutions that meet business needs. - Performance Optimization: Monitor and optimize data processing performance to ensure efficient and reliable data operations. Skills and Qualifications : - Proven experience as a Data Engineer or similar role. - Strong knowledge of Oracle to BigQuery data warehouse migration and modernization. - Proficiency in BigQuery, DataProc, GCS, PySpark, Airflow, and the Hadoop ecosystem. - In-depth knowledge of Oracle DB and PL/SQL. - Excellent SQL writing skills. - Strong analytical and problem-solving abilities. - Ability to work collaboratively with cross-functional teams. - Excellent communication and interpersonal skills. Preferred Qualifications : - Experience with other data management tools and technologies. - Knowledge of cloud-based data solutions. - Certification in data engineering or related fields.
Posted 1 month ago
5 - 9 years
18 - 20 Lacs
Noida
Work from Office
Experience: 5-7 Years
Location: Noida
Position: Data Analyst
Technical Skills:
- Strong proficiency in Python (Pandas, NumPy, Matplotlib, Seaborn, etc.).
- Advanced SQL skills for querying large datasets.
- Experience with data visualization tools (Looker, etc.).
- Hands-on experience with data wrangling, cleansing, and transformation.
- Familiarity with ETL processes and working with structured/unstructured data.
Analytical & Business Skills:
- Strong problem-solving skills with the ability to interpret complex data.
- Business acumen to connect data insights with strategic decision-making.
- Excellent communication and presentation skills.
Preferred (Nice to Have):
- Knowledge of machine learning concepts (scikit-learn, TensorFlow, etc.).
- Exposure to cloud platforms (GCP) for data processing.
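The data wrangling and cleansing work listed above typically means normalizing messy records before aggregating them. Pandas would be the usual tool for this role; the same cleaning-then-grouping logic is sketched here with only the standard library, and the `region`/`revenue` field names are hypothetical:

```python
from statistics import mean

def clean_records(rows):
    """Cleansing step: drop unusable rows and coerce types.

    'region' and 'revenue' are hypothetical field names for illustration.
    """
    cleaned = []
    for row in rows:
        revenue = row.get("revenue")
        if revenue in (None, "", "NA"):
            continue  # skip rows with missing revenue
        cleaned.append({"region": row["region"].strip().title(),
                        "revenue": float(revenue)})
    return cleaned

def revenue_by_region(rows):
    """Aggregation step: mean revenue per normalized region."""
    groups = {}
    for row in clean_records(rows):
        groups.setdefault(row["region"], []).append(row["revenue"])
    return {region: mean(vals) for region, vals in groups.items()}
```

With pandas this collapses to a `dropna` plus `groupby("region")["revenue"].mean()`, but the shape of the work is the same.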
Posted 1 month ago
4 - 9 years
15 - 19 Lacs
Pune
Work from Office
About The Role:
Job Title: Technical-Specialist GCP Developer
Location: Pune, India
Role Description
This role is for an Engineer responsible for the design, development, and unit testing of software applications. The candidate is expected to ensure that good-quality, maintainable, scalable, and high-performing software applications are delivered to users in an Agile development environment. The candidate should come from a strong technological background, have good working experience in Spark and GCP technology, be hands-on, and be able to work independently with minimal technical/tool guidance. The candidate should be able to technically guide and mentor junior resources in the team. As a developer, you will bring extensive design and development skills to reinforce the group of developers within the team. The candidate will make extensive use of Continuous Integration tools and practices in the context of Deutsche Bank's digitalization journey.
What we'll offer you
As part of our flexible scheme, here are just some of the benefits that you'll enjoy:
- Best-in-class leave policy.
- Gender-neutral parental leave.
- 100% reimbursement under the childcare assistance benefit (gender neutral).
- Sponsorship for industry-relevant certifications and education.
- Employee Assistance Program for you and your family members.
- Comprehensive hospitalization insurance for you and your dependents.
- Accident and term life insurance.
- Complimentary health screening for those 35 yrs. and above.
Your key responsibilities
- Design and discuss your own solutions for addressing user stories and tasks.
- Develop, unit-test, integrate, deploy, maintain, and improve software.
- Perform peer code reviews.
- Actively participate in sprint activities and ceremonies, e.g., daily stand-up/scrum meetings, sprint planning, retrospectives, etc.
- Apply continuous integration best practices in general (SCM, build automation, unit testing, dependency management).
- Collaborate with other team members to achieve the sprint objectives.
- Report progress and update Agile team management tools (JIRA/Confluence).
- Manage individual task priorities and deliverables.
- Take responsibility for the quality of the solutions you provide.
- Contribute to planning and continuous improvement activities, and support the PO, ITAO, Developers, and Scrum Master.
Your skills and experience
- Engineer with good development experience on Google Cloud Platform for at least 4 years.
- Hands-on experience with BigQuery, Dataproc, Composer, Terraform, GKE, Cloud SQL, and Cloud Functions.
- Experience in the set-up, maintenance, and ongoing development of continuous build/integration infrastructure as part of DevOps; able to create and maintain fully automated CI build processes and write build and deployment scripts.
- Experience with development platforms: OpenShift/Kubernetes/Docker configuration and deployment with DevOps tools, e.g., Git, TeamCity, Maven, SONAR.
- Good knowledge of core SDLC processes and tools such as HP ALM, Jira, ServiceNow.
- Knowledge of working with APIs and microservices, integrating external and internal web services including SOAP, XML, REST, JSON.
- Strong analytical skills.
- Proficient communication skills; fluent in English (written/verbal).
- Ability to work in virtual teams and in matrixed organizations; an excellent team player.
- Open-minded and willing to learn the business and technology; keeps pace with technical innovation; understands the relevant business area.
- Ability to share information and transfer knowledge to team members.
How we'll support you
- Training and development to help you excel in your career.
- Coaching and support from experts in your team.
- A culture of continuous learning to aid progression.
- A range of flexible benefits that you can tailor to suit your needs.
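The Composer/Airflow experience asked for above is, at its core, about dependency-ordered task execution. Airflow itself isn't needed to illustrate the idea; a minimal sketch using the standard library's topological sorter follows, with a hypothetical extract/transform/load task graph:

```python
from graphlib import TopologicalSorter  # standard library, Python 3.9+

# Hypothetical DAG: each task maps to the set of tasks it depends on,
# mirroring the shape of a typical Composer/Airflow pipeline.
DEPENDENCIES = {
    "transform": {"extract"},
    "load": {"transform"},
    "notify": {"load"},
}

def run_order(deps):
    """Return one valid execution order for the task graph."""
    return list(TopologicalSorter(deps).static_order())
```

Here `run_order(DEPENDENCIES)` produces `["extract", "transform", "load", "notify"]`; Airflow's scheduler does the same ordering, plus retries, scheduling, and distributed execution.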
Posted 1 month ago