5.0 - 10.0 years
7 - 17 Lacs
Pune, Bengaluru
Hybrid
Project Role: Cloud Platform Engineer
Project Role Description: Designs, builds, tests, and deploys cloud application solutions that integrate cloud and non-cloud infrastructure. Deploys infrastructure and platform environments and creates proofs of architecture to test architecture viability, security, and performance.
Must-have skills: Google Cloud Compute Services, Google Kubernetes Engine, Google BigQuery, Linux Containers Administration, Terraform
Minimum experience: 5 year(s)
Educational Qualification: 15 years full-time education

Summary: As a Cloud Platform Engineer, you will be responsible for designing, building, testing, and deploying cloud application solutions that integrate cloud and non-cloud infrastructure. Your typical day will involve deploying infrastructure and platform environments and creating proofs of architecture to test architecture viability, security, and performance. You will collaborate with cross-functional teams to ensure successful implementation of cloud solutions.

Roles & Responsibilities:
- Act as an SME; collaborate with and manage the team to perform.
- Take responsibility for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Manage and maintain Google Kubernetes Engine and Google BigQuery.
- Administer Linux containers and ensure smooth operation.
- Design and implement cloud-based solutions using Google Cloud Compute Services.
- Collaborate with stakeholders to gather requirements and provide technical guidance.

Professional & Technical Skills:
- Must-have skills: proficiency in Google Cloud Compute Services, Google Kubernetes Engine, Google BigQuery, and Linux Containers Administration.
- Strong understanding of cloud architecture and infrastructure.
- Experience with containerization technologies such as Docker and Kubernetes.
- Knowledge of scripting languages like Python or Bash.
- Familiarity with CI/CD pipelines and automation tools like Jenkins or GitLab.
Posted 1 month ago
5.0 - 10.0 years
8 - 16 Lacs
Bhubaneswar, Bengaluru, Delhi / NCR
Work from Office
Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Apache Spark
Good-to-have skills: Oracle Procedural Language Extensions to SQL (PL/SQL)
Minimum experience: 5 year(s)
Educational Qualification: 15 years full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. You will be responsible for ensuring that applications are developed and implemented efficiently and effectively while meeting the needs of the organization. Your typical day will involve collaborating with the team, making team decisions, engaging with multiple teams, and providing solutions to problems for your immediate team and across multiple teams. You will also contribute to key decisions and provide expertise in application development.

Roles & Responsibilities:
- Act as an SME.
- Collaborate with and manage the team to perform.
- Take responsibility for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Design, build, and configure applications to meet business process and application requirements.
- Ensure that applications are developed and implemented efficiently and effectively.
- Contribute expertise in application development.

Professional & Technical Skills:
- Must-have skills: proficiency in Apache Spark.
- Good-to-have skills: experience with Oracle PL/SQL and Google BigQuery.
- Strong understanding of statistical analysis and machine learning algorithms.
- Experience with data visualization tools such as Tableau or Power BI.
- Hands-on experience implementing machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering.
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Apache Spark.
- This position is based at our Gurugram office.
- 15 years of full-time education is required.
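The data-munging skills named in this posting (cleaning, transformation, normalization) can be sketched in plain Python. This is an illustrative sketch only; the record fields and the min-max normalization choice are assumptions, not part of the posting:

```python
# Illustrative data-munging sketch: cleaning, transformation, normalization.
# The field names ("name", "salary") are hypothetical examples.

def clean(records):
    """Drop records with missing fields and strip stray whitespace."""
    out = []
    for r in records:
        if r.get("name") is None or r.get("salary") is None:
            continue  # cleaning: discard incomplete rows
        out.append({"name": r["name"].strip(), "salary": float(r["salary"])})
    return out

def normalize_salary(records):
    """Min-max normalize the salary field into [0, 1]."""
    lo = min(r["salary"] for r in records)
    hi = max(r["salary"] for r in records)
    span = (hi - lo) or 1.0  # avoid division by zero when all values match
    return [{**r, "salary_norm": (r["salary"] - lo) / span} for r in records]

raw = [
    {"name": " Asha ", "salary": "700000"},
    {"name": "Ravi", "salary": None},        # incomplete: removed by clean()
    {"name": "Meera", "salary": "1700000"},
]
tidy = normalize_salary(clean(raw))
```

In a Spark context the same steps would typically run as DataFrame transformations rather than pure-Python loops.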
Posted 1 month ago
3.0 - 6.0 years
6 - 16 Lacs
Noida, Mumbai (All Areas)
Hybrid
Project Role: Data Engineer
Project Role Description: Design, develop, and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform, and load) processes to migrate and deploy data across systems.
Must-have skills: Google BigQuery
Minimum experience: 3 year(s)
Educational Qualification: 15 years full-time education

Summary: As a Data Engineer, you will be responsible for designing, developing, and maintaining data solutions for data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across systems using Google BigQuery.

Roles & Responsibilities:
- Design, develop, and maintain data solutions for data generation, collection, and processing using Google BigQuery.
- Create data pipelines, ensure data quality, and implement ETL processes to migrate and deploy data across systems.
- Collaborate with cross-functional teams to understand data requirements and design solutions that meet business needs.
- Develop and maintain data models, data dictionaries, and other documentation to ensure data accuracy and consistency.
- Optimize data storage and retrieval processes to ensure efficient and effective use of resources.

Professional & Technical Skills:
- Must-have skills: experience with Google BigQuery.
- Good-to-have skills: experience with ETL tools such as Apache NiFi or Talend.
- Strong understanding of data modeling and database design principles.
- Experience with SQL and NoSQL databases.
- Experience with data warehousing and data integration technologies.
- Familiarity with cloud computing platforms such as AWS or Google Cloud Platform.
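The extract-transform-load cycle this posting describes can be sketched in plain Python. As a hedged illustration only, sqlite3 stands in for the target warehouse (the posting names Google BigQuery), and the source rows and table schema are invented for the example:

```python
import sqlite3

# Minimal ETL sketch. sqlite3 stands in for the destination warehouse
# (Google BigQuery in the posting); the "sales" schema is hypothetical.

def extract():
    """Extract: pull raw rows from a source system (hard-coded here)."""
    return [("2024-01-05", "1200.50"), ("2024-01-06", "980.00")]

def transform(rows):
    """Transform: parse amount strings into floats for loading."""
    return [(day, float(amount)) for day, amount in rows]

def load(conn, rows):
    """Load: write the cleaned rows into the destination table."""
    conn.execute("CREATE TABLE IF NOT EXISTS sales (day TEXT, amount REAL)")
    conn.executemany("INSERT INTO sales VALUES (?, ?)", rows)

conn = sqlite3.connect(":memory:")
load(conn, transform(extract()))
total = conn.execute("SELECT SUM(amount) FROM sales").fetchone()[0]
```

Against BigQuery, the load step would instead go through the BigQuery client's load or streaming-insert APIs, but the three-stage shape is the same.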
Posted 1 month ago
6.0 - 8.0 years
8 - 14 Lacs
Bengaluru
Work from Office
Must-have skills:
- 5+ years of strong expertise in Java and C++; C# is preferred.
- Good exposure to Spring Boot will be an added advantage.
- Good database experience, preferably in Oracle and Google BigQuery.
- Good coding knowledge in either Java or Node.js.
- Good knowledge of network operations, load balancers, and router traffic, and strong knowledge of object-oriented concepts and data structures.
- The candidate should be flexible to take calls during late hours and should also be available for important calls during weekends.

Responsibilities:
- Provides direct support to PayPal customer service agents for high-impact issues.
- Identifies, verifies, and documents irregularities in PayPal CS tools functionality, including posting appropriate bugs.
- Project-manages posted bugs when appropriate (i.e., follows up to make sure they are addressed).
- Generates and reviews in-depth technical information (sample code, white papers, FAQs, snippets) for distribution to agents.
- Presents issues and solutions to audiences of varying sizes.
- Provides recognized technical and business leadership and is able to provide deep technical support across a range of functionality.
- Manages workload and other assignments efficiently while resolving time-critical situations reliably and professionally.
- Monitors code rollouts (live) for issues affecting customer service agents.
- Participates in the development of tools, systems, and processes aimed at improving product supportability or overall support productivity.
- Works with Network Operations and SWAT to address site issues.
- Creates post-mortem and resolution documentation for issues.
- Mentors other engineers and developers by providing technical and business-related guidance and resources.
- Advises management and appropriate groups on support issues that impact customer satisfaction and provides recommendations for appropriate actions.
- Night and weekend support, on a rotating schedule, is required.
- Shares on-call responsibilities.
Posted 1 month ago
5.0 - 9.0 years
7 - 17 Lacs
Pune
Work from Office
Job Overview: Diacto is seeking an experienced and highly skilled Data Architect to lead the design and development of scalable and efficient data solutions. The ideal candidate will have strong expertise in Azure Databricks, Snowflake (with DBT, GitHub, and Airflow), and Google BigQuery. This is a full-time, on-site role based out of our Baner, Pune office.

Qualifications:
- B.E./B.Tech in Computer Science, IT, or a related discipline
- MCS/MCA or equivalent preferred

Key Responsibilities:
- Design, build, and optimize robust data architecture frameworks for large-scale enterprise solutions
- Architect and manage cloud-based data platforms using Azure Databricks, Snowflake, and BigQuery
- Define and implement best practices for data modeling, integration, governance, and security
- Collaborate with engineering and analytics teams to ensure data solutions meet business needs
- Lead development using tools such as DBT, Airflow, and GitHub for orchestration and version control
- Troubleshoot data issues and ensure system performance, reliability, and scalability
- Guide and mentor junior data engineers and developers

Experience and Skills Required:
- 5 to 12 years of experience in data architecture, engineering, or analytics roles
- Hands-on expertise in Databricks, especially Azure Databricks
- Proficiency in Snowflake, with working knowledge of DBT, Airflow, and GitHub
- Experience with Google BigQuery and cloud-native data processing workflows
- Strong knowledge of modern data architecture, data lakes, warehousing, and ETL pipelines
- Excellent problem-solving, communication, and analytical skills

Nice to Have:
- Certifications in Azure, Snowflake, or GCP
- Experience with containerization (Docker/Kubernetes)
- Exposure to real-time data streaming and event-driven architecture

Why Join Diacto Technologies?
- Collaborate with experienced data professionals and work on high-impact projects
- Exposure to a variety of industries and enterprise data ecosystems
- Competitive compensation, learning opportunities, and an innovation-driven culture
- Work from our collaborative office space in Baner, Pune

How to Apply:
Option 1 (Preferred): Copy and paste the following link into your browser and submit your application for the automated interview process: https://app.candidhr.ai/app/candidate/gAAAAABoRrTQoMsfqaoNwTxsE_qwWYcpcRyYJk7NzSUmO3LKb6rM-8FcU58CUPYQKc65n66feHor-TGdCEfyouj0NmKdgYcNbA==/
Option 2:
1. Visit our website's career section at https://www.diacto.com/careers/
2. Scroll down to the "Who are we looking for?" section
3. Find the listing for "Data Architect (Data Bricks)"
4. Proceed with the virtual interview by clicking "Apply Now."
Posted 1 month ago
5.0 - 8.0 years
3 - 7 Lacs
Chennai
Work from Office
Role Purpose: The purpose of this role is to interpret data and turn it into information (reports, dashboards, interactive visualizations, etc.) that can offer ways to improve a business, thus affecting business decisions.

Do:
1. Manage the technical scope of the project in line with the requirements at all stages:
   a. Gather information from various sources (data warehouses, databases, data integration and modelling) and interpret patterns and trends
   b. Develop record management processes and policies
   c. Build and maintain relationships at all levels within the client base and understand their requirements
   d. Provide sales data, proposals, data insights, and account reviews to the client base
   e. Identify areas to increase efficiency and automation of processes
   f. Set up and maintain automated data processes
   g. Identify, evaluate, and implement external services and tools to support data validation and cleansing
   h. Produce and track key performance indicators
2. Analyze data sets and provide adequate information:
   a. Liaise with internal and external clients to fully understand the data content
   b. Design and carry out surveys and analyze survey data as per customer requirements
   c. Analyze and interpret complex data sets relating to the customer's business and prepare reports for internal and external audiences using business analytics reporting tools
   d. Create data dashboards, graphs, and visualizations to showcase business performance and provide sector and competitor benchmarking
   e. Mine and analyze large datasets, draw valid inferences, and present them successfully to management using a reporting tool
   f. Develop predictive models and share insights with clients as per their requirements

Deliver:
No. | Performance Parameter | Measure
1 | Analyses data sets and provides relevant information to the client | No. of automations done, on-time delivery, CSAT score, zero customer escalations, data accuracy

Mandatory Skills: Google BigQuery.
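The "produce and track key performance indicators" duty above can be sketched in plain Python. The metric names and sample figures here are hypothetical assumptions for illustration, not taken from the posting:

```python
# Illustrative KPI-tracking sketch; the metric names and sample figures
# are hypothetical, not part of the role description.

def kpi_report(sales):
    """Compute simple key performance indicators from raw sales figures."""
    return {
        "target_attainment_pct": round(100 * sales["revenue"] / sales["target"], 1),
        "return_rate_pct": round(100 * sales["returns"] / sales["orders"], 1),
    }

month = {"revenue": 120000.0, "target": 150000.0, "orders": 400, "returns": 6}
report = kpi_report(month)
```

In practice such KPIs would usually be computed with a SQL aggregate over warehouse tables (e.g. in BigQuery) and surfaced on a dashboard, but the calculation itself is this simple.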
Posted 1 month ago
5.0 - 10.0 years
3 - 6 Lacs
Indore, Madhya Pradesh, India
On-site
Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with Integration Architects and Data Architects to ensure cohesive integration between systems and data models.

Summary: As a Data Platform Engineer, you will play a crucial role in assisting with the data platform blueprint and design. You will collaborate with Integration Architects and Data Architects to ensure seamless integration between systems and data models, contributing to the development of relevant data platform components.

Roles & Responsibilities:
- Expected to perform independently and become a Subject Matter Expert (SME).
- Collaborate with and manage the team to drive performance.
- Take responsibility for team decisions and engagement with multiple teams.
- Provide solutions to problems for immediate and cross-functional teams.
- Lead the data platform blueprint and design.
- Implement data platform components to support organizational needs.
- Ensure seamless integration between systems and data models.

Professional & Technical Skills:
- Must-have skills: proficiency in Google BigQuery.
- Good-to-have skills: experience with Google Bigtable.
- Strong understanding of data platform architecture.
- Experience in data integration and data modeling.
- Proficiency in SQL and database management.
- Knowledge of cloud data services.

Additional Information:
- A minimum of 7.5 years of experience in Google BigQuery is required.
- This position is based at our Indore office.
- 15 years of full-time education is required.
Posted 1 month ago
5.0 - 9.0 years
7 - 12 Lacs
Bengaluru
Work from Office
Responsibilities: A day in the life of an Infoscion: As part of the Infosys consulting team, your primary role would be to get to the heart of customer issues, diagnose problem areas, design innovative solutions, and facilitate deployment resulting in client delight. You will develop proposals by owning parts of the proposal document and by giving inputs on solution design based on your areas of expertise. You will plan the activities of configuration, configure the product as per the design, conduct conference room pilots, and assist in resolving any queries related to requirements and solution design. You will conduct solution/product demonstrations and POC/Proof of Technology workshops, and prepare effort estimates that suit the customer's budgetary requirements and are in line with the organization's financial guidelines. You will actively lead small projects and contribute to unit-level and organizational initiatives with the objective of providing high-quality, value-adding solutions to customers. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Technical and Professional Requirements:
- Technology -> Cloud Platform -> GCP Data Analytics -> Looker
- Technology -> Cloud Platform -> GCP Database -> Google BigQuery

Preferred Skills:
- Technology -> Cloud Platform -> Google Big Data -> GCP
- Technology -> Cloud Platform -> GCP Data Analytics -> Looker

Additional Responsibilities:
- Ability to develop value-creating strategies and models that enable clients to innovate, drive growth, and increase their business profitability
- Good knowledge of software configuration management systems
- Awareness of the latest technologies and industry trends
- Logical thinking and problem-solving skills, along with an ability to collaborate
- Understanding of the financial processes for various types of projects and the various pricing models available
- Ability to assess current processes, identify improvement areas, and suggest technology solutions
- Knowledge of one or two industry domains
- Client interfacing skills
- Project and team management

Educational Requirements: Bachelor of Engineering
Service Line: Data & Analytics Unit
* Location of posting is subject to business requirements
Posted 1 month ago
8.0 - 10.0 years
10 - 15 Lacs
Bengaluru
Work from Office
Role Purpose: The purpose of the role is to create exceptional architectural solution designs and thought leadership, and to enable delivery teams to provide exceptional client engagement and satisfaction.

Do:
1. Develop architectural solutions for new deals/major change requests in existing deals:
- Create an enterprise-wide architecture that ensures systems are scalable, reliable, and manageable.
- Provide solutioning for RFPs received from clients and ensure overall design assurance.
- Develop a direction to manage the portfolio of to-be solutions, including systems, shared infrastructure services, and applications, in order to better match business outcome objectives.
- Analyse the technology environment, enterprise specifics, and client requirements to set a collaboration solution design framework/architecture.
- Provide technical leadership for the design, development, and implementation of custom solutions through thoughtful use of modern technology.
- Define and understand current-state solutions, and identify improvements, options, and tradeoffs to define target-state solutions.
- Clearly articulate, document, and sell architectural targets, recommendations, and reusable patterns, and accordingly propose investment roadmaps.
- Evaluate and recommend solutions that integrate with the overall technology ecosystem.
- Work closely with various IT groups to transition tasks, ensure performance, and manage issues through to resolution.
- Perform detailed documentation (app view, multiple sections and views) of the architectural design and solution, covering all artefacts in detail.
- Validate the solution/prototype from technology, cost-structure, and customer-differentiation points of view.
- Identify problem areas, perform root cause analysis of architectural designs and solutions, and provide relevant solutions.
- Collaborate with sales, program/project, and consulting teams to reconcile solutions to architecture.
- Track industry and application trends and relate these to planning current and future IT needs.
- Provide technical and strategic input during the project planning phase in the form of technical architectural designs and recommendations.
- Collaborate with all relevant parties to review the objectives and constraints of solutions and determine conformance with the Enterprise Architecture.
- Identify implementation risks and potential impacts.
2. Enable delivery teams by providing optimal delivery solutions/frameworks:
- Build and maintain relationships with executives, technical leaders, product owners, peer architects, and other stakeholders to become a trusted advisor.
- Develop and establish relevant technical, business process, and overall support metrics (KPI/SLA) to drive results.
- Manage multiple projects and accurately report the status of all major assignments while adhering to all project management standards.
- Identify technical, process, and structural risks, and prepare a risk mitigation plan for all projects.
- Ensure quality assurance of all architecture and design decisions, and provide technical mitigation support to the delivery teams.
- Recommend tools for reuse and automation for improved productivity and reduced cycle times.
- Lead the development and maintenance of enterprise frameworks and related artefacts.
- Develop trust and build effective working relationships through respectful, collaborative engagement across individual product teams.
- Ensure architecture principles and standards are consistently applied to all projects.
- Ensure optimal client engagement: support the pre-sales team in presenting the entire solution design and its principles to the client; negotiate, manage, and coordinate with client teams to ensure all requirements are met and the proposed solution creates an impact; demonstrate thought leadership with strong technical capability in front of the client to win their confidence and act as a trusted advisor.
3. Competency building and branding:
- Ensure completion of necessary trainings and certifications.
- Develop proofs of concept (POCs), case studies, demos, etc. for new growth areas based on market and customer research.
- Develop and present Wipro's point of view on solution design and architecture by writing white papers, blogs, etc.
- Attain market referenceability and recognition through the highest analyst rankings, client testimonials, and partner credits.
- Be the voice of Wipro's thought leadership by speaking in forums (internal and external).
- Mentor developers, designers, and junior architects in the project for their further career development and enhancement.
- Contribute to the architecture practice by conducting selection interviews, etc.
4. Team management:
- Resourcing: anticipate new talent requirements as per market/industry trends or client requirements; hire adequate and right resources for the team.
- Talent management: ensure adequate onboarding and training for team members to enhance capability and effectiveness; build an internal talent pool and ensure career progression within the organization; manage team attrition; drive diversity in leadership positions.
- Performance management: set goals for the team, conduct timely performance reviews, and provide constructive feedback to direct reports; ensure that Performance Nxt is followed for the entire team.
- Employee satisfaction and engagement: lead and drive engagement initiatives for the team; track team satisfaction scores and identify initiatives to build engagement within the team.

Mandatory Skills: Google BigQuery. Experience: 8-10 years.
Posted 2 months ago
5.0 - 8.0 years
9 - 14 Lacs
Hyderabad
Work from Office
Role Purpose: The purpose of the role is to support process delivery by ensuring the daily performance of the Production Specialists, resolving technical escalations, and developing technical capability within the Production Specialists.

Do:
- Oversee and support the process by reviewing daily transactions on performance parameters.
- Review the performance dashboard and the scores for the team.
- Support the team in improving performance parameters by providing technical support and process guidance.
- Record, track, and document all queries received, the problem-solving steps taken, and the total successful and unsuccessful resolutions.
- Ensure standard processes and procedures are followed to resolve all client queries.
- Resolve client queries as per the SLAs defined in the contract.
- Develop an understanding of the process/product for the team members to facilitate better client interaction and troubleshooting.
- Document and analyze call logs to spot the most frequent trends and prevent future problems.
- Identify red flags and escalate serious client issues to the team leader in cases of untimely resolution.
- Ensure all product information and disclosures are given to clients before and after the call/email requests.
- Avoid legal challenges by monitoring compliance with service agreements.
- Handle technical escalations through effective diagnosis and troubleshooting of client queries.
- Manage and resolve technical roadblocks/escalations as per SLA and quality requirements; if unable to resolve an issue, escalate it to TA & SES in a timely manner.
- Provide product support and resolution to clients by performing question diagnosis while guiding users through step-by-step solutions.
- Troubleshoot all client queries in a user-friendly, courteous, and professional manner.
- Offer alternative solutions to clients (where appropriate) with the objective of retaining the customer's and client's business.
- Organize ideas and effectively communicate oral messages appropriate to listeners and situations.
- Follow up and make scheduled callbacks to customers to record feedback and ensure compliance with contract SLAs.
- Build people capability to ensure operational excellence and maintain superior customer service levels for the existing account/client.
- Mentor and guide Production Specialists on improving technical knowledge.
- Collate trainings to be conducted as triage to bridge the skill gaps identified through interviews with the Production Specialists.
- Develop and conduct trainings (triages) within products for Production Specialists as per target, and inform the client about the triages being conducted.
- Undertake product trainings to stay current with product features, changes, and updates; enroll in product-specific and other trainings per client requirements/recommendations.
- Identify and document the most common problems and recommend appropriate resolutions to the team.
- Update job knowledge by participating in self-learning opportunities and maintaining personal networks.

Deliver:
No. | Performance Parameter | Measure
1 | Process | No. of cases resolved per day, compliance to process and quality standards, meeting process-level SLAs, Pulse score, customer feedback, NSAT/ESAT
2 | Team Management | Productivity, efficiency, absenteeism
3 | Capability Development | Triages completed, Technical Test performance

Mandatory Skills: Google BigQuery. Experience: 5-8 years.
Posted 2 months ago
3.0 - 6.0 years
2 - 9 Lacs
Noida, Uttar Pradesh, India
On-site
Project Role: Application Designer
Project Role Description: Assist in defining requirements and designing applications to meet business process and application requirements.
Must-have skills: Google BigQuery
Good-to-have skills: Teradata BI, Oracle Procedural Language Extensions to SQL (PL/SQL)
Minimum experience: 3 year(s)
Educational Qualification: 15 years full-time education

Summary: As an Application Designer, you will assist in defining requirements and designing applications to meet business process and application requirements. You will play a crucial role in ensuring the successful implementation of applications. Your typical day will involve collaborating with cross-functional teams, analyzing business needs, and designing innovative solutions to enhance application functionality and user experience.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Participate actively and contribute in team discussions.
- Contribute to providing solutions to work-related problems.
- Collaborate with cross-functional teams to gather and analyze business requirements.
- Design and develop applications to meet business process and application requirements.
- Create technical specifications and documentation for application design and development.
- Perform code reviews and ensure adherence to coding standards.
- Troubleshoot and resolve application defects and issues.
- Stay updated with emerging technologies and industry trends to continuously improve application design and development processes.

Professional & Technical Skills:
- Must-have skills: proficiency in Google BigQuery.
- Good-to-have skills: experience with Teradata BI and Oracle PL/SQL.
- Strong understanding of data warehousing concepts and principles.
- Experience in designing and optimizing data models for efficient data retrieval and analysis.
- Hands-on experience with SQL and query optimization techniques.
- Familiarity with ETL processes and tools for data extraction, transformation, and loading.
- Knowledge of data governance and data quality best practices.
- Excellent problem-solving and analytical skills.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Google BigQuery.
- This position is based at our Mumbai office.
- 15 years of full-time education is required.
Posted 2 months ago
3.0 - 6.0 years
2 - 9 Lacs
Hyderabad / Secunderabad, Telangana, India
On-site
Project Role : Application Designer Project Role Description : Assist in defining requirements and designing applications to meet business process and application requirements. Must have skills : Google BigQuery Good to have skills : Teradata BI, Oracle Procedural Language Extensions to SQL (PLSQL) Minimum 3 year(s) of experience is required Educational Qualification : 15 years full time education Summary: As an Application Designer, you will assist in defining requirements and designing applications to meet business process and application requirements. You will play a crucial role in ensuring the successful implementation of applications. Your typical day will involve collaborating with cross-functional teams, analyzing business needs, and designing innovative solutions to enhance application functionality and user experience. Roles & Responsibilities: - Expected to perform independently and become an SME. - Required active participation/contribution in team discussions. - Contribute to providing solutions to work-related problems. - Collaborate with cross-functional teams to gather and analyze business requirements. - Design and develop applications to meet business process and application requirements. - Create technical specifications and documentation for application design and development. - Perform code reviews and ensure adherence to coding standards. - Troubleshoot and resolve application defects and issues. - Stay updated with emerging technologies and industry trends to continuously improve application design and development processes. Professional & Technical Skills: - Must To Have Skills: Proficiency in Google BigQuery. - Good To Have Skills: Experience with Teradata BI, Oracle Procedural Language Extensions to SQL (PLSQL). - Strong understanding of data warehousing concepts and principles. - Experience in designing and optimizing data models for efficient data retrieval and analysis. - Hands-on experience with SQL and query optimization techniques.
- Familiarity with ETL processes and tools for data extraction, transformation, and loading. - Knowledge of data governance and data quality best practices. - Excellent problem-solving and analytical skills. Additional Information: - The candidate should have a minimum of 3 years of experience in Google BigQuery. - This position is based at our Mumbai office. - A 15 years full time education is required.
Posted 2 months ago
3.0 - 8.0 years
10 - 18 Lacs
Faridabad
Work from Office
Design and implement scalable data architectures to optimize data flow and analytics capabilities. Develop ETL pipelines, data warehouses, and real-time data processing systems. Must have expertise in SQL, Python, and cloud data platforms like AWS Redshift or Google BigQuery. Work closely with data scientists to enhance machine learning models with structured and unstructured data. Prior experience in handling large-scale datasets is preferred.
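As a hedged illustration of the "real-time data processing systems" this listing names, the sketch below implements a tumbling-window aggregation over a stream of events using only the Python standard library. The event schema (timestamp in seconds, numeric value) is a hypothetical example; production systems would do the same grouping inside Dataflow, Spark Streaming, or similar.

```python
# Sketch: tumbling-window sums over (timestamp_seconds, value) events.
from collections import defaultdict

def tumbling_window_sums(events, window_seconds=60):
    """Assign each event to a fixed-size window and sum values per window."""
    windows = defaultdict(float)
    for ts, value in events:
        window_start = (ts // window_seconds) * window_seconds
        windows[window_start] += value
    return dict(sorted(windows.items()))

events = [(0, 1.0), (30, 2.0), (65, 5.0), (130, 7.0)]
print(tumbling_window_sums(events))  # {0: 3.0, 60: 5.0, 120: 7.0}
```

The window assignment is the core idea; a streaming engine adds watermarking and late-data handling on top of it.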
Posted 2 months ago
3.0 - 8.0 years
10 - 18 Lacs
Vadodara
Work from Office
Design and implement scalable data architectures to optimize data flow and analytics capabilities. Develop ETL pipelines, data warehouses, and real-time data processing systems. Must have expertise in SQL, Python, and cloud data platforms like AWS Redshift or Google BigQuery. Work closely with data scientists to enhance machine learning models with structured and unstructured data. Prior experience in handling large-scale datasets is preferred.
Posted 2 months ago
3.0 - 8.0 years
10 - 18 Lacs
Varanasi
Work from Office
Design and implement scalable data architectures to optimize data flow and analytics capabilities. Develop ETL pipelines, data warehouses, and real-time data processing systems. Must have expertise in SQL, Python, and cloud data platforms like AWS Redshift or Google BigQuery. Work closely with data scientists to enhance machine learning models with structured and unstructured data. Prior experience in handling large-scale datasets is preferred.
Posted 2 months ago
3.0 - 8.0 years
10 - 18 Lacs
Agra
Work from Office
Design and implement scalable data architectures to optimize data flow and analytics capabilities. Develop ETL pipelines, data warehouses, and real-time data processing systems. Must have expertise in SQL, Python, and cloud data platforms like AWS Redshift or Google BigQuery. Work closely with data scientists to enhance machine learning models with structured and unstructured data. Prior experience in handling large-scale datasets is preferred.
Posted 2 months ago
3.0 - 8.0 years
10 - 18 Lacs
Surat
Work from Office
Design and implement scalable data architectures to optimize data flow and analytics capabilities. Develop ETL pipelines, data warehouses, and real-time data processing systems. Must have expertise in SQL, Python, and cloud data platforms like AWS Redshift or Google BigQuery. Work closely with data scientists to enhance machine learning models with structured and unstructured data. Prior experience in handling large-scale datasets is preferred.
Posted 2 months ago
3.0 - 8.0 years
10 - 18 Lacs
Ludhiana
Work from Office
Design and implement scalable data architectures to optimize data flow and analytics capabilities. Develop ETL pipelines, data warehouses, and real-time data processing systems. Must have expertise in SQL, Python, and cloud data platforms like AWS Redshift or Google BigQuery. Work closely with data scientists to enhance machine learning models with structured and unstructured data. Prior experience in handling large-scale datasets is preferred.
Posted 2 months ago
3.0 - 8.0 years
10 - 18 Lacs
Coimbatore
Work from Office
Design and implement scalable data architectures to optimize data flow and analytics capabilities. Develop ETL pipelines, data warehouses, and real-time data processing systems. Must have expertise in SQL, Python, and cloud data platforms like AWS Redshift or Google BigQuery. Work closely with data scientists to enhance machine learning models with structured and unstructured data. Prior experience in handling large-scale datasets is preferred.
Posted 2 months ago
3.0 - 8.0 years
10 - 18 Lacs
Jaipur
Work from Office
Design and implement scalable data architectures to optimize data flow and analytics capabilities. Develop ETL pipelines, data warehouses, and real-time data processing systems. Must have expertise in SQL, Python, and cloud data platforms like AWS Redshift or Google BigQuery. Work closely with data scientists to enhance machine learning models with structured and unstructured data. Prior experience in handling large-scale datasets is preferred.
Posted 2 months ago
8 - 13 years
10 - 15 Lacs
Jaipur, Rajasthan
Work from Office
Job Summary Auriga is looking for a Data Engineer to design and maintain cloud-native data pipelines supporting real-time analytics and machine learning. You'll work with cross-functional teams to build scalable, secure data solutions using GCP (BigQuery, Looker), SQL, Python, and orchestration tools like Dagster and DBT. Mentoring junior engineers and ensuring data best practices will also be part of your role. WHAT YOU'LL DO: Design, build, and maintain scalable data pipelines and architectures to support analytical and operational workloads. Develop and optimize ETL/ELT pipelines, ensuring efficient data extraction, transformation, and loading from various sources. Work closely with backend and platform engineers to integrate data pipelines into cloud-native applications. Manage and optimize cloud data warehouses, primarily BigQuery, ensuring performance, scalability, and cost efficiency. Implement data governance, security, and privacy best practices, ensuring compliance with company policies and regulations. Collaborate with analytics teams to define data models and enable self-service reporting and BI capabilities. Develop and maintain data documentation, including data dictionaries, lineage tracking, and metadata management. Monitor, troubleshoot, and optimize data pipelines, ensuring high availability and reliability. Stay up to date with emerging data engineering technologies and best practices, continuously improving our data infrastructure. WHAT WE'RE LOOKING FOR: Strong proficiency in English (written and verbal communication) is required. Experience working with remote teams across North America and Latin America, ensuring smooth collaboration across time zones. 5+ years of experience in data engineering, with expertise in building scalable data pipelines and cloud-native data architectures. Strong proficiency in SQL for data modeling, transformation, and performance optimization. 
Experience with BI and data visualization tools (e.g., Looker, Tableau, or Google Data Studio). Expertise in Python for data processing, automation, and pipeline development. Experience with cloud data platforms, particularly Google Cloud Platform (GCP). Hands-on experience with Google BigQuery, Cloud Storage, and Pub/Sub. Strong knowledge of ETL/ELT frameworks such as DBT, Dataflow, or Apache Beam. Familiarity with workflow orchestration tools like Dagster, Apache Airflow or Google Cloud Workflows. Understanding of data privacy, security, and compliance best practices. Strong problem-solving skills, with the ability to debug and optimize complex data workflows. Excellent communication and collaboration skills. NICE TO HAVES: Experience with real-time data streaming solutions (e.g., Kafka, Pub/Sub, or Kinesis). Familiarity with machine learning workflows and MLOps best practices. Knowledge of Terraform for Infrastructure as Code (IaC) in data environments. Familiarity with data integrations involving Contentful, Algolia, Segment, and .
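The orchestration tools this listing names (Dagster, Apache Airflow, Cloud Workflows) all share one underlying idea: pipeline tasks form a directed acyclic graph and run in dependency order. A minimal sketch of that idea with the standard library's `graphlib`, using hypothetical task names:

```python
# Sketch: a pipeline DAG resolved into a valid execution order.
from graphlib import TopologicalSorter

# Each task maps to the set of tasks it depends on (hypothetical names).
pipeline = {
    "extract_orders": set(),
    "extract_customers": set(),
    "transform_join": {"extract_orders", "extract_customers"},
    "load_bigquery": {"transform_join"},
}

order = list(TopologicalSorter(pipeline).static_order())
print(order.index("load_bigquery") > order.index("transform_join"))  # True
```

Real orchestrators layer scheduling, retries, and observability on top, but the dependency resolution is the same.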
Posted 2 months ago
5 - 10 years
9 - 19 Lacs
Chennai, Bengaluru, Mumbai (All Areas)
Hybrid
Google BigQuery Location : Pan India Project Role Description : Analyze, design, code and test multiple components of application code across one or more clients. Perform maintenance, enhancements and/or development work. Key Responsibilities : Analyze and model client market and key performance data. Use analytical tools and techniques to develop business insights and improve decision-making.
1: Dataproc, Pub/Sub, Dataflow, Kafka Streaming, Looker, SQL (No FLEX)
2: Proven track record of delivering data integration and data warehousing solutions
3: Strong SQL and hands-on proficiency in the BigQuery SQL language; experience in Shell Scripting and Python (No FLEX)
4: Experience with data integration and migration projects, Oracle SQL
Technical Experience : Google BigQuery
1: Expert in Python (No FLEX); strong hands-on knowledge of SQL (No FLEX); Python programming using Pandas and NumPy; deep understanding of data structures (dictionary, array, list, tree, etc.); experience in pytest and code coverage
2: Experience building solutions using cloud-native services: bucket storage, BigQuery, Cloud Functions, Pub/Sub, Composer and Kubernetes (No FLEX)
3: Proficiency with tools to automate AzDO CI/CD pipelines, such as Control-M, GitHub, JIRA, Confluence
Professional Attributes :
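Cloud Pub/Sub, which this listing requires, decouples producers from consumers through named topics. A hedged, dependency-free sketch of that publish/subscribe shape using the standard library (the broker class and topic name are hypothetical stand-ins, not the GCP API):

```python
# Sketch: toy in-memory broker showing the publish/subscribe decoupling.
import queue

class InMemoryBroker:
    """Publishers push messages to a topic; subscribers pull independently."""
    def __init__(self):
        self._topics = {}

    def publish(self, topic, message):
        self._topics.setdefault(topic, queue.Queue()).put(message)

    def pull(self, topic):
        return self._topics[topic].get_nowait()

broker = InMemoryBroker()
broker.publish("orders", {"id": 1, "amount": 250})
msg = broker.pull("orders")
print(msg)  # {'id': 1, 'amount': 250}
```

The producer never sees the consumer; in GCP the same contract is provided durably by Pub/Sub topics and subscriptions.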
Posted 2 months ago
3 - 6 years
5 - 11 Lacs
Bengaluru, Hyderabad, Mumbai (All Areas)
Work from Office
Project Role : Application Designer Project Role Description : Assist in defining requirements and designing applications to meet business process and application requirements. Must have skills : Google BigQuery Good to have skills : Teradata BI, Oracle Procedural Language Extensions to SQL (PLSQL) Minimum 3 year(s) of experience is required Educational Qualification : 15 years full time education Summary: As an Application Designer, you will assist in defining requirements and designing applications to meet business process and application requirements. You will play a crucial role in ensuring the successful implementation of applications. Your typical day will involve collaborating with cross-functional teams, analyzing business needs, and designing innovative solutions to enhance application functionality and user experience. Roles & Responsibilities: - Expected to perform independently and become an SME. - Required active participation/contribution in team discussions. - Contribute to providing solutions to work-related problems. - Collaborate with cross-functional teams to gather and analyze business requirements. - Design and develop applications to meet business process and application requirements. - Create technical specifications and documentation for application design and development. - Perform code reviews and ensure adherence to coding standards. - Troubleshoot and resolve application defects and issues. - Stay updated with emerging technologies and industry trends to continuously improve application design and development processes. Professional & Technical Skills: - Must To Have Skills: Proficiency in Google BigQuery. - Good To Have Skills: Experience with Teradata BI, Oracle Procedural Language Extensions to SQL (PLSQL). - Strong understanding of data warehousing concepts and principles. - Experience in designing and optimizing data models for efficient data retrieval and analysis. - Hands-on experience with SQL and query optimization techniques.
- Familiarity with ETL processes and tools for data extraction, transformation, and loading. - Knowledge of data governance and data quality best practices. - Excellent problem-solving and analytical skills. Additional Information: - The candidate should have a minimum of 3 years of experience in Google BigQuery. - This position is based at our Mumbai office. - A 15 years full time education is required.
Posted 2 months ago