
1356 BigQuery Jobs - Page 13

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

4.0 - 9.0 years

12 - 17 Lacs

Chennai

Work from Office

Job Summary
Synechron is seeking an experienced Data Processing Engineer to lead the development of large-scale data processing solutions using Java, Apache Flink/Storm/Beam, and Google Cloud Platform (GCP). In this role, you will collaborate across teams to design, develop, and optimize data-intensive applications that support strategic business objectives. Your expertise will help evolve our data architecture, improve processing efficiency, and ensure the delivery of reliable, scalable solutions in an Agile environment.

Software Requirements
Required:
- Java (version 8 or higher)
- Apache Flink, Storm, or Beam for streaming data processing
- Google Cloud Platform (GCP) services, especially BigQuery and related data tools
- Experience with databases such as BigQuery, Oracle, or equivalent
- Familiarity with version control tools such as Git
Preferred:
- Cloud deployment experience, with GCP in particular
- Additional familiarity with containerization (Docker/Kubernetes)
- Knowledge of CI/CD pipelines and DevOps practices

Overall Responsibilities
- Collaborate closely with cross-functional teams to understand data and system requirements, then design scalable solutions aligned with business needs.
- Develop detailed technical specifications, implementation plans, and documentation for new features and enhancements.
- Implement, test, and deploy data processing applications using Java and Apache Flink/Storm/Beam within GCP environments.
- Conduct code reviews to ensure quality, security, and maintainability, supporting team members' growth and best practices.
- Troubleshoot technical issues, resolve bottlenecks, and optimize application performance and resource utilization.
- Stay current with advancements in data processing, cloud technology, and Java development to continuously improve solutions.
- Support testing teams to verify data workflows and validation processes, ensuring reliability and accuracy.
- Participate in Agile ceremonies, including sprint planning, stand-ups, and retrospectives, to ensure continuous delivery and process improvement.

Technical Skills (By Category)
- Programming Languages: Required: Java (8+). Preferred: Python, Scala, or Node.js for scripting or auxiliary processing.
- Databases/Data Management: Experience with BigQuery, Oracle, or similar relational data stores.
- Cloud Technologies: GCP (BigQuery, Cloud Storage, Dataflow, etc.) with hands-on experience in cloud data solutions.
- Frameworks and Libraries: Apache Flink, Storm, or Beam for stream processing; Java SDKs, APIs, and data integration libraries.
- Development Tools and Methodologies: Git, Jenkins, JIRA, and Agile/Scrum practices; familiarity with containerization (Docker, Kubernetes) is a plus.
- Security and Compliance: Understanding of data security principles in cloud environments.

Experience Requirements
- 4+ years of experience in software development, with a focus on data processing and Java-based backend development
- Proven experience working with Apache Flink, Storm, or Beam in production environments
- Strong background in managing large data workflows and pipeline optimization
- Experience with GCP data services and cloud-native development
- Demonstrated success in Agile projects, including collaboration with cross-functional teams
- Previous leadership or mentorship experience is a plus

Day-to-Day Activities
- Design, develop, and deploy scalable data processing applications in Java using Flink/Storm/Beam on GCP
- Collaborate with data engineers, analysts, and architects to translate business needs into technical solutions
- Conduct code reviews, optimize data pipelines, and troubleshoot system issues swiftly
- Document technical specifications, data schemas, and process workflows
- Participate actively in Agile ceremonies, provide updates on task progress, and suggest process improvements
- Support continuous integration and deployment of data applications
- Mentor junior team members, sharing best practices and technical insights

Qualifications
- Bachelor's or Master's degree in Computer Science, Information Technology, or equivalent
- Relevant certifications in cloud technologies or data processing (preferred)
- Evidence of continuous professional development and staying current with industry trends

Professional Competencies
- Strong analytical and problem-solving skills focused on data processing challenges
- Leadership abilities to guide, mentor, and develop team members
- Excellent communication skills for technical documentation and stakeholder engagement
- Adaptability to rapidly changing technologies and project priorities
- Capacity to prioritize tasks and manage time efficiently under tight deadlines
- Innovative mindset to leverage new tools and techniques for performance improvements

SYNECHRON'S DIVERSITY & INCLUSION STATEMENT
Diversity & Inclusion are fundamental to our culture, and Synechron is proud to be an equal opportunity workplace and an affirmative action employer. Our Diversity, Equity, and Inclusion (DEI) initiative "Same Difference" is committed to fostering an inclusive culture promoting equality, diversity, and an environment that is respectful to all. We strongly believe that a diverse workforce helps build stronger, successful businesses as a global company. We encourage applicants from across diverse backgrounds, races, ethnicities, religions, ages, marital statuses, genders, sexual orientations, and disabilities to apply. We empower our global workforce by offering flexible workplace arrangements, mentoring, internal mobility, learning and development programs, and more.
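The streaming pattern this role centres on (consume events, transform them, land them in BigQuery) can be illustrated with a minimal Apache Beam sketch. The posting emphasizes Java; the snippet below uses Beam's Python SDK purely for brevity, and the project, topic, table, and schema names are placeholders rather than details from the listing.

```python
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

# Hypothetical streaming pipeline: Pub/Sub -> parse JSON -> BigQuery.
options = PipelineOptions(
    streaming=True,
    project="example-project",               # placeholder project
    temp_location="gs://example-bucket/tmp",  # placeholder bucket
)

with beam.Pipeline(options=options) as p:
    (
        p
        | "ReadEvents" >> beam.io.ReadFromPubSub(
            topic="projects/example-project/topics/events")
        | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
        | "WriteToBQ" >> beam.io.WriteToBigQuery(
            "example-project:analytics.events",
            schema="user_id:STRING,event_type:STRING,event_ts:TIMESTAMP",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
        )
    )
```

Run on Dataflow, the same pipeline definition scales horizontally; only the runner and worker options change.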

Posted 2 weeks ago

Apply

7.0 - 12.0 years

22 - 27 Lacs

Kochi, Chennai, Bengaluru

Hybrid

Ab Initio ETL Application Developer
ETL developer with Ab Initio experience for Data Warehouse and Data Mart applications within the healthcare insurance business. The position requires strong proficiency in data integration (ETL) and involvement in all phases of application development. When you join the team, you are joining a team of elite, passionate software professionals who take pride in engineering excellence and creative solutions that add value to our stakeholders. You will have plenty of opportunities to showcase your talent, learn new technologies, and have a rewarding and fulfilling career.

Responsibilities
• Develop complex programs from detailed technical specifications.
• Design, code, test, debug, and document those programs. Competent to work at the highest technical level across all phases of applications systems analysis and programming activities.
• Independently design and/or code the development of cost-effective application and program solutions.
• Independently perform ongoing system maintenance, research, problem resolution, and on-call support tasks for existing systems.
• Be fully familiar and compliant with the prescribed methodologies and ensure compliance in all work performed.
• Perform unit testing. May perform or assist with integration and system testing, according to detailed test plans, to ensure high-quality systems. May assist business partners with User Acceptance Testing.
• Follow all procedures and directions to ensure Code Asset Management for an application or set of applications.
• Support and promote the reuse of assets across the organization.

Required Qualifications
• Expertise in one or more programming languages, development tools, and/or databases, and in the systems development life cycle applicable to the development organization.
• 7+ years of Ab Initio and Data Warehousing experience in a parallel processing environment required
• 5+ years of SQL experience, with advanced SQL coding in an enterprise setting
• Hadoop, Pig, Hive, Scope, HQL experience
• Strong Unix KShell scripting desired
• Ab Initio and Data Warehousing coding, testing, and debugging experience required
• Solid analytical and software development skills
• Ability to optimize SQL code for efficiency
• GCP cloud technologies, including BigQuery

Preferred Qualifications
• Healthcare domain experience
• ZEKE knowledge a plus
• Working knowledge of mainframe and midrange environments
• Experience with application development support software packages
• Experience working in an Agile framework such as SAFe
• Affiliations with a technical or professional organization or user group
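The listing pairs advanced SQL with GCP BigQuery. A minimal, hypothetical sketch of running a parameterized query through the official Python client follows; the project, dataset, table, and column names are invented for illustration and are not from the posting.

```python
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # placeholder project

# Hypothetical claims table and columns, for illustration only.
query = """
    SELECT claim_id, SUM(paid_amount) AS total_paid
    FROM `example-project.claims_dw.claim_lines`
    WHERE service_date >= @start_date
    GROUP BY claim_id
    ORDER BY total_paid DESC
    LIMIT 100
"""
job_config = bigquery.QueryJobConfig(
    query_parameters=[
        bigquery.ScalarQueryParameter("start_date", "DATE", "2024-01-01"),
    ]
)

for row in client.query(query, job_config=job_config).result():
    print(row.claim_id, row.total_paid)
```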

Posted 2 weeks ago

Apply

3.0 - 8.0 years

5 - 9 Lacs

Chennai

Work from Office

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: PySpark
Good to have skills: Java Enterprise Edition, Google BigQuery, Google Cloud Platform Architecture
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary:
As an Application Developer, you will engage in the design, construction, and configuration of applications tailored to fulfill specific business processes and application requirements. Your typical day will involve collaborating with team members to understand project needs, developing innovative solutions, and ensuring that applications are optimized for performance and usability. You will also participate in testing and debugging processes to ensure the applications function as intended, while continuously seeking ways to enhance application efficiency and user experience.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Assist in the documentation of application specifications and user guides.
- Engage in code reviews to ensure adherence to best practices and standards.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in PySpark.
- Good-To-Have Skills: Experience with Java Enterprise Edition, Google BigQuery, Google Cloud Platform Architecture.
- Strong understanding of data processing frameworks and distributed computing.
- Experience with the application development lifecycle and methodologies.
- Familiarity with database management systems and data modeling.

Additional Information:
- The candidate should have a minimum of 3 years of experience in PySpark.
- This position is based at our Chennai office.
- 15 years of full-time education is required.

Qualification: 15 years full time education
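A role like this typically pairs PySpark with BigQuery through the open-source Spark BigQuery connector. The sketch below is a minimal illustration under that assumption; the project, dataset, table, and bucket names are placeholders, and the connector jar is assumed to be available on the cluster (as it is on Dataproc).

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("bigquery-etl-example").getOrCreate()

# Read a BigQuery table into a Spark DataFrame (placeholder table).
orders = (spark.read.format("bigquery")
          .option("table", "example-project.sales.orders")
          .load())

# Aggregate daily revenue from an order timestamp and amount column.
daily = (orders
         .groupBy(F.to_date("order_ts").alias("order_date"))
         .agg(F.sum("amount").alias("revenue")))

# Write the result back to BigQuery; the connector stages data via GCS.
(daily.write.format("bigquery")
 .option("table", "example-project.sales.daily_revenue")
 .option("temporaryGcsBucket", "example-temp-bucket")   # placeholder bucket
 .mode("overwrite")
 .save())
```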

Posted 2 weeks ago

Apply

15.0 - 20.0 years

3 - 7 Lacs

Bengaluru

Work from Office

Project Role: Business and Integration Practitioner
Project Role Description: Assists in documenting the integration strategy, endpoints, and data flows. Is familiar with the entire project life-cycle, including requirements analysis, coding, testing, deployment, and operations, to ensure successful integration. Under the guidance of the Architect, ensures the integration strategy meets business goals.
Must have skills: SAP Configure Price & Quote
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary:
As a Business and Integration Practitioner, you will assist in documenting the integration strategy, endpoints, and data flows. Your typical day will involve collaborating with various teams to ensure that the integration strategy aligns with business objectives. You will engage in discussions to analyze requirements, participate in coding and testing activities, and contribute to deployment efforts. Your role will also include monitoring operations to ensure successful integration throughout the project life-cycle, all while working under the guidance of the Architect to meet the overall business goals effectively.

Roles & Responsibilities:
- Expected to be an SME; collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities and foster a collaborative environment.
- Monitor project progress and provide regular updates to stakeholders to ensure alignment with business objectives.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in SAP Configure Price & Quote.
- Strong understanding of integration strategies and data flow documentation.
- Experience with project life-cycle management, including requirements analysis and deployment.
- Ability to collaborate effectively with cross-functional teams to achieve project goals.
- Familiarity with testing methodologies to ensure the quality and performance of integrations.

Additional Information:
- The candidate should have a minimum of 5 years of experience in SAP Configure Price & Quote.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Qualification: 15 years full time education

Posted 2 weeks ago

Apply

3.0 - 8.0 years

5 - 9 Lacs

Chennai

Work from Office

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Apache Spark
Good to have skills: MySQL, Python (Programming Language), Google BigQuery
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary:
As an Application Developer, you will engage in the design, construction, and configuration of applications tailored to fulfill specific business processes and application requirements. Your typical day will involve collaborating with team members to understand project needs, developing innovative solutions, and ensuring that applications are optimized for performance and usability. You will also participate in testing and debugging processes to ensure the applications function as intended, while continuously seeking ways to enhance application efficiency and user experience.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Assist in the documentation of application specifications and user guides.
- Engage in code reviews to ensure quality and adherence to best practices.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in Apache Spark.
- Good-To-Have Skills: Experience with MySQL, Python (Programming Language), Google BigQuery.
- Strong understanding of data processing frameworks and distributed computing.
- Experience in developing and deploying applications in cloud environments.
- Familiarity with version control systems such as Git.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Apache Spark.
- This position is based at our Chennai office.
- 15 years of full-time education is required.

Qualification: 15 years full time education
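The good-to-have stack here (MySQL alongside Spark) suggests a common pattern: pull an operational table over JDBC, clean it up in Spark, and stage the result on Cloud Storage. A minimal, hypothetical sketch follows; the connection details, column names, and bucket are placeholders, and the MySQL JDBC driver is assumed to be on the Spark classpath.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("mysql-to-gcs").getOrCreate()

# Read a source table over JDBC (placeholder connection details).
customers = (spark.read.format("jdbc")
             .option("url", "jdbc:mysql://db-host:3306/crm")
             .option("dbtable", "customers")
             .option("user", "etl_user")
             .option("password", "change-me")
             .load())

# Basic cleanup: trim names, drop rows without an email, dedupe by customer_id.
cleaned = (customers
           .withColumn("full_name", F.trim("full_name"))
           .filter(F.col("email").isNotNull())
           .dropDuplicates(["customer_id"]))

# Stage as Parquet on Cloud Storage (placeholder bucket).
cleaned.write.mode("overwrite").parquet("gs://example-bucket/staging/customers/")
```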

Posted 2 weeks ago

Apply

5.0 - 8.0 years

2 - 6 Lacs

Mumbai

Work from Office

Skill required: Delivery - Signal Tag Management
Designation: I&F Decision Science Practitioner Senior Analyst
Qualifications: Any Graduation
Years of Experience: 5 to 8 Years

About Accenture
Combining unmatched experience and specialized skills across more than 40 industries, we offer Strategy and Consulting, Technology and Operations services, and Accenture Song, all powered by the world's largest network of Advanced Technology and Intelligent Operations centers. Our 699,000 people deliver on the promise of technology and human ingenuity every day, serving clients in more than 120 countries. Visit us at www.accenture.com

What would you do?
Data & AI: work with an enterprise tag management platform used to manage website advertising and analytics tags. Configure, implement, and update tags and libraries using JavaScript within the IT development cycle; debug and understand which tags are firing on a website and why.

What are we looking for?
- Web analytics tag implementation
- Google Tag Manager (GTM) implementation
- Google Analytics tag implementation
- Strong expertise in GA4
- Experience in building Looker/Data Studio dashboards or Tableau dashboards
- Experience with tag auditing and testing
- Strong JavaScript skills
- Data layer understanding
- GTM/Google Analytics certification preferred
- Ability to meet deadlines
- Adaptable and flexible
- Ability to work well in a team
- Ability to handle disputes
- Commitment to quality

Roles and Responsibilities:
- Expert in digital analytics implementation, working with international clients for 6-8 years
- Understanding functional and technical requirements
- Extensive experience with analytics tag implementation using Google Tag Manager
- Extensive experience with Google Analytics 4 along with GTM
- Excellent JavaScript skills required for analytics implementation
- Develop custom solutions using JavaScript within the tag management system to track user behavior and send the data to web analytics tools for building further analytics reporting capabilities
- Experience with data visualization tools such as Looker/Data Studio or Tableau dashboards
- Exposure to digital marketing, campaign analytics, and web reporting
- Manage expectations around data extraction from analytics tools, reporting delivery, and implementation efforts
- Understanding of the website data layer and mapping of the website data layer to GTM tags
- Excellent written and oral communication skills to build relationships with clients, keeping geographical and cultural differences in mind
- Strong analytical and problem-solving skills; ability to understand the business context and apply analytical concepts to provide business solutions
- Willingness to continuously learn and upgrade skills
- Ability to work both independently and in a team-oriented environment
- Interest in working in digital marketing and a passion for analytics
- Hands-on experience with third-party tracking implementations using tag manager templates
- Nice to have: experience with GA/GTM APIs and building automation tools for analytics reporting and implementation is an added advantage

Qualification: Any Graduation

Posted 2 weeks ago

Apply

3.0 - 8.0 years

5 - 9 Lacs

Chennai

Work from Office

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Apache Spark
Good to have skills: MySQL, Python (Programming Language), Google BigQuery
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary:
As an Application Developer, you will engage in the design, construction, and configuration of applications tailored to fulfill specific business processes and application requirements. Your typical day will involve collaborating with team members to understand project needs, developing innovative solutions, and ensuring that applications are optimized for performance and usability. You will also participate in testing and debugging processes to guarantee the quality of the applications you create, while continuously seeking ways to enhance functionality and user experience.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Collaborate with cross-functional teams to gather and analyze requirements for application development.
- Participate in code reviews to ensure adherence to best practices and coding standards.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in Apache Spark.
- Good-To-Have Skills: Experience with MySQL, Python (Programming Language), Google BigQuery.
- Strong understanding of data processing and transformation techniques.
- Experience with application development frameworks and methodologies.
- Familiarity with cloud computing platforms and services.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Apache Spark.
- This position is based at our Chennai office.
- 15 years of full-time education is required.

Qualification: 15 years full time education

Posted 2 weeks ago

Apply

10.0 - 14.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Oracle Procedural Language Extensions to SQL (PLSQL)
Good to have skills: Python (Programming Language)
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary:
As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing solutions, and ensuring that applications function seamlessly within the existing infrastructure. You will engage in problem-solving activities, participate in team meetings, and contribute to the overall success of projects by delivering high-quality applications that align with business objectives.

Roles & Responsibilities:
- Identify cost optimization opportunities, such as rightsizing, reservations, savings plans, and technology upgrades.
- Oversee and guide the analytics team, fostering an environment of collaboration and continuous improvement; ensure the team delivers high-quality data insights and business intelligence solutions.
- Use advanced SQL, BigQuery, and BI tools to perform deep data analysis, extract insights, and create tailored dashboards and reports for internal stakeholders.
- Leverage in-depth knowledge of cloud services to generate data-driven cost optimization strategies and actionable insights that align with business objectives.
- Serve as the main point of contact between technical teams and various internal departments to understand requirements, bridge gaps, and facilitate effective communication of technical information.

Professional & Technical Skills:
- Must-Have Skills: Cloud FinOps, cloud cost management, data analytics and warehousing; strong skills in SQL, BigQuery, Python, and business intelligence tools (e.g., Power BI, Tableau).
- Strong understanding of AWS, Azure, and GCP platforms, including their pricing models and service offerings.
- Analytical thinking with a proactive approach to overcoming challenges.
- Knowledge of SAP Cloud Platform services and integration capabilities.
- Familiarity with SAP Fiori design principles and UI5 development.
- Good-To-Have Skills: Cloud infrastructure management, DevOps, SDLC.
- Recommendation: a blend of data engineering with cloud and FinOps.

Additional Information:
- The candidate should have a minimum of 10 years of experience in cloud infrastructure, analytics and insights, and Cloud FinOps analytics (minimum 3 years relevant to Cloud FinOps).
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Qualification: 15 years full time education
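One concrete FinOps task implied above is querying the GCP billing export in BigQuery to surface the top cost drivers. A minimal sketch, assuming a standard billing export table is already configured; the project, dataset, and table suffix are placeholders.

```python
from google.cloud import bigquery

client = bigquery.Client()

# Standard GCP billing export table name pattern; replace with your own.
query = """
    SELECT
      service.description AS service,
      SUM(cost) AS total_cost
    FROM `example-project.billing.gcp_billing_export_v1_XXXXXX`
    WHERE usage_start_time >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 30 DAY)
    GROUP BY service
    ORDER BY total_cost DESC
    LIMIT 10
"""

# Print the ten most expensive services over the last 30 days.
for row in client.query(query).result():
    print(f"{row.service}: {row.total_cost:,.2f}")
```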

Posted 2 weeks ago

Apply

8.0 - 13.0 years

20 - 30 Lacs

Gurugram

Work from Office

Hi, greetings from GSN! Pleasure connecting with you.

GSN has been in corporate search services, identifying and bringing in stellar, talented professionals for reputed IT and non-IT clients in India, and has been successfully delivering on our clients' needs for the last 20 years. At present, GSN is hiring a GCP ENGINEER for one of our leading MNC clients. Please find below the details:

1. Work Location: Gurugram
2. Job Role: GCP Engineer
3. Experience: 8+ yrs
4. CTC Range: Rs. 20 LPA to Rs. 30 LPA
5. Work Type: WFO (Hybrid)

****** Looking for IMMEDIATE JOINERS ******

Who are we looking for? An MLOps Engineer with AWS experience.

Required Skills:
- GCP Arch. Certification
- Terraform
- GitLab
- Shell scripting
- GCP services: Compute Engine, Cloud Storage, Dataflow, BigQuery, IAM

Best regards,
Kaviya | GSN | Kaviya@gsnhr.net | 9150016092 | Google review: https://g.co/kgs/UAsF9W

Posted 2 weeks ago

Apply

5.0 - 10.0 years

3 - 7 Lacs

Gurugram

Work from Office

Project Role: Business and Integration Practitioner
Project Role Description: Assists in documenting the integration strategy, endpoints, and data flows. Is familiar with the entire project life-cycle, including requirements analysis, coding, testing, deployment, and operations, to ensure successful integration. Under the guidance of the Architect, ensures the integration strategy meets business goals.
Must have skills: ALIP Product Configuration
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary:
As a Business and Integration Practitioner, you will assist in documenting the integration strategy, endpoints, and data flows. You will be involved in the entire project life-cycle, from requirements analysis to deployment, ensuring successful integration under the guidance of the Architect.

Roles & Responsibilities:
- Expected to be an SME; collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Ensure the integration strategy endpoints and data flows are documented effectively.
- Support the Architect in aligning the integration strategy with business goals.
- Coordinate with stakeholders to gather integration requirements.
- Contribute to the coding, testing, and deployment phases of the project.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in ALIP Product Configuration.
- Strong understanding of integration strategies and data flows.
- Experience in requirements analysis and coding for integration projects.
- Knowledge of testing methodologies for integration solutions.
- Familiarity with deployment and operations of integrated systems.

Additional Information:
- The candidate should have a minimum of 5 years of experience in ALIP Product Configuration.
- This position is based at our Indore office.
- 15 years of full-time education is required.

Qualification: 15 years full time education

Posted 2 weeks ago

Apply

3.0 - 8.0 years

4 - 8 Lacs

Bengaluru

Work from Office

Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must have skills: Google BigQuery
Good to have skills: Microsoft SQL Server, Google Cloud Data Services
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary:
As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and deliver effective solutions that meet business needs, while also troubleshooting any issues that arise in the data flow and processing stages.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Assist in the optimization of data pipelines for performance and efficiency.
- Collaborate with data analysts and other stakeholders to understand data needs and deliver appropriate solutions.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in Google BigQuery.
- Good-To-Have Skills: Experience with Microsoft SQL Server, Google Cloud Data Services.
- Strong understanding of data modeling and database design principles.
- Experience with data integration tools and ETL processes.
- Familiarity with cloud-based data storage and processing solutions.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Google BigQuery.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Qualification: 15 years full time education
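A routine task for a BigQuery-focused data engineer is batch-loading files from Cloud Storage into BigQuery as one step of an ETL pipeline. A minimal sketch using the official Python client; the table ID and GCS URI below are placeholders, not details from the listing.

```python
from google.cloud import bigquery

client = bigquery.Client()

table_id = "example-project.analytics.page_views"       # placeholder destination
job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,
    autodetect=True,                                     # let BigQuery infer the schema
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
)

load_job = client.load_table_from_uri(
    "gs://example-bucket/exports/page_views_*.csv",      # placeholder source files
    table_id,
    job_config=job_config,
)
load_job.result()  # wait for completion; raises on error

print(f"Loaded {client.get_table(table_id).num_rows} rows into {table_id}")
```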

Posted 2 weeks ago

Apply

1.0 - 3.0 years

2 - 5 Lacs

Mumbai

Work from Office

Relationship Manager - TASC & Institutional Business
The TASC & Institutional Business vertical is part of the Retail Liabilities division of our Consumer Bank segment. It is Kotak's dedicated sales/relationship channel for customer segments such as Trusts, Societies, Educational bodies, Associations, Embassy/Diplomatic Missions, Govt. Departments/PSUs, etc.

Key Functions & Responsibilities:
- The TASC business segment comprises not-for-profit business entities (other than Govt Departments - Central, State or Local Bodies) registered as Trusts, Societies, Sec 25/Sec 8 Companies, and Cooperative Societies. The TASC business segment is further categorised into various business sub-segments, which form a very large universe for a TASC RM to work on for acquisition.
- The various sub-segments are Education (Pre Schools, Primary Schools, Elementary Schools, Secondary Schools, Higher Secondary Schools, Colleges, Universities, Technology & Management Institutes, Professional & Technical Institutes, Coaching Institutes, Training Institutes, Examination Boards, etc.), FCRA entities (those that have received permission from the Ministry of Home Affairs to receive foreign donations), Cooperative Societies (Housing, Marketing & Credit Coop Societies), Hospitals, Clubs (Professional, Business, City, Sports, Lifestyle Clubs), NGOs & Foundations, Associations (Market, Trade, Professional, Industry, Sports Associations), Research Bodies, Religious Institutions (Temples, Gurudwaras, Mosques, Churches), Primary Agricultural Societies, and CSR & Retirals (PF Trusts, Gratuity Trusts, Superannuation Trusts).
- Should have an eye for acquiring high-value relationships.
- Should possess negotiation skills and the requisite skill sets for making presentations to senior management to seal deals.
- Create a rapport with the top management of all clientele.
- Good communication and presentation skills, negotiation skills with the ability to interact with people at various levels of the organization and the outside environment, and strong sales and relationship management skills.

Educational Qualifications: Should be an MBA/Graduate/Post Graduate with 1-3 years in a similar profile.

Posted 2 weeks ago

Apply

1.0 - 3.0 years

3 - 7 Lacs

Gurugram

Work from Office

Relationship Manager - TASC & Institutional Business
The TASC & Institutional Business vertical is part of the Retail Liabilities division of our Consumer Bank segment. It is Kotak's dedicated sales/relationship channel for customer segments such as Trusts, Societies, Educational bodies, Associations, Embassy/Diplomatic Missions, Govt. Departments/PSUs, etc.

Key Functions & Responsibilities:
- The TASC business segment comprises not-for-profit business entities (other than Govt Departments - Central, State or Local Bodies) registered as Trusts, Societies, Sec 25/Sec 8 Companies, and Cooperative Societies. The TASC business segment is further categorised into various business sub-segments, which form a very large universe for a TASC RM to work on for acquisition.
- The various sub-segments are Education (Pre Schools, Primary Schools, Elementary Schools, Secondary Schools, Higher Secondary Schools, Colleges, Universities, Technology & Management Institutes, Professional & Technical Institutes, Coaching Institutes, Training Institutes, Examination Boards, etc.), FCRA entities (those that have received permission from the Ministry of Home Affairs to receive foreign donations), Cooperative Societies (Housing, Marketing & Credit Coop Societies), Hospitals, Clubs (Professional, Business, City, Sports, Lifestyle Clubs), NGOs & Foundations, Associations (Market, Trade, Professional, Industry, Sports Associations), Research Bodies, Religious Institutions (Temples, Gurudwaras, Mosques, Churches), Primary Agricultural Societies, and CSR & Retirals (PF Trusts, Gratuity Trusts, Superannuation Trusts).
- Should have an eye for acquiring high-value relationships.
- Should possess negotiation skills and the requisite skill sets for making presentations to senior management to seal deals.
- Create a rapport with the top management of all clientele.
- Good communication and presentation skills, negotiation skills with the ability to interact with people at various levels of the organization and the outside environment, and strong sales and relationship management skills.

Educational Qualifications: Should be an MBA/Graduate/Post Graduate with 1-3 years in a similar profile.

Posted 2 weeks ago

Apply

1.0 - 3.0 years

3 - 7 Lacs

Surat

Work from Office

Relationship Manager - TASC & Institutional Business
The TASC & Institutional Business vertical is part of the Retail Liabilities division of our Consumer Bank segment. It is Kotak's dedicated sales/relationship channel for customer segments such as Trusts, Societies, Educational bodies, Associations, Embassy/Diplomatic Missions, Govt. Departments/PSUs, etc.

Key Functions & Responsibilities:
- The TASC business segment comprises not-for-profit business entities (other than Govt Departments - Central, State or Local Bodies) registered as Trusts, Societies, Sec 25/Sec 8 Companies, and Cooperative Societies. The TASC business segment is further categorised into various business sub-segments, which form a very large universe for a TASC RM to work on for acquisition.
- The various sub-segments are Education (Pre Schools, Primary Schools, Elementary Schools, Secondary Schools, Higher Secondary Schools, Colleges, Universities, Technology & Management Institutes, Professional & Technical Institutes, Coaching Institutes, Training Institutes, Examination Boards, etc.), FCRA entities (those that have received permission from the Ministry of Home Affairs to receive foreign donations), Cooperative Societies (Housing, Marketing & Credit Coop Societies), Hospitals, Clubs (Professional, Business, City, Sports, Lifestyle Clubs), NGOs & Foundations, Associations (Market, Trade, Professional, Industry, Sports Associations), Research Bodies, Religious Institutions (Temples, Gurudwaras, Mosques, Churches), Primary Agricultural Societies, and CSR & Retirals (PF Trusts, Gratuity Trusts, Superannuation Trusts).
- Should have an eye for acquiring high-value relationships.
- Should possess negotiation skills and the requisite skill sets for making presentations to senior management to seal deals.
- Create a rapport with the top management of all clientele.
- Good communication and presentation skills, negotiation skills with the ability to interact with people at various levels of the organization and the outside environment, and strong sales and relationship management skills.

Educational Qualifications: Should be an MBA/Graduate/Post Graduate with 1-3 years in a similar profile.

Posted 2 weeks ago

Apply

1.0 - 3.0 years

3 - 7 Lacs

Mumbai

Work from Office

Relationship Manager - TASC & Institutional Business
The TASC & Institutional Business vertical is part of the Retail Liabilities division of our Consumer Bank segment. It is Kotak's dedicated sales/relationship channel for customer segments such as Trusts, Societies, Educational bodies, Associations, Embassy/Diplomatic Missions, Govt. Departments/PSUs, etc.

Key Functions & Responsibilities:
- The TASC business segment comprises not-for-profit business entities (other than Govt Departments - Central, State or Local Bodies) registered as Trusts, Societies, Sec 25/Sec 8 Companies, and Cooperative Societies. The TASC business segment is further categorised into various business sub-segments, which form a very large universe for a TASC RM to work on for acquisition.
- The various sub-segments are Education (Pre Schools, Primary Schools, Elementary Schools, Secondary Schools, Higher Secondary Schools, Colleges, Universities, Technology & Management Institutes, Professional & Technical Institutes, Coaching Institutes, Training Institutes, Examination Boards, etc.), FCRA entities (those that have received permission from the Ministry of Home Affairs to receive foreign donations), Cooperative Societies (Housing, Marketing & Credit Coop Societies), Hospitals, Clubs (Professional, Business, City, Sports, Lifestyle Clubs), NGOs & Foundations, Associations (Market, Trade, Professional, Industry, Sports Associations), Research Bodies, Religious Institutions (Temples, Gurudwaras, Mosques, Churches), Primary Agricultural Societies, and CSR & Retirals (PF Trusts, Gratuity Trusts, Superannuation Trusts).
- Should have an eye for acquiring high-value relationships.
- Should possess negotiation skills and the requisite skill sets for making presentations to senior management to seal deals.
- Create a rapport with the top management of all clientele.
- Good communication and presentation skills, negotiation skills with the ability to interact with people at various levels of the organization and the outside environment, and strong sales and relationship management skills.

Educational Qualifications: Should be an MBA/Graduate/Post Graduate with 1-3 years in a similar profile.

Posted 2 weeks ago

Apply

1.0 - 3.0 years

2 - 5 Lacs

Kolkata

Work from Office

Relationship Manager - TASC & Institutional Business
The TASC & Institutional Business vertical is part of the Retail Liabilities division of our Consumer Bank segment. It is Kotak's dedicated sales/relationship channel for customer segments such as Trusts, Societies, Educational bodies, Associations, Embassy/Diplomatic Missions, Govt. Departments/PSUs, etc.

Key Functions & Responsibilities:
- The TASC business segment comprises not-for-profit business entities (other than Govt Departments - Central, State or Local Bodies) registered as Trusts, Societies, Sec 25/Sec 8 Companies, and Cooperative Societies. The TASC business segment is further categorised into various business sub-segments, which form a very large universe for a TASC RM to work on for acquisition.
- The various sub-segments are Education (Pre Schools, Primary Schools, Elementary Schools, Secondary Schools, Higher Secondary Schools, Colleges, Universities, Technology & Management Institutes, Professional & Technical Institutes, Coaching Institutes, Training Institutes, Examination Boards, etc.), FCRA entities (those that have received permission from the Ministry of Home Affairs to receive foreign donations), Cooperative Societies (Housing, Marketing & Credit Coop Societies), Hospitals, Clubs (Professional, Business, City, Sports, Lifestyle Clubs), NGOs & Foundations, Associations (Market, Trade, Professional, Industry, Sports Associations), Research Bodies, Religious Institutions (Temples, Gurudwaras, Mosques, Churches), Primary Agricultural Societies, and CSR & Retirals (PF Trusts, Gratuity Trusts, Superannuation Trusts).
- Should have an eye for acquiring high-value relationships.
- Should possess negotiation skills and the requisite skill sets for making presentations to senior management to seal deals.
- Create a rapport with the top management of all clientele.
- Good communication and presentation skills, negotiation skills with the ability to interact with people at various levels of the organization and the outside environment, and strong sales and relationship management skills.

Educational Qualifications: Should be an MBA/Graduate/Post Graduate with 1-3 years in a similar profile.

Posted 2 weeks ago

Apply

8.0 - 13.0 years

35 - 60 Lacs

Bengaluru, Mumbai (All Areas)

Work from Office

Role:
The Data Engineer's role is to design and build data-centric solutions that provide actionable business insights and deepen analytic capabilities in the JSW One Platform systems. The Data Engineer will be responsible for taking requirements, analysing business processes, and developing applications either directly or with the help of vendors. This includes ETL/ELT coding, database and architecture design in a cloud environment and its implementation, and working closely with Data Scientists and Data Visualization developers on solution design.

Responsibilities and key skills:
- Architect, plan, and develop our data strategy and work with our team to turn it into working software.
- Co-create the detailed application architecture strategy with the teams, aligning your strategy with the teams' deliverables.
- Take an active role in collaborating to develop strategic direction, the systems roadmap, and business and operational processes by providing the required technical guidance.
- Work with business stakeholders, product managers, and architects to understand the business data and related processes.
- Own the communication and documentation of the strategy, syndicating with the CTO, business, and the development teams.
- Get hands-on in the code, building prototypes and upholding best design and engineering practice, demonstrating the patterns you would like realized.
- Extensive knowledge of data integration architecture/ETL/data warehousing, especially for large volumes with complex integrations.
- Help the teams slice their deliveries to allow for agile and incremental delivery of business value.
- Consult development teams towards building applications in a cloud-native way.
- Evaluate tools, services, and new technologies and suggest the right service and technology to be used.
- Speed as a habit: can operate in a fast-moving environment, make quick decisions, and execute fiercely to deliver outcomes.
- Using Agile and DevOps methods, build platform architecture using a variety of sources (such as cloud IaaS/SaaS).
- Integrate data from a variety of business subject areas: leads management, Customer Portal, Seller App, SAP, Salesforce, etc.
- Implement rules and automate data cleansing, mapping, transformation, logging, and exception handling.
- Design, build, and deploy databases and data stores.
- Participate in cross-functional teams to promote technology strategies, analyze and test products, perform proofs-of-concept, and pilot new technologies and/or methods.
- Establish and document standards, guidelines, and best practices for teams utilizing the solutions.
- Review vendor solution designs to ensure technology appropriateness, standards compliance, and platform capacity alignment.

Required Skills:
- At least 5+ years of overall experience, with over 2+ years architecting and working on big data applications and technologies.
- Strong object-oriented programming concepts; should be proficient in server-side (Java/Linux) technologies.
- Hands-on, deep knowledge of Java and Spring Boot.
- Expertise in GCP (Google Cloud Platform) and the ability to operate in a DevOps model.
- Expertise in architecting or developing features for enterprise-scale systems is an added advantage.
- Good experience in data and ML use cases, including building data pipelines, document scanning and ingestion, camera-feed anomaly detection, etc.
- Experience with all relevant tools, platforms, and services, including BigQuery, Python, LangChain, Gemini, Hugging Face, etc.
- Understands concepts of data governance, cataloging, etc.
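The posting calls for automating data cleansing, mapping, logging, and exception handling. The sketch below illustrates that pattern in miniature; it uses Python for brevity (the role itself centres on Java/Spring Boot), and the field names and channel mapping are invented for illustration.

```python
import logging

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger("lead_cleansing")

# Hypothetical mapping rule: normalise source-system codes to canonical channel names.
CHANNEL_MAP = {"SF": "salesforce", "PORTAL": "customer_portal", "APP": "seller_app"}


def cleanse_lead(record: dict):
    """Apply simple cleansing/mapping rules; return None for records that fail validation."""
    try:
        lead_id = record["lead_id"]
        channel = CHANNEL_MAP.get(str(record.get("channel", "")).upper())
        if channel is None:
            raise ValueError(f"unknown channel {record.get('channel')!r}")
        return {"lead_id": lead_id, "channel": channel, "amount": float(record.get("amount", 0))}
    except (KeyError, ValueError, TypeError) as exc:
        # Log and skip bad records instead of failing the whole batch.
        logger.warning("rejected record %s: %s", record, exc)
        return None


raw = [{"lead_id": 1, "channel": "sf", "amount": "120.5"}, {"lead_id": 2, "channel": "fax"}]
cleaned = [r for r in map(cleanse_lead, raw) if r is not None]
logger.info("kept %d of %d records", len(cleaned), len(raw))
```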

Posted 2 weeks ago

Apply

7.0 - 12.0 years

8 - 13 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

Role: Python, SQL, BigQuery
- 5+ years of experience in Python is a must
- 7+ years of overall experience is required
- Notice Period: Immediate to 30 days

Job Summary
We are seeking a highly skilled Senior Data Engineer with extensive experience in Python, SQL, and Google BigQuery. The ideal candidate will play a critical role in designing, building, and maintaining scalable data solutions that drive business insights. This role requires strong problem-solving abilities, a keen eye for detail, and a passion for working with large datasets.

Key Responsibilities:
- Develop, optimize, and maintain data pipelines using Python, SQL, and BigQuery.
- Design and implement efficient data warehouse solutions to support analytics and business intelligence needs.
- Ensure data quality, integrity, and security across all projects.
- Collaborate with data scientists, analysts, and business teams to understand data requirements.
- Write complex SQL queries and procedures for performance optimization.
- Automate data workflows and processes to improve efficiency.
- Monitor and troubleshoot data pipelines to ensure seamless data processing.
- Stay updated with industry trends and best practices in data engineering and analytics.

Required Skills & Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 5+ years of experience working with Python, SQL, and BigQuery.
- Strong experience in data modeling, ETL development, and data warehousing concepts.
- Expertise in writing optimized SQL queries for large-scale datasets.
- Hands-on experience with cloud-based data platforms, preferably Google Cloud.
- Excellent problem-solving skills and ability to work independently.
- Strong communication and collaboration abilities.
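Monitoring pipelines and ensuring data quality, as listed above, often reduces to a handful of scheduled checks against the warehouse. A minimal sketch with the BigQuery Python client; the table and column names are placeholders, not details from the listing.

```python
from google.cloud import bigquery

client = bigquery.Client()

TABLE = "example-project.warehouse.orders"   # placeholder table

# Two illustrative checks: freshness (rows loaded today) and a null-key check.
checks = {
    "rows_loaded_today": f"""
        SELECT COUNT(*) AS n
        FROM `{TABLE}`
        WHERE DATE(ingested_at) = CURRENT_DATE()
    """,
    "null_order_ids": f"""
        SELECT COUNT(*) AS n
        FROM `{TABLE}`
        WHERE order_id IS NULL
    """,
}

for name, sql in checks.items():
    n = list(client.query(sql).result())[0].n
    ok = n > 0 if name == "rows_loaded_today" else n == 0
    print(f"{name}: {n} ({'OK' if ok else 'FAIL'})")
```

In practice these checks would run from a scheduler (Composer/Airflow, Cloud Scheduler) and alert on failure rather than print.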

Posted 2 weeks ago

Apply

6.0 - 8.0 years

27 - 42 Lacs

Hyderabad

Work from Office

Job Summary
- 6+ years of analytics experience.
- Demonstrated ability to work independently and collaborate within a large team of analysts, as well as cross-functionally with external engineering and product stakeholders.
- Experience in pulling data from datasets (i.e. SQL), applying transformations, and performing data analysis to solve business problems.
- Top-decile SQL skills are a must (experience in BigQuery is strongly preferred).
- Experience in using tools like Tableau/Power BI for creating intuitive dashboards.

Responsibilities
- Ability to thrive in ambiguity and adapt to a fast-paced environment, with strong organizational and coordination skills.
- A curious self-starter who does not fear failure while exploring new datasets, and can run tests to understand the existing data structures and infrastructure without much documentation or guidance to rely on.
- Ability to conduct root cause analysis and develop structured solutions while taking constraints into account.
- Experience articulating, translating, refining, and prioritizing product/business requirements into structured technical data requirements, and conducting feasibility studies.
- Experience composing SQL scripts and creating datamarts/data warehouses along with their data pipelines.
- Experience creating dashboards and reports that provide insight into business data.
- Aggregate, organize, and visualize data to communicate effectively using presentation skills.
- Strong verbal and written English communication skills, with the ability to interact cross-functionally with product analysts, data scientists, engineers, program managers, ops managers, and other team members.
- Strong problem-solving ability and quantitative support, along with the ability to think outside the box for creative solutions.
- Ability to debug and optimize existing code and drive improvements to help reduce maintenance efforts on data infrastructure.

Posted 2 weeks ago

Apply

5.0 - 7.0 years

14 - 17 Lacs

Pune

Work from Office

A career in IBM Consulting is rooted in long-term relationships and close collaboration with clients across the globe. You'll work with visionaries across multiple industries to improve the hybrid cloud and AI journey for the most innovative and valuable companies in the world. Your ability to accelerate impact and make meaningful change for your clients is enabled by our strategic partner ecosystem and our robust technology platforms across the IBM portfolio, including Software and Red Hat.

In your role, you will be responsible for:
- Skilled use of multiple GCP services - GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer, etc.
- Must have Python and SQL work experience; proactive, collaborative, and able to respond to critical situations.
- Ability to analyse data for functional business requirements and interface directly with the customer.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise
- 5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform.
- Skilled in multiple GCP services - GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer.
- An ambitious individual who can work under their own direction towards agreed targets/goals, with a creative approach to work.
- You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies.
- End-to-end functional knowledge of the data pipeline/transformation implementations the candidate has delivered; should understand the purpose/KPIs for which each data transformation was done.

Preferred technical and professional experience
- Experience with AEM core technologies: OSGi Services, Apache Sling, Granite Framework, Java Content Repository API, Java 8+, Localization.
- Familiarity with build tools such as Jenkins and Maven; knowledge of version control tools, especially Git.
- Knowledge of patterns and good practices for designing and developing quality, clean code.
- Knowledge of HTML, CSS, JavaScript, and jQuery.
- Familiarity with task management, bug tracking, and collaboration tools like JIRA and Confluence.
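Among the GCP services listed, Pub/Sub is the usual glue between pipeline stages (for example, notifying a Dataflow or Cloud Run consumer that new data has landed). A minimal publish sketch with the official Python client; the project, topic, and message fields are placeholders.

```python
import json

from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
# Placeholder project and topic names.
topic_path = publisher.topic_path("example-project", "order-events")

event = {"order_id": "A-1001", "status": "CREATED"}

# Publish the JSON payload with one string attribute; result() blocks until acked.
future = publisher.publish(topic_path, json.dumps(event).encode("utf-8"), source="demo")
print("published message id:", future.result())
```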

Posted 2 weeks ago

Apply

10.0 - 14.0 years

0 Lacs

Karnataka

On-site

As a Principal Engineer at Walmart Enterprise Business Services (EBS), you will play a crucial role in shaping the engineering direction and driving architectural decisions. You will be responsible for leading the design and development of full stack applications with high scalability and resilience. Your expertise in cloud-native development on Google Cloud Platform (GCP) will be instrumental in ensuring the delivery of secure and high-performing solutions across the platform.

In this role, you will be expected to architect complex cloud-native systems using a variety of GCP services, define best practices, and drive engineering excellence across teams. Your responsibilities will include building and optimizing APIs and frontend frameworks, guiding the adoption of serverless and container-based architectures, and championing CI/CD pipelines and Infrastructure as Code (IaC) practices. You will collaborate cross-functionally with product, design, and data teams to translate business requirements into scalable technical solutions. Additionally, you will act as a trusted technical advisor and mentor to staff and senior engineers, staying ahead of industry trends and evaluating new tools and frameworks to enhance productivity and performance.

To be successful in this role, you should have a minimum of 10 years of experience in full stack development, with at least 2 years in a technical leadership or principal engineering role. Deep proficiency in JavaScript/TypeScript, Python, or Go is required, along with expertise in modern frontend frameworks, cloud-native systems on GCP, microservices architecture, and DevOps practices. Strong communication, leadership, and collaboration skills are essential, along with a GCP Professional Certification and experience with serverless platforms and observability tools.

Joining Walmart Global Tech means working in an environment where your contributions can impact the lives of millions of people. As part of a team that values innovation and empowerment, you will have the opportunity to grow your skills and expertise while driving meaningful change in the retail industry.

At Walmart, we strive to create a culture of belonging where every associate is valued for who they are. Our commitment to diversity and inclusion allows us to engage associates, strengthen our business, and better serve our customers and communities around the world. As an Equal Opportunity Employer, Walmart is dedicated to understanding, respecting, and valuing the unique experiences and identities of all individuals.

Posted 2 weeks ago

Apply

12.0 - 16.0 years

0 Lacs

Karnataka

On-site

As an experienced Senior Data Engineer at Adobe, you will utilize Big Data and Google Cloud technologies to develop large-scale, on-cloud data processing pipelines and data warehouses. Your role will involve consulting with customers worldwide on their data engineering needs around Adobe's Customer Data Platform and supporting pre-sales discussions regarding complex and large-scale cloud data engineering solutions. You will design custom solutions on cloud by integrating Adobe's solutions in a scalable and performant manner. Additionally, you will deliver complex, large-scale, enterprise-grade on-cloud data engineering and integration solutions in a hands-on manner.

To be successful in this role, you should have a total of 12 to 15 years of experience, with 3 to 4 years of experience leading Data Engineer teams in developing enterprise-grade data processing pipelines on Google Cloud. You must have led at least one project of medium to high complexity involving the migration of ETL pipelines and data warehouses to the cloud. Your most recent 3 to 5 years of experience should be with premium consulting companies. Profound hands-on expertise with Google Cloud Platform services, especially BigQuery, Dataform, Dataplex, etc., is essential. Exceptional communication skills are crucial for effectively engaging with Data Engineers, Technology, and Business leadership. Furthermore, the ability to leverage knowledge of GCP in other cloud environments is highly desirable. It would be advantageous to have experience consulting with customers in India and to possess multi-cloud expertise, with knowledge of AWS and GCP.

At Adobe, creativity, curiosity, and continuous learning are valued qualities that contribute to your career growth journey. To pursue a new opportunity at Adobe, ensure you update your Resume/CV and Workday profile, including your unique Adobe experiences and volunteer work. Familiarize yourself with the Internal Mobility page on Inside Adobe to understand the process and set up job alerts for roles that interest you. Prepare for interviews by following the provided tips. Upon applying for a role via Workday, the Talent Team will contact you within 2 weeks. If you progress to the official interview process with the hiring team, inform your manager so they can support your career growth.

At Adobe, you will experience an exceptional work environment recognized globally. You will collaborate with colleagues dedicated to mutual growth through the Check-In approach, where ongoing feedback is encouraged. If you seek to make an impact, Adobe is the ideal place for you. Explore employee career experiences on the Adobe Life blog and discover the meaningful benefits offered. For individuals with disabilities or special needs requiring accommodation to navigate the Adobe.com website or complete the application process, contact accommodations@adobe.com or call (408) 536-3015.

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

As an Informatica IDMC Developer at Coforge, your primary responsibility will be to design, develop, and maintain resilient ETL pipelines using Informatica Intelligent Data Management Cloud (IDMC/IICS). You will work closely with data architects, analysts, and business stakeholders to comprehend data requirements and integrate data from various sources, including databases, APIs, and flat files. Your role will involve optimizing data workflows for enhanced performance, scalability, and reliability while monitoring and troubleshooting ETL jobs to address data quality issues.

In addition, you will be expected to implement data governance and security best practices, ensuring compliance and confidentiality. Maintaining detailed documentation of data flows, transformations, and architecture will be essential. Active participation in code reviews and contributing to continuous improvement initiatives are also part of your responsibilities.

To excel in this role, you must possess substantial hands-on experience with Informatica IDMC (IICS) and cloud-based ETL tools. Proficiency in SQL and prior experience with relational databases like Oracle, SQL Server, and PostgreSQL is necessary. Familiarity with cloud platforms such as AWS, Azure, or GCP, and with data warehousing concepts and tools like Snowflake, Redshift, or BigQuery, is highly desirable. Strong problem-solving skills and effective communication abilities are key attributes that will contribute to your success in this position.

Preferred qualifications for this role include experience with CI/CD pipelines and version control systems, knowledge of data modeling and metadata management, and certifications in Informatica or cloud platforms, which would be considered advantageous.

If you have 5 to 8 years of relevant experience and possess the required skills and qualifications, we encourage you to apply for this Informatica IDMC Developer position based in Greater Noida. Kindly send your CV to Gaurav.2.Kumar@coforge.com.

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

As a Cloud Engineering Team Leader at GlobalLogic, you will be responsible for providing technical guidance and career development support to a team of cloud engineers. You will define cloud architecture standards and best practices across the organization, collaborating with senior leadership to develop a cloud strategy aligned with business objectives. Your role will involve driving technical decision-making for complex cloud infrastructure projects and establishing and maintaining cloud governance frameworks and operational procedures.

With a background in technical leadership roles managing engineering teams, you will have a proven track record of successfully delivering large-scale cloud transformation projects. Experience in budget management and resource planning, and strong presentation and communication skills for executive-level reporting, are essential. Preferred certifications include Google Cloud Professional Cloud Architect, Google Cloud Professional Data Engineer, and additional relevant cloud or security certifications.

You will leverage your 10+ years of experience in designing and implementing enterprise-scale cloud solutions using GCP services to architect sophisticated cloud solutions using Python and advanced GCP services. Leading the design and deployment of solutions utilizing Cloud Functions, Docker containers, Dataflow, and other GCP services will be part of your responsibilities. Ensuring optimal performance and scalability of complex integrations with multiple data sources and systems, implementing security best practices and compliance frameworks, and troubleshooting and resolving technical issues will be key aspects of your role.

Your technical skills will include expert-level proficiency in Python with experience in additional languages; deep expertise with GCP services such as Dataflow, Compute Engine, BigQuery, Cloud Functions, and others; advanced knowledge of Docker, Kubernetes, and container orchestration patterns; extensive experience in cloud security; proficiency in Infrastructure as Code tools like Terraform and Cloud Deployment Manager; and CI/CD experience with advanced deployment pipelines and GitOps practices.

As part of the GlobalLogic team, you will benefit from a culture of caring, continuous learning and development opportunities, interesting and meaningful work, balance and flexibility in work arrangements, and being part of a high-trust organization. You will have the chance to work on impactful projects, engage with collaborative teammates and supportive leaders, and contribute to shaping cutting-edge solutions in the digital engineering domain.

Posted 2 weeks ago

Apply

6.0 - 10.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

We are looking for experienced professionals with strong expertise in Google Cloud Platform (GCP) database services. The role involves designing, implementing, and troubleshooting scalable database solutions on GCP.

Responsibilities:
- Proven experience as a Subject Matter Expert in Google Cloud native databases and managed SQL solutions, or a similar role.
- In-depth knowledge of Google Cloud Platform (GCP) and its database tools, including Cloud SQL, BigQuery, and Spanner.
- Strong analytical and problem-solving skills.
- Excellent communication and presentation skills.
- Proficiency in relevant programming languages such as SQL, Python, or Go.
- Familiarity with cloud-native architectures and database best practices.
- Provide technical expertise on GCP database tools.
- Design and support cloud-native database architectures.
- Resolve complex database issues.
- Collaborate with cross-functional teams.

Good to Have:
- Google Cloud certifications.
- Experience in DB migration.
- Knowledge of data security/compliance.

Posted 2 weeks ago

Apply