2.0 - 6.0 years
7 - 11 Lacs
Bengaluru
Work from Office
Job Title: Senior Data Engineer

About The Role
As a Senior Data Engineer, you will play a key role in designing and implementing data solutions @Kotak811. You will be responsible for leading data engineering projects, mentoring junior team members, and collaborating with cross-functional teams to deliver high-quality, scalable data infrastructure. Your expertise in data architecture, performance optimization, and data integration will be instrumental in driving the success of our data initiatives.

Responsibilities
1. Data Architecture and Design
   a. Design and develop scalable, high-performance data architecture and data models.
   b. Collaborate with data scientists, architects, and business stakeholders to understand data requirements and design optimal data solutions.
   c. Evaluate and select appropriate technologies, tools, and frameworks for data engineering projects.
   d. Define and enforce data engineering best practices, standards, and guidelines.
2. Data Pipeline Development & Maintenance
   a. Develop and maintain robust, scalable data pipelines for data ingestion, transformation, and loading for real-time and batch use cases.
   b. Implement ETL processes to integrate data from various sources into data storage systems.
   c. Optimise data pipelines for performance, scalability, and reliability.
      i. Identify and resolve performance bottlenecks in data pipelines and analytical systems.
      ii. Monitor and analyse system performance metrics, identifying areas for improvement and implementing solutions.
      iii. Optimise database performance, including query tuning, indexing, and partitioning strategies.
   d. Implement real-time and batch data processing solutions.
3. Data Quality and Governance
   a. Implement data quality frameworks and processes to ensure high data integrity and consistency.
   b. Design and enforce data management policies and standards.
   c. Develop and maintain documentation, data dictionaries, and metadata repositories.
   d. Conduct data profiling and analysis to identify data quality issues and implement remediation strategies.
4. ML Model Deployment & Management (a plus)
   a. Design, develop, and maintain the infrastructure and processes necessary for deploying and managing machine learning models in production environments.
   b. Implement model deployment strategies, including containerization and orchestration using tools like Docker and Kubernetes.
   c. Optimise model performance and latency for real-time inference in consumer applications.
   d. Collaborate with DevOps teams to implement continuous integration and continuous deployment (CI/CD) processes for model deployment.
   e. Monitor and troubleshoot deployed models, proactively identifying and resolving performance or data-related issues.
   f. Implement monitoring and logging solutions to track model performance, data drift, and system health.
5. Team Leadership and Mentorship
   a. Lead data engineering projects, providing technical guidance and expertise to team members.
      i. Conduct code reviews and ensure adherence to coding standards and best practices.
   b. Mentor and coach junior data engineers, fostering their professional growth and development.
   c. Collaborate with cross-functional teams, including data scientists, software engineers, and business analysts, to drive successful project outcomes.
   d. Stay abreast of emerging technologies, trends, and best practices in data engineering and share knowledge within the team.
      i. Participate in the evaluation and selection of data engineering tools and technologies.

Qualifications
1. 3-5 years' experience with a Bachelor's Degree in Computer Science, Engineering, Technology, or a related field required.
2. Good understanding of streaming technologies like Kafka and Spark Streaming.
3. Experience with enterprise business intelligence platform/data platform sizing, tuning, optimization, and system landscape integration in large-scale enterprise deployments.
4. Proficiency in at least one programming language, preferably Java, Scala, or Python.
5. Good knowledge of Agile and SDLC/CI-CD practices and tools.
6. Proven experience with Hadoop, MapReduce, Hive, Spark, and Scala programming; in-depth knowledge of performance tuning and optimizing data processing jobs and debugging time-consuming jobs (a sketch of this kind of work follows below).
7. Proven experience in developing conceptual, logical, and physical data models for Hadoop, relational, EDW (enterprise data warehouse), and OLAP database solutions.
8. Good understanding of distributed systems.
9. Experience working extensively in multi-petabyte DW environments.
10. Experience engineering large-scale systems in a product environment.
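By way of illustration only (not part of the posting), a minimal PySpark sketch of the kind of batch pipeline and partitioning work described above; all paths, table layouts, and column names are hypothetical.

```python
from pyspark.sql import SparkSession, functions as F

# Hypothetical batch job: ingest raw events, transform, and write a
# partitioned table, illustrating the tuning levers named in the posting
# (shuffle control and partitioning for pruned downstream reads).
spark = (
    SparkSession.builder
    .appName("events_daily_load")
    .config("spark.sql.shuffle.partitions", "200")  # tune for cluster size
    .getOrCreate()
)

raw = spark.read.parquet("s3://example-lake/raw/events/")  # hypothetical path

daily = (
    raw.filter(F.col("event_ts").isNotNull())
       .withColumn("event_date", F.to_date("event_ts"))
       .groupBy("event_date", "account_id")
       .agg(F.count("*").alias("event_count"))
)

# Partitioning by date keeps downstream queries scanning only the days they need.
(daily.write
      .mode("overwrite")
      .partitionBy("event_date")
      .parquet("s3://example-lake/curated/daily_events/"))
```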
Posted 2 weeks ago
5.0 - 8.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
VAM Systems is a Business Consulting, IT Solutions and Services company. VAM Systems is currently looking for a Data Engineering Analyst for our Bahrain operations with the following skillsets and terms and conditions:

Qualifications:
· Bachelor's Degree
· Engineer (B.E.)/MCA
· Certification in SQL/SAS

Experience: 5-8 years

Key Objectives:
· Support the finance team on data and analytics activities and the Data Warehouse (DWH), based on a profound knowledge of banking, financial reporting, and data engineering.

Analytical/Technical Skills:
· Understanding of finance and risk reporting systems/workflows; previous experience participating in system implementation is desirable.
· Hands-on experience with MS Excel.
· Prior project management/stakeholder management experience is desired.

Responsibilities:
· Coordinate and interact with the finance business partner to support daily finance data analysis, hierarchical mappings, and understanding (root cause analysis) of identified data issues.
· Exceptional comprehension of finance, risk, and data warehousing to guarantee accurate and reconciled reporting (e.g., balance-sheet exposure, profit and loss).
· Master the intersection of finance, data analysis, and data engineering.
· Conduct reviews of data quality and reconciliations for finance reports and maintain reporting logic/programs (see the sketch below).
· Support the finance team in ad-hoc requests and in organizing data for financial/regulatory reports, data mapping, and performing UAT.
· Ensure the consistency of the bank's data architecture, data flows, and business logic in accordance with Data Management guidelines, development standards, and data architecture, working closely with the Finance and Data Engineering teams to identify issues and develop sustainable data-driven solutions.
· Expertise in writing and documenting complex SQL queries, procedures, and functions, creating algorithms that automate important financial interactions and data controls.
· Experience in handling SAS ETL jobs, data transformation, validation, analysis, and performance tuning.
· Strong SAS skillset, with experience in SAS Management Console, SAS DI, SAS Enterprise Guide, Base SAS, SAS Web Report Studio, SAS Delivery Portal, SAS OLAP Cube Studio, SAS Information Maps, SAS BI, SAS Stored Process, and SAS Datasets & Libraries.

Terms and conditions:
Joining time frame: 15-30 days. The selected candidates shall join VAM Systems - Bahrain and shall be deputed to one of the leading banks in Bahrain. Should you be interested in this opportunity, please send your latest resume at the earliest to ashiq.salahudeen@vamsystems.com
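As a rough illustration of the reconciliation work described above, a minimal pandas sketch; the file names, columns, and tolerance are hypothetical, not taken from the posting.

```python
import pandas as pd

# Hypothetical reconciliation: compare general-ledger balances against the
# warehouse extract and surface accounts whose totals disagree.
gl = pd.read_csv("gl_balances.csv")      # columns: account_id, gl_amount
dwh = pd.read_csv("dwh_balances.csv")    # columns: account_id, dwh_amount

merged = gl.merge(dwh, on="account_id", how="outer", indicator=True)

# Accounts present on only one side of the reconciliation.
missing = merged[merged["_merge"] != "both"]

# Matched accounts whose balances differ by more than a token tolerance.
merged["diff"] = (merged["gl_amount"] - merged["dwh_amount"]).abs()
breaks = merged[(merged["_merge"] == "both") & (merged["diff"] > 0.01)]

print(f"{len(missing)} unmatched accounts, {len(breaks)} balance breaks")
```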
Posted 2 weeks ago
12.0 years
25 - 35 Lacs
Madurai
On-site
Dear Candidate,

Greetings of the day!!

I am Kantha, and I'm reaching out to you regarding an exciting opportunity with TechMango. You can connect with me on LinkedIn (https://www.linkedin.com/in/kantha-m-ashwin-186ba3244/) or email: kanthasanmugam.m@techmango.net

Techmango Technology Services is a full-scale software development services company founded in 2014 with a strong focus on emerging technologies. Its primary objective is to deliver strategic technology solutions aligned with its business partners' goals. We are a leading full-scale Software and Mobile App Development Company. Techmango is driven by the mantra "Clients' Vision is our Mission", and we stay true to this statement. Our aim is to be the technologically advanced and most loved organization, providing prime-quality, cost-efficient services with a long-term client relationship strategy. We are operational in the USA (Chicago, Atlanta), Dubai (UAE), and India (Bangalore, Chennai, Madurai, Trichy). Techmango: https://www.techmango.net/

Job Title: GCP Data Architect
Location: Madurai
Experience: 12+ Years
Notice Period: Immediate

About TechMango
TechMango is a rapidly growing IT Services and SaaS Product company that helps global businesses with digital transformation, modern data platforms, product engineering, and cloud-first initiatives. We are seeking a GCP Data Architect to lead data modernization efforts for our prestigious client, Livingston, in a highly strategic project.

Role Summary
As a GCP Data Architect, you will be responsible for designing and implementing scalable, high-performance data solutions on Google Cloud Platform. You will work closely with stakeholders to define data architecture, implement data pipelines, modernize legacy data systems, and guide data strategy aligned with enterprise goals.

Key Responsibilities:
Lead end-to-end design and implementation of scalable data architecture on Google Cloud Platform (GCP)
Define data strategy, standards, and best practices for cloud data engineering and analytics
Develop data ingestion pipelines using Dataflow, Pub/Sub, Apache Beam, Cloud Composer (Airflow), and BigQuery (see the sketch below)
Migrate on-prem or legacy systems to GCP (e.g., from Hadoop, Teradata, or Oracle to BigQuery)
Architect data lakes, warehouses, and real-time data platforms
Ensure data governance, security, lineage, and compliance (using tools like Data Catalog, IAM, DLP)
Guide a team of data engineers and collaborate with business stakeholders, data scientists, and product managers
Create documentation, high-level design (HLD) and low-level design (LLD), and oversee development standards
Provide technical leadership in architectural decisions and future-proofing the data ecosystem

Required Skills & Qualifications:
10+ years of experience in data architecture, data engineering, or enterprise data platforms.
Minimum 3-5 years of hands-on experience in GCP Data Services.
Proficient in: BigQuery, Cloud Storage, Dataflow, Pub/Sub, Composer, Cloud SQL/Spanner; Python/Java/SQL; data modeling (OLTP, OLAP, Star/Snowflake schema).
Experience with real-time data processing, streaming architectures, and batch ETL pipelines.
Good understanding of IAM, networking, security models, and cost optimization on GCP.
Prior experience in leading cloud data transformation projects.
Excellent communication and stakeholder management skills.

Preferred Qualifications:
GCP Professional Data Engineer / Architect Certification.
Experience with Terraform, CI/CD, GitOps, and Looker / Data Studio / Tableau for analytics.
Exposure to AI/ML use cases and MLOps on GCP.
Experience working in agile environments and client-facing roles.

What We Offer:
Opportunity to work on large-scale data modernization projects with global clients.
A fast-growing company with a strong tech and people culture.
Competitive salary, benefits, and flexibility.
Collaborative environment that values innovation and leadership.

Job Type: Full-time
Pay: ₹2,500,000.00 - ₹3,500,000.00 per year

Application Question(s):
Current CTC?
Expected CTC?
Notice Period? (If you are serving your notice period, please mention your last working day.)

Experience:
GCP Data Architecture: 3 years (Required)
BigQuery: 3 years (Required)
Cloud Composer (Airflow): 3 years (Required)

Location: Madurai, Tamil Nadu (Required)
Work Location: In person
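For illustration, a minimal Apache Beam (Python) sketch of the Pub/Sub-to-BigQuery ingestion pattern this role names; the project, subscription, table, and schema are hypothetical.

```python
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

# Hypothetical streaming ingestion: read JSON events from a Pub/Sub
# subscription, parse them, and append to a BigQuery table. Requires a
# streaming-capable runner (DirectRunner for testing, Dataflow in production).
options = PipelineOptions(streaming=True)

with beam.Pipeline(options=options) as p:
    (
        p
        | "Read" >> beam.io.ReadFromPubSub(
            subscription="projects/example/subscriptions/events-sub")
        | "Parse" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
        | "Write" >> beam.io.WriteToBigQuery(
            "example-project:analytics.events",
            schema="event_id:STRING,event_ts:TIMESTAMP,payload:STRING",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
        )
    )
```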
Posted 2 weeks ago
0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
Position: ETL Developer (Any), 5-8 yrs experience
Location: Mumbai
Mandatory Skills: ETL, ETL Developer, Data Engineer, SQL, data warehousing, DW, OLAP, Python, SQL Server, big data, Airflow, Spark, Hadoop, GCP, BigQuery
Posted 2 weeks ago
7.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Experience: 7+ Years
Location: Noida-Sector 64
Contract to hire

Key Responsibilities:

Data Architecture Design: Design, develop, and maintain the enterprise data architecture, including data models, database schemas, and data flow diagrams. Develop a data strategy and roadmap that aligns with the business objectives and ensures the scalability of data systems. Architect both transactional (OLTP) and analytical (OLAP) databases, ensuring optimal performance and data consistency.

Data Integration & Management: Oversee the integration of disparate data sources into a unified data platform, leveraging ETL/ELT processes and data integration tools. Design and implement data warehousing solutions, data lakes, and/or data marts that enable efficient storage and retrieval of large datasets. Ensure proper data governance, including the definition of data ownership, security, and privacy controls in accordance with compliance standards (GDPR, HIPAA, etc.).

Collaboration with Stakeholders: Work closely with business stakeholders, including analysts, developers, and executives, to understand data requirements and ensure that the architecture supports analytics and reporting needs. Collaborate with DevOps and engineering teams to optimize database performance and support large-scale data processing pipelines.

Technology Leadership: Guide the selection of data technologies, including databases (SQL/NoSQL), data processing frameworks (Hadoop, Spark), cloud platforms (Azure is a must), and analytics tools. Stay updated on emerging data management technologies, trends, and best practices, and assess their potential application within the organization.

Data Quality & Security: Define data quality standards and implement processes to ensure the accuracy, completeness, and consistency of data across all systems. Establish protocols for data security, encryption, and backup/recovery to protect data assets and ensure business continuity.

Mentorship & Leadership: Lead and mentor data engineers, data modelers, and other technical staff in best practices for data architecture and management. Provide strategic guidance on data-related projects and initiatives, ensuring that all efforts are aligned with the enterprise data strategy.

Required Skills & Experience:

Extensive Data Architecture Expertise: Over 7 years of experience in data architecture, data modeling, and database management. Proficiency in designing and implementing relational (SQL) and non-relational (NoSQL) database solutions. Strong experience with data integration tools (Azure tools are a must, plus other third-party tools), ETL/ELT processes, and data pipelines.

Advanced Knowledge of Data Platforms: Expertise in the Azure cloud data platform (Data Lake, Synapse) is a must; experience with other platforms such as AWS (Redshift, S3) and/or Google Cloud Platform (BigQuery, Dataproc) is a bonus. Experience with big data technologies (Hadoop, Spark) and distributed systems for large-scale data processing. Hands-on experience with data warehousing solutions and BI tools (e.g., Power BI, Tableau, Looker).

Data Governance & Compliance: Strong understanding of data governance principles, data lineage, and data stewardship. Knowledge of industry standards and compliance requirements (e.g., GDPR, HIPAA, SOX) and the ability to architect solutions that meet these standards.

Technical Leadership: Proven ability to lead data-driven projects, manage stakeholders, and drive data strategies across the enterprise.
Strong programming skills in languages such as Python, SQL, R, or Scala.

Certification: Azure Certified Solution Architect, Data Engineer, or Data Scientist certifications are mandatory.

Pre-Sales Responsibilities:

Stakeholder Engagement: Work with product stakeholders to analyze functional and non-functional requirements, ensuring alignment with business objectives. Solution Development: Develop end-to-end solutions involving multiple products, ensuring security and performance benchmarks are established, achieved, and maintained. Proof of Concepts (POCs): Develop POCs to demonstrate the feasibility and benefits of proposed solutions. Client Communication: Communicate system requirements and solution architecture to clients and stakeholders, providing technical assistance and guidance throughout the pre-sales process. Technical Presentations: Prepare and deliver technical presentations to prospective clients, demonstrating how proposed solutions meet their needs and requirements.

Additional Responsibilities:

Stakeholder Collaboration: Engage with stakeholders to understand their requirements and translate them into effective technical solutions. Technology Leadership: Provide technical leadership and guidance to development teams, ensuring the use of best practices and innovative solutions. Integration Management: Oversee the integration of solutions with existing systems and third-party applications, ensuring seamless interoperability and data flow. Performance Optimization: Ensure solutions are optimized for performance, scalability, and security, addressing any technical challenges that arise. Quality Assurance: Establish and enforce quality assurance standards, conducting regular reviews and testing to ensure robustness and reliability. Documentation: Maintain comprehensive documentation of the architecture, design decisions, and technical specifications. Mentoring: Mentor fellow developers and team leads, fostering a collaborative and growth-oriented environment.

Qualifications:

Education: Bachelor's or Master's degree in Computer Science, Information Technology, or a related field. Experience: Minimum of 7 years of experience in data architecture, with a focus on developing scalable and high-performance solutions. Technical Expertise: Proficient in architectural frameworks, cloud computing, database management, and web technologies. Analytical Thinking: Strong problem-solving skills, with the ability to analyze complex requirements and design scalable solutions. Leadership Skills: Demonstrated ability to lead and mentor technical teams, with excellent project management skills. Communication: Excellent verbal and written communication skills, with the ability to convey technical concepts to both technical and non-technical stakeholders.
Posted 2 weeks ago
5.0 - 8.0 years
7 - 10 Lacs
Chennai
Work from Office
Hands-on experience in data modelling for both OLTP and OLAP systems. In-depth knowledge of conceptual, logical, and physical data modelling. Strong understanding of indexing, partitioning, and data sharding, with practical experience. Experience in identifying and addressing factors affecting database performance for near-real-time reporting and application interaction. Proficiency with at least one data modelling tool (preferably DbSchema). Functional knowledge of the mutual fund industry is a plus. Familiarity with GCP databases like AlloyDB, Cloud SQL, and BigQuery (a partitioning sketch follows below). Willingness to work from the Chennai customer site is mandatory, with five days of on-site work each week. Mandatory Skills: Cloud-PaaS-GCP-Google Cloud Platform. Experience: 5-8 Years.
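As a small illustration of the partitioning work this role calls for, a sketch using the google-cloud-bigquery client library; the project, dataset, table, and schema are hypothetical.

```python
from google.cloud import bigquery

# Hypothetical: create a date-partitioned, clustered table so near-real-time
# reporting queries scan only the partitions they actually need.
client = bigquery.Client(project="example-project")

table = bigquery.Table(
    "example-project.funds.nav_snapshots",
    schema=[
        bigquery.SchemaField("fund_id", "STRING"),
        bigquery.SchemaField("nav", "NUMERIC"),
        bigquery.SchemaField("snapshot_date", "DATE"),
    ],
)
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY,
    field="snapshot_date",
)
table.clustering_fields = ["fund_id"]  # co-locate rows for the common filter

client.create_table(table)  # API call; requires credentials and permissions
```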
Posted 2 weeks ago
9.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
About Us: Mobileum is a leading provider of telecom analytics solutions for roaming, core network, security, risk management, domestic and international connectivity testing, and customer intelligence. More than 1,000 customers rely on its Active Intelligence platform, which provides advanced analytics solutions, allowing customers to connect deep network and operational intelligence with real-time actions that increase revenue, improve customer experience, and reduce costs. Know our story: https://www.mobileum.com/

Headquartered in Silicon Valley, Mobileum has global offices in Australia, Dubai, Germany, Greece, India, Portugal, Singapore, and the UK, with a global headcount of over 1,800.

Join the Mobileum Team. At Mobileum we recognize that our team is the main reason for our success. What does working with us mean? Opportunities!

Role: Senior Full Stack Developer

About the Job: We are seeking an experienced and highly skilled Senior Full Stack Developer with 6-9 years of professional experience to join our Engineering team. This role is best suited to someone who is backend-heavy, with strong Java and database design expertise (ClickHouse, PostgreSQL), while also being comfortable in modern frontend development (React, JavaScript, jQuery). You will design and implement robust, scalable systems, lead database optimization efforts, and develop intuitive user interfaces. Additionally, you will leverage Generative AI (GenAI) tools to accelerate development, improve quality, and mentor others in effective AI-assisted engineering workflows.

Roles & Responsibilities:
1. System Design & Architecture: Design and implement scalable, maintainable, and secure full-stack architectures. Decompose complex business requirements into clear technical designs with well-defined components. Drive architecture reviews to ensure consistency, performance, and extensibility. Apply best practices for API design, database modeling, performance tuning, and security.
2. Backend Development (Heavy Focus): Develop high-quality Java applications with clean architecture and design patterns. Design and optimize relational and analytical data models for PostgreSQL and ClickHouse. Build RESTful and event-driven APIs, microservices, and data ingestion pipelines. Solve advanced problems related to concurrency, thread safety, caching, and query performance. Integrate with third-party services and internal systems using robust, secure patterns.
3. Frontend Development: Build modern, responsive user interfaces using React, JavaScript, and jQuery. Collaborate with UX/UI designers to translate wireframes into clean, efficient code. Ensure cross-browser compatibility, accessibility, and performance. Integrate frontend components with backend APIs securely and efficiently.
4. GenAI-Augmented Development: Leverage GitHub Copilot, ChatGPT, or equivalent GenAI tools to generate code scaffolding and boilerplate, assist in writing unit, integration, and E2E test cases, refactor and modernize legacy codebases, and produce developer documentation and interface specs. Develop advanced prompt-engineering skills to extract maximum value from GenAI tools. Promote best practices for responsible GenAI usage, sharing knowledge with teammates.
5. Collaboration & Continuous Improvement: Work cross-functionally with QA, DevOps, Product, and other engineering teams to meet delivery goals. Participate actively in Agile ceremonies: sprint planning, reviews, retrospectives. Contribute to continuous improvement by suggesting process, tooling, or architectural enhancements.
Mentor junior developers, conduct code reviews, and foster a culture of high-quality engineering.

Desired Profile: Experience using GenAI in real-world development or modernization projects. Familiarity with event-driven architectures, messaging systems (Kafka, RabbitMQ), and OLAP or NoSQL databases. Understanding of cloud deployment and orchestration (Kubernetes, Helm). Exposure to data warehousing, ETL pipelines, or big-data analytics platforms. Contributions to open-source projects or internal developer tooling initiatives.

Why Join Us? Be part of a highly skilled, AI-enabled engineering team delivering cutting-edge applications. Lead innovation in how software is designed, developed, and maintained using GenAI. Tackle complex backend challenges while delivering elegant user-facing features. Enjoy a flexible and empowering culture that rewards curiosity, learning, and technical excellence. Competitive salary, benefits, and clear career progression in modern, AI-led software engineering.

Technical Skills:
Backend: Strong proficiency in Java, with experience in building APIs, microservices, and batch/data processing.
Databases: Hands-on expertise with PostgreSQL (RDBMS) and ClickHouse (OLAP), including schema design, query optimization, and performance tuning.
Frontend: Solid experience with React, JavaScript (ES6+), and jQuery.
System Design: Proficiency in designing distributed systems, API strategies, caching, and data modeling.
GenAI Skills: Demonstrated use of GenAI tools (e.g., GitHub Copilot, ChatGPT) for code generation, documentation, testing, or refactoring.
Collaboration: Strong written and verbal communication skills, with experience working in Agile teams.
DevOps Familiarity: Experience with version control (Git), CI/CD pipelines, Docker, and deployment automation.

Experience: 4-9 years in software development with proven full-stack delivery.
Posted 2 weeks ago
12.0 years
0 Lacs
Madurai, Tamil Nadu, India
On-site
Job Title: GCP Data Architect
Location: Madurai
Experience: 12+ Years
Notice Period: Immediate

About TechMango
TechMango is a rapidly growing IT Services and SaaS Product company that helps global businesses with digital transformation, modern data platforms, product engineering, and cloud-first initiatives. We are seeking a GCP Data Architect to lead data modernization efforts for our prestigious client, Livingston, in a highly strategic project.

Role Summary
As a GCP Data Architect, you will be responsible for designing and implementing scalable, high-performance data solutions on Google Cloud Platform. You will work closely with stakeholders to define data architecture, implement data pipelines, modernize legacy data systems, and guide data strategy aligned with enterprise goals.

Key Responsibilities:
Lead end-to-end design and implementation of scalable data architecture on Google Cloud Platform (GCP)
Define data strategy, standards, and best practices for cloud data engineering and analytics
Develop data ingestion pipelines using Dataflow, Pub/Sub, Apache Beam, Cloud Composer (Airflow), and BigQuery
Migrate on-prem or legacy systems to GCP (e.g., from Hadoop, Teradata, or Oracle to BigQuery)
Architect data lakes, warehouses, and real-time data platforms
Ensure data governance, security, lineage, and compliance (using tools like Data Catalog, IAM, DLP)
Guide a team of data engineers and collaborate with business stakeholders, data scientists, and product managers
Create documentation, high-level design (HLD) and low-level design (LLD), and oversee development standards
Provide technical leadership in architectural decisions and future-proofing the data ecosystem

Required Skills & Qualifications:
10+ years of experience in data architecture, data engineering, or enterprise data platforms
Minimum 3-5 years of hands-on experience in GCP Data Services
Proficient in: BigQuery, Cloud Storage, Dataflow, Pub/Sub, Composer, Cloud SQL/Spanner; Python/Java/SQL; data modeling (OLTP, OLAP, Star/Snowflake schema)
Experience with real-time data processing, streaming architectures, and batch ETL pipelines
Good understanding of IAM, networking, security models, and cost optimization on GCP
Prior experience in leading cloud data transformation projects
Excellent communication and stakeholder management skills

Preferred Qualifications:
GCP Professional Data Engineer / Architect Certification
Experience with Terraform, CI/CD, GitOps, and Looker / Data Studio / Tableau for analytics
Exposure to AI/ML use cases and MLOps on GCP
Experience working in agile environments and client-facing roles

What We Offer:
Opportunity to work on large-scale data modernization projects with global clients
A fast-growing company with a strong tech and people culture
Competitive salary, benefits, and flexibility
Collaborative environment that values innovation and leadership
Posted 2 weeks ago
0.0 - 3.0 years
25 - 35 Lacs
Madurai, Tamil Nadu
On-site
Dear Candidate,

Greetings of the day!!

I am Kantha, and I'm reaching out to you regarding an exciting opportunity with TechMango. You can connect with me on LinkedIn (https://www.linkedin.com/in/kantha-m-ashwin-186ba3244/) or email: kanthasanmugam.m@techmango.net

Techmango Technology Services is a full-scale software development services company founded in 2014 with a strong focus on emerging technologies. Its primary objective is to deliver strategic technology solutions aligned with its business partners' goals. We are a leading full-scale Software and Mobile App Development Company. Techmango is driven by the mantra "Clients' Vision is our Mission", and we stay true to this statement. Our aim is to be the technologically advanced and most loved organization, providing prime-quality, cost-efficient services with a long-term client relationship strategy. We are operational in the USA (Chicago, Atlanta), Dubai (UAE), and India (Bangalore, Chennai, Madurai, Trichy). Techmango: https://www.techmango.net/

Job Title: GCP Data Architect
Location: Madurai
Experience: 12+ Years
Notice Period: Immediate

About TechMango
TechMango is a rapidly growing IT Services and SaaS Product company that helps global businesses with digital transformation, modern data platforms, product engineering, and cloud-first initiatives. We are seeking a GCP Data Architect to lead data modernization efforts for our prestigious client, Livingston, in a highly strategic project.

Role Summary
As a GCP Data Architect, you will be responsible for designing and implementing scalable, high-performance data solutions on Google Cloud Platform. You will work closely with stakeholders to define data architecture, implement data pipelines, modernize legacy data systems, and guide data strategy aligned with enterprise goals.

Key Responsibilities:
Lead end-to-end design and implementation of scalable data architecture on Google Cloud Platform (GCP)
Define data strategy, standards, and best practices for cloud data engineering and analytics
Develop data ingestion pipelines using Dataflow, Pub/Sub, Apache Beam, Cloud Composer (Airflow), and BigQuery
Migrate on-prem or legacy systems to GCP (e.g., from Hadoop, Teradata, or Oracle to BigQuery)
Architect data lakes, warehouses, and real-time data platforms
Ensure data governance, security, lineage, and compliance (using tools like Data Catalog, IAM, DLP)
Guide a team of data engineers and collaborate with business stakeholders, data scientists, and product managers
Create documentation, high-level design (HLD) and low-level design (LLD), and oversee development standards
Provide technical leadership in architectural decisions and future-proofing the data ecosystem

Required Skills & Qualifications:
10+ years of experience in data architecture, data engineering, or enterprise data platforms.
Minimum 3-5 years of hands-on experience in GCP Data Services.
Proficient in: BigQuery, Cloud Storage, Dataflow, Pub/Sub, Composer, Cloud SQL/Spanner; Python/Java/SQL; data modeling (OLTP, OLAP, Star/Snowflake schema).
Experience with real-time data processing, streaming architectures, and batch ETL pipelines.
Good understanding of IAM, networking, security models, and cost optimization on GCP.
Prior experience in leading cloud data transformation projects.
Excellent communication and stakeholder management skills.

Preferred Qualifications:
GCP Professional Data Engineer / Architect Certification.
Experience with Terraform, CI/CD, GitOps, and Looker / Data Studio / Tableau for analytics.
Exposure to AI/ML use cases and MLOps on GCP.
Experience working in agile environments and client-facing roles.

What We Offer:
Opportunity to work on large-scale data modernization projects with global clients.
A fast-growing company with a strong tech and people culture.
Competitive salary, benefits, and flexibility.
Collaborative environment that values innovation and leadership.

Job Type: Full-time
Pay: ₹2,500,000.00 - ₹3,500,000.00 per year

Application Question(s):
Current CTC?
Expected CTC?
Notice Period? (If you are serving your notice period, please mention your last working day.)

Experience:
GCP Data Architecture: 3 years (Required)
BigQuery: 3 years (Required)
Cloud Composer (Airflow): 3 years (Required)

Location: Madurai, Tamil Nadu (Required)
Work Location: In person
Posted 2 weeks ago
4.0 - 6.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Information
Date Opened: 06/30/2025
Job Type: Full time
Industry: Software Product
City: Chennai
State/Province: Tamil Nadu
Country: India
Zip/Postal Code: 600017

AI & Data Warehouse (DWH)
Pando is a global leader in supply chain technology, building the world's quickest time-to-value Fulfillment Cloud platform. Pando's Fulfillment Cloud provides manufacturers, retailers, and 3PLs with a single pane of glass to streamline end-to-end purchase order fulfillment and customer order fulfillment to improve service levels, reduce carbon footprint, and bring down costs. As a partner of choice for Fortune 500 enterprises globally, with a presence across APAC, the Middle East, and the US, Pando is recognized as a Technology Pioneer by the World Economic Forum (WEF), and as one of the fastest growing technology companies by Deloitte.

Role
As a Senior AI and Data Warehouse Engineer at Pando, you will be responsible for building and scaling the data and AI services team. You will drive the design and implementation of highly scalable, modular, and reusable data pipelines, leveraging big data technologies and low-code implementations. This is a senior leadership position where you will work closely with cross-functional teams to deliver solutions that power advanced analytics, dashboards, and AI-based insights.

Key Responsibilities
Lead the development of scalable, high-performance data pipelines using PySpark or Big Data ETL pipeline technologies. Drive data modeling efforts for analytics, dashboards, and knowledge graphs. Oversee the implementation of parquet-based data lakes (see the sketch below). Work on OLAP databases, ensuring optimal data structure for reporting and querying. Architect and optimize large-scale enterprise big data implementations with a focus on modular and reusable low-code libraries. Collaborate with stakeholders to design and deliver AI and DWH solutions that align with business needs. Mentor and lead a team of engineers, building out the data and AI services organization.

Requirements
4 to 6 years of experience in big data and AI technologies, with expertise in PySpark or similar Big Data ETL pipeline technologies. Strong proficiency in SQL and OLAP database technologies. Firsthand experience with data modeling for analytics, dashboards, and knowledge graphs. Proven experience with parquet-based data lake implementations. Expertise in building highly scalable, high-volume data pipelines. Experience with modular, reusable, low-code-based implementations. Involvement in large-scale enterprise big data implementations. Initiative-taker with strong motivation and the ability to lead a growing team.

Preferred
Experience leading a team or building out a new department. Experience with cloud-based data platforms and AI services. Familiarity with supply chain technology or fulfilment platforms is a plus.
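To illustrate the parquet-based data lake work mentioned above, a short sketch using pyarrow (one plausible tool choice, not prescribed by the posting); the lake path, partition scheme, and columns are hypothetical.

```python
import pyarrow.dataset as ds
import pyarrow.compute as pc

# Hypothetical lake layout, partitioned Hive-style:
#   /lake/shipments/region=EMEA/part-0.parquet
lake = ds.dataset("/lake/shipments", format="parquet", partitioning="hive")

# Partition pruning: only files under region=EMEA are read at all.
table = lake.to_table(
    filter=pc.field("region") == "EMEA",
    columns=["carrier_id", "freight_cost"],
)

# The kind of aggregate a fulfillment dashboard would serve.
by_carrier = table.group_by("carrier_id").aggregate([("freight_cost", "sum")])
print(by_carrier.to_pandas())
```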
Posted 2 weeks ago
2.0 - 4.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Information
Date Opened: 07/07/2025
Job Type: Full time
Industry: Software Product
City: Chennai
State/Province: Tamil Nadu
Country: India
Zip/Postal Code: 600017

Pando is a global leader in supply chain technology, building the world's quickest time-to-value Fulfillment Cloud platform. Pando's Fulfillment Cloud provides manufacturers, retailers, and 3PLs with a single pane of glass to streamline end-to-end purchase order fulfillment and customer order fulfillment to improve service levels, reduce carbon footprint, and bring down costs. As a partner of choice for Fortune 500 enterprises globally, with a presence across APAC, the Middle East, and the US, Pando is recognized as a Technology Pioneer by the World Economic Forum (WEF), and as one of the fastest growing technology companies by Deloitte.

Role Overview
As a Junior Data Warehouse Engineer at Pando, you'll work within the Data & AI Services team to support the design, development, and maintenance of data pipelines and warehouse solutions. You'll collaborate with senior engineers and cross-functional teams to help deliver high-quality analytics and reporting solutions that power key business decisions. This is an excellent opportunity to grow your career by learning from experienced professionals and gaining hands-on experience with large-scale data systems and supply chain technologies.

Key Responsibilities
Assist in building and maintaining scalable data pipelines using tools like PySpark and SQL-based ETL processes. Support the development and maintenance of data models for dashboards, analytics, and reporting. Help manage parquet-based data lakes and ensure data consistency and quality. Write optimized SQL queries for OLAP database systems and support data integration efforts. Collaborate with team members to understand business data requirements and translate them into technical implementations. Document workflows, data schemas, and data definitions for internal use. Participate in code reviews, team meetings, and training sessions to continuously improve your skills.

Requirements
2-4 years of experience working with data engineering or ETL tools (e.g., PySpark, SQL, Airflow). Solid understanding of SQL and basic experience with OLAP or data warehouse systems. Exposure to data lakes, preferably using the Parquet format. Understanding of basic data modeling principles (e.g., star/snowflake schema). Good problem-solving skills and a willingness to learn and adapt. Ability to work effectively in a collaborative, fast-paced team environment.

Preferred Qualifications
Experience working with cloud platforms (e.g., AWS, Azure, or GCP). Exposure to low-code data tools or modular ETL frameworks. Interest or prior experience in the supply chain or logistics domain. Familiarity with dashboarding tools like Power BI, Looker, or Tableau.
Posted 2 weeks ago
5.0 years
0 Lacs
India
On-site
Staff Software Engineer, Data Ingestion

The Staff Software Engineer, Data Ingestion will be a critical individual contributor responsible for designing collection strategies and developing and maintaining robust, scalable data pipelines. This role is at the heart of our data ecosystem, delivering new analytical software solutions that provide timely, accurate, and complete data for insights, products, and operational efficiency.

Key Responsibilities:
Design, develop, and maintain high-performance, fault-tolerant data ingestion pipelines using Python (see the sketch below). Integrate with diverse data sources (databases, APIs, streaming platforms, cloud storage, etc.). Implement data transformation and cleansing logic during ingestion to ensure data quality. Monitor and troubleshoot data ingestion pipelines, identifying and resolving issues promptly. Collaborate with database engineers to optimize data models for fast consumption. Evaluate and propose new technologies or frameworks to improve ingestion efficiency and reliability. Develop and implement self-healing mechanisms for data pipelines to ensure continuity. Define and uphold SLAs and SLOs for data freshness, completeness, and availability. Participate in the on-call rotation as needed for critical data pipeline issues.

Key Skills:
5+ years of experience, ideally with a background in computer science, working in software product companies. Extensive Python Expertise: Extensive experience in developing robust, production-grade applications with Python. Data Collection & Integration: Proven experience collecting data from various sources (REST APIs, OAuth, GraphQL, Kafka, S3, SFTP, etc.). Distributed Systems & Scalability: Strong understanding of distributed systems concepts, designing for scale, performance optimization, and fault tolerance. Cloud Platforms: Experience with major cloud providers (AWS or GCP) and their data-related services (e.g., S3, EC2, Lambda, SQS, Kafka, Cloud Storage, GKE). Database Fundamentals: Solid understanding of relational databases (SQL, schema design, indexing, query optimization); OLAP database experience is a plus (Hadoop). Monitoring & Alerting: Experience with monitoring tools (e.g., Prometheus, Grafana) and setting up effective alerts. Version Control: Proficiency with Git. Containerization (Plus): Experience with Docker and Kubernetes. Streaming Technologies (Plus): Experience with real-time data processing using Kafka, Flink, or Spark Streaming.
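A minimal sketch of the fault-tolerant API ingestion described above, using requests and boto3; the endpoint, bucket, and key layout are hypothetical, not part of the posting.

```python
import json
import time

import boto3
import requests

S3 = boto3.client("s3")
API_URL = "https://api.example.com/v1/orders"  # hypothetical source


def fetch_with_retries(url, attempts=5):
    """Fetch a page of records, backing off exponentially on failure."""
    for attempt in range(attempts):
        try:
            resp = requests.get(url, timeout=30)
            resp.raise_for_status()
            return resp.json()
        except requests.RequestException:
            if attempt == attempts - 1:
                raise  # exhausted retries; let monitoring/alerting take over
            time.sleep(2 ** attempt)  # 1s, 2s, 4s, ...


def ingest(run_date):
    records = fetch_with_retries(f"{API_URL}?date={run_date}")
    # Land the raw payload immediately; transformation happens downstream.
    S3.put_object(
        Bucket="example-raw-zone",
        Key=f"orders/dt={run_date}/part-000.json",
        Body=json.dumps(records).encode("utf-8"),
    )


ingest("2025-07-01")
```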
Posted 2 weeks ago
2.0 years
1 - 5 Lacs
Patna, Bihar, India
On-site
Only candidates currently in Bihar or open to relocating to Bihar should apply.

Job Description
This is an exciting opportunity for an experienced industry professional with strong analytical and technical skills to join and add value to a dedicated and friendly team. We are looking for a Data Analyst who is driven by data-driven decision-making and insights. As a core member of the Analytics Team, the candidate will take ownership of data analysis projects, working independently with little supervision. The ideal candidate is a highly resourceful and innovative professional with extensive experience in data analysis, statistical modeling, and data visualization. The candidate must have a strong command of data analysis tools like SAS/SPSS, Power BI/Tableau, or R, along with expertise in MS Excel and MS PowerPoint. The role requires optimizing data collection procedures, generating reports, and applying statistical techniques for hypothesis testing and data interpretation.

Key Responsibilities
Perform data analysis using tools like SAS, SPSS, Power BI, Tableau, or R. Optimize data collection procedures and generate reports on a weekly, monthly, and quarterly basis. Utilize statistical techniques for hypothesis testing to validate data and interpretations (see the sketch below). Apply data mining techniques and OLAP methodologies for in-depth insights. Develop dashboards and data visualizations to present findings effectively. Collaborate with cross-functional teams to define, design, and execute data-driven strategies. Ensure the accuracy and integrity of data used for analysis and reporting. Utilize advanced Excel skills to manipulate and analyze large datasets. Prepare technical documentation and presentations for stakeholders.

Candidate Profile
Required Qualifications: Graduate/Post Graduate in Statistics, MCA, or BE/B.Tech in Computer Science & Engineering, Information Technology, or Electronics. A minimum of 2 years' experience in data analysis using SAS/SPSS, Power BI/Tableau, or R. Proficiency in MS Office, with expertise in MS Excel and MS PowerPoint. Strong analytical skills with attention to detail. Experience in data mining and OLAP methodologies. Ability to generate insights and reports based on data trends. Excellent communication and presentation skills.

Desired Qualifications
Experience in predictive analytics and machine learning techniques. Knowledge of SQL and database management. Familiarity with Python for data analysis. Experience in automating reporting processes.

Skills: OLAP, data mining, Python, Tableau, Power BI, SAS, and SPSS
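To illustrate the hypothesis-testing responsibility above, a small Python sketch using scipy; the data here are synthetic stand-ins, not anything from the posting.

```python
import numpy as np
from scipy import stats

# Hypothetical question: did average daily enrollments change after a
# process rollout? Two independent samples, two-sided t-test.
rng = np.random.default_rng(42)
before = rng.normal(loc=120, scale=15, size=60)  # stand-in for real data
after = rng.normal(loc=127, scale=15, size=60)

# Welch's t-test (no equal-variance assumption).
t_stat, p_value = stats.ttest_ind(before, after, equal_var=False)

print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("Reject the null: the means differ at the 5% level.")
else:
    print("Fail to reject the null at the 5% level.")
```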
Posted 2 weeks ago
5.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Sr. Data Engineer 2

We are seeking a Sr. Data Engineer to join our Data Engineering team within our Enterprise Data Insights organization to build data solutions, design and implement ETL/ELT processes, and manage our data platform to enable our cross-functional stakeholders. As a part of our Corporate Engineering division, our vision is to spearhead technology and data-led solutions and experiences to drive growth and innovation at scale. The ideal candidate will have a strong data engineering background, advanced Python knowledge, and experience with cloud services and SQL/NoSQL databases. You will work closely with our cross-functional stakeholders in Product, Finance, and GTM, along with Business and Enterprise Technology teams.

As a Senior Data Engineer, you will:
Collaborate closely with various stakeholders to prioritize requests, identify improvements, and offer recommendations. Take the lead in analyzing, designing, and implementing data solutions, which involves constructing and designing data models and ETL processes. Cultivate collaboration with corporate engineering, product teams, and other engineering groups. Lead and mentor engineering discussions, advocating for best practices. Actively participate in design and code reviews. Access and explore third-party data APIs to determine the data required to meet business needs. Ensure data quality and integrity across different sources and systems. Manage data pipelines for both analytics and operational purposes. Continuously enhance processes and policies to improve SLA and SOX compliance.

You'll be a great addition to the team if you:
Hold a B.S., M.S., or Ph.D. in Computer Science or a related technical field. Possess over 5 years of experience in Data Engineering, focusing on building and maintaining data environments. Demonstrate at least 5 years of experience in designing and constructing ETL/ELT processes, managing data solutions within an SLA-driven environment. Exhibit a strong background in developing data products, APIs, and maintaining testing, monitoring, isolation, and SLA processes. Possess advanced knowledge of SQL/NoSQL databases (such as Snowflake, Redshift, MongoDB). Are proficient in programming with Python or other scripting languages. Have familiarity with columnar OLAP databases and data modeling. Have experience building ELT/ETL processes using tools like dbt, Airflow, and Fivetran, CI/CD using GitHub, and reporting in Tableau (a DAG sketch follows below). Possess excellent communication and interpersonal skills to effectively collaborate with various business stakeholders and translate requirements.

Added bonus if you also have: a good understanding of Salesforce and NetSuite systems; experience in SaaS environments; designed and deployed ML models; experience with events and streaming data.
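For illustration, a minimal Airflow DAG in the ELT orchestration style this posting mentions; the task names, schedule, script path, and dbt project path are hypothetical.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

# Hypothetical nightly ELT: land raw data first, then run dbt transformations.
with DAG(
    dag_id="nightly_elt",
    start_date=datetime(2025, 1, 1),
    schedule="0 2 * * *",  # 02:00 daily
    catchup=False,
) as dag:
    extract_load = BashOperator(
        task_id="extract_load",
        bash_command="python /opt/pipelines/extract_load.py",  # hypothetical script
    )
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt/analytics",
    )

    extract_load >> dbt_run  # dbt waits for the load to finish
```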
Posted 2 weeks ago
8.0 years
0 Lacs
Karnataka
On-site
At eBay, we're more than a global ecommerce leader — we’re changing the way the world shops and sells. Our platform empowers millions of buyers and sellers in more than 190 markets around the world. We’re committed to pushing boundaries and leaving our mark as we reinvent the future of ecommerce for enthusiasts. Our customers are our compass, authenticity thrives, bold ideas are welcome, and everyone can bring their unique selves to work — every day. We're in this together, sustaining the future of our customers, our company, and our planet. Join a team of passionate thinkers, innovators, and dreamers — and help us connect people and build communities to create economic opportunity for all. Global Payments and Risk is at the forefront of innovation, integrating groundbreaking technologies into our products and services. We are dedicated to using the power of data-driven solutions to solve real-world problems at internet scale, improve user experiences, and drive business outcomes. Join us in shaping the future with your expertise and passion for technology. We are seeking a motivated Software Engineer with a strong background in full stack software development. The ideal candidate should have a platform-centric approach and a proven track record of building platforms and frameworks to solve complex problems. You will be instrumental in crafting innovative applications that safeguard our marketplace to mitigate risks and curtail financial losses. Join our collaborative team that thrives on creativity and resourcefulness to tackle sophisticated challenges. What you will accomplish: Develop high-performing solutions that align with eBay's business and technology strategies to enhance risk management, trust, and compliance. Research, analyze, design, develop, and test the solutions that are appropriate for the business and technology strategies. Participate in design discussions, code reviews, and project-related team meetings, contributing significantly to the development process. Collaborate optimally within a multi-functional team comprising engineers, architects, product managers, and operations to deliver innovative solutions that address business needs, performance, scale, and reliability. Acquire domain expertise and apply this knowledge to tackle product challenges, ensuring continuous improvement in the domain. Act as an onboarding buddy for new joiners, fostering a supportive and inclusive work environment. What you will bring: At least 8 years of software design and development experience with a proven foundation in computer science, with strong competencies in data structures, algorithms, distributed computing, and scalable software design. Hands-on expertise with architectural and design patterns, open-source platforms and frameworks, technologies, and software engineering methodologies. Hands-on experience in developing applications with Spring/Spring Boot, REST, GraphQL, Java, JEE, and Spring Batch. Hands-on experience with building data models in Oracle/MySQL/RDBMS and NoSQL databases, e.g. key-value stores and document stores like Mongo, Couchbase, Cassandra. Hands-on experience in building tools and user experiences using HTML5, Node.js, ReactJS, Arco Design, and Material Design. Hands-on experience in fine-tuning performance bottlenecks in Java, Node.js, and JavaScript. Proficiency in data and streaming technologies like Hadoop, Spark, Kafka, Apache Flink, etc. Practiced agile development, with the ability to adapt to changing business priorities.
Experience with building sophisticated integration solutions for internet-scale traffic is a major plus. Risk domain and rule engine expertise is a major plus. Excellent decision-making, communication, and collaboration skills. Familiarity with prompt engineering and AI tools is a major plus. Experience with Prometheus, Grafana, OLAP, and DevOps tools for observability is required. Behaviors: Innovates effectively in a dynamic, fast-changing environment and challenges convention. Develops solutions that deliver tangible results. Strong execution, alignment with timelines, and timely addressing of blocking issues when risks arise. Practices learning and collaborates effectively in a multi-functional team. Education: Degree in computer science or an equivalent discipline with 8+ years of software application development experience. Please see the Talent Privacy Notice for information regarding how eBay handles your personal data collected when you use the eBay Careers website or apply for a job with eBay. eBay is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, national origin, sex, sexual orientation, gender identity, veteran status, and disability, or other legally protected status. If you have a need that requires accommodation, please contact us at talent@ebay.com. We will make every effort to respond to your request for accommodation as soon as possible. View our accessibility statement to learn more about eBay's commitment to ensuring digital accessibility for people with disabilities.
Posted 2 weeks ago
12.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Title: GCP Data Architect
Location: Madurai
Experience: 12+ Years
Notice Period: Immediate

About TechMango
TechMango is a rapidly growing IT Services and SaaS Product company that helps global businesses with digital transformation, modern data platforms, product engineering, and cloud-first initiatives. We are seeking a GCP Data Architect to lead data modernization efforts for our prestigious client, Livingston, in a highly strategic project.

Role Summary
As a GCP Data Architect, you will be responsible for designing and implementing scalable, high-performance data solutions on Google Cloud Platform. You will work closely with stakeholders to define data architecture, implement data pipelines, modernize legacy data systems, and guide data strategy aligned with enterprise goals.

Key Responsibilities
Lead end-to-end design and implementation of scalable data architecture on Google Cloud Platform (GCP)
Define data strategy, standards, and best practices for cloud data engineering and analytics
Develop data ingestion pipelines using Dataflow, Pub/Sub, Apache Beam, Cloud Composer (Airflow), and BigQuery
Migrate on-prem or legacy systems to GCP (e.g., from Hadoop, Teradata, or Oracle to BigQuery)
Architect data lakes, warehouses, and real-time data platforms
Ensure data governance, security, lineage, and compliance (using tools like Data Catalog, IAM, DLP)
Guide a team of data engineers and collaborate with business stakeholders, data scientists, and product managers
Create documentation, high-level design (HLD) and low-level design (LLD), and oversee development standards
Provide technical leadership in architectural decisions and future-proofing the data ecosystem

Required Skills & Qualifications
10+ years of experience in data architecture, data engineering, or enterprise data platforms
Minimum 3-5 years of hands-on experience in GCP Data Services
Proficient in: BigQuery, Cloud Storage, Dataflow, Pub/Sub, Composer, Cloud SQL/Spanner; Python/Java/SQL; data modeling (OLTP, OLAP, Star/Snowflake schema)
Experience with real-time data processing, streaming architectures, and batch ETL pipelines
Good understanding of IAM, networking, security models, and cost optimization on GCP
Prior experience in leading cloud data transformation projects
Excellent communication and stakeholder management skills

Preferred Qualifications
GCP Professional Data Engineer / Architect Certification
Experience with Terraform, CI/CD, GitOps, and Looker / Data Studio / Tableau for analytics
Exposure to AI/ML use cases and MLOps on GCP
Experience working in agile environments and client-facing roles

What We Offer
Opportunity to work on large-scale data modernization projects with global clients
A fast-growing company with a strong tech and people culture
Competitive salary, benefits, and flexibility
Collaborative environment that values innovation and leadership
(ref:hirist.tech)
Posted 3 weeks ago
5.0 - 9.0 years
0 Lacs
karnataka
On-site
You will be responsible for designing, developing, and maintaining SQL Server Analysis Services (SSAS) models to support business intelligence reporting. Your role will involve creating and managing OLAP cubes, as well as developing and implementing multidimensional and tabular data models. You will be tasked with optimizing the performance of SSAS solutions for efficient query processing. Additionally, you will be required to integrate data from various sources into SQL Server databases and SSAS models. Knowledge of AWS S3 and SQL Server PolyBase is preferred. The ideal candidate should have 5 to 8 years of experience in SQL development, with expertise in SSAS and OLAP. This position is based pan-India.
Posted 3 weeks ago
4.0 - 10.0 years
0 Lacs
noida, uttar pradesh
On-site
We are seeking a PostgreSQL Database Developer with a minimum of 4 years of experience in database management. We are looking for an individual who is enthusiastic about technology, committed to continuous learning, and approaches every client interaction as an opportunity to deliver exceptional customer service.

Qualifications:
- BE/B.Tech/MCA/MS-IT/CS/B.Sc/BCA or any related degree
- Proficiency in PostgreSQL, PL/SQL, Oracle, query optimization, performance tuning, and GCP Cloud

Key Responsibilities:
- Proficient in PL/SQL and PostgreSQL programming, with the ability to write complex SQL queries and stored procedures
- Experience in migrating database structure and data from Oracle to PostgreSQL, preferably on GCP AlloyDB or Cloud SQL
- Expertise in working with Cloud SQL/AlloyDB, tuning AlloyDB/PostgreSQL for enhanced performance, and utilizing BigQuery, Firestore, Memorystore, Spanner, and bare-metal setups
- Familiarity with GCP Database Migration Service, MongoDB, Cloud Dataflow, database disaster recovery, job scheduling, logging techniques, and OLTP/OLAP
- Desirable: GCP Database Engineer Certification

Additional Responsibilities:
- Develop, test, and maintain data architectures
- Migrate enterprise Oracle databases from on-premises to GCP cloud, with a focus on autovacuum in PostgreSQL
- Performance tuning of PostgreSQL stored procedure code and queries
- Convert Oracle stored procedures and queries to PostgreSQL equivalents
- Create hybrid data stores integrating data warehouse and NoSQL GCP solutions with PostgreSQL
- Lead the database team

Mandatory Skills: PostgreSQL, PL/SQL, BigQuery, GCP Cloud, tuning, and optimization

If you meet the requirements and are interested in this position, kindly share your resume, including current CTC, expected CTC, notice period, and last working day (LWD), to sonali.mangore@impetus.com.
Posted 3 weeks ago
3.0 - 8.0 years
5 - 8 Lacs
Bengaluru
Work from Office
We are seeking a highly skilled and analytical Senior Data Analyst with a strong background in Python, SQL, and modern data infrastructure. In this role, you will play a critical part in transforming raw data into meaningful insights that drive strategic decisions across the organization. You will work closely with product, engineering, and business teams to ensure that data is accessible, reliable, and actionable.

Key Responsibilities
- Write efficient and optimized SQL queries to support reporting and analytics.
- Develop and maintain robust Python scripts and data pipelines for data transformation and automation.
- Work with PostgreSQL and other OLAP databases to manage and query large datasets effectively.
- Design, implement, and optimize dashboards and visualizations using Metabase, enabling self-service analytics across teams.
- Understand and leverage columnar file formats (e.g., Parquet, ORC) and data compression techniques to improve storage and query performance (see the codec comparison sketch after this posting).
- Apply a solid understanding of codecs and compression algorithms to analyze and troubleshoot data storage and performance bottlenecks.
- Collaborate with data engineers and analysts to validate data models and maintain data integrity across reporting layers.
- Perform deep-dive analyses and generate actionable insights to improve business outcomes.

Qualifications & Skills
- 3+ years of experience in a Data Analyst, BI, or similar role.
- Proficiency in Python for data wrangling, scripting, and automation.
- Strong hands-on expertise in SQL, including query optimization.
- Experience working with PostgreSQL and at least one OLAP database (e.g., ClickHouse, Redshift, BigQuery, Druid).
- Understanding of columnar storage formats (Parquet, ORC, etc.) and when to use them.
- Familiarity with data codecs and compression algorithms such as Snappy, Zstandard, and LZ4.
- Deep, practical knowledge of Metabase: dashboard creation, advanced filtering, and embedding.
- Strong analytical skills, with the ability to translate complex datasets into clear insights and recommendations.
- Excellent communication and stakeholder management skills.

Nice to Have
- Experience with modern data stacks or orchestration tools (e.g., dbt, Airflow).
- Exposure to data warehousing concepts and ETL/ELT frameworks.
- Familiarity with cloud-based analytics platforms (e.g., AWS, GCP, Azure).
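As a small illustration of the columnar-format and codec points above, this sketch writes the same pandas DataFrame to Parquet under several compression codecs and compares the resulting file sizes. The DataFrame contents are made-up sample data, not anything from the posting.

    import os
    import pandas as pd

    # Made-up sample data standing in for a real analytics extract.
    df = pd.DataFrame({
        "user_id": range(100_000),
        "event": ["click", "view", "purchase", "view"] * 25_000,
        "amount": [1.99, 0.0, 24.50, 0.0] * 25_000,
    })

    # Parquet is columnar, so low-cardinality columns like "event" compress
    # very well; codecs trade compression ratio against (de)compression speed.
    for codec in ("snappy", "zstd", "lz4", "gzip"):
        path = f"events_{codec}.parquet"
        df.to_parquet(path, engine="pyarrow", compression=codec)
        print(f"{codec:>6}: {os.path.getsize(path) / 1024:.1f} KiB")

Snappy and LZ4 generally favor speed while Zstandard and gzip favor ratio, which is why query engines often default to Snappy or Zstandard for hot data.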
Posted 3 weeks ago
8.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
At eBay, we're more than a global ecommerce leader — we're changing the way the world shops and sells. Our platform empowers millions of buyers and sellers in more than 190 markets around the world. We're committed to pushing boundaries and leaving our mark as we reinvent the future of ecommerce for enthusiasts. Our customers are our compass, authenticity thrives, bold ideas are welcome, and everyone can bring their unique selves to work — every day. We're in this together, sustaining the future of our customers, our company, and our planet. Join a team of passionate thinkers, innovators, and dreamers — and help us connect people and build communities to create economic opportunity for all.

Global Payments and Risk is at the forefront of innovation, integrating groundbreaking technologies into our products and services. We are dedicated to using the power of data-driven solutions to solve real-world problems at internet scale, improve user experiences, and drive business outcomes. Join us in shaping the future with your expertise and passion for technology.

We are seeking a motivated Software Engineer with a strong background in full-stack software development. The ideal candidate should have a platform-centric approach and a proven track record of building platforms and frameworks to solve complex problems. You will be instrumental in crafting innovative applications that safeguard our marketplace, mitigate risks, and curtail financial losses. Join our collaborative team that thrives on creativity and resourcefulness to tackle sophisticated challenges.

What You Will Accomplish
- Develop high-performing solutions that align with eBay's business and technology strategies to enhance risk management, trust, and compliance.
- Research, analyze, design, develop, and test solutions appropriate for the business and technology strategies.
- Participate in design discussions, code reviews, and project-related team meetings, contributing significantly to the development process.
- Collaborate effectively within a multi-functional team of engineers, architects, product managers, and operations to deliver innovative solutions that address business needs, performance, scale, and reliability.
- Acquire domain expertise and apply this knowledge to tackle product challenges, ensuring continuous improvement in the domain.
- Act as an onboarding buddy for new joiners, fostering a supportive and inclusive work environment.

What You Will Bring
- At least 8 years of software design and development experience, with a proven foundation in computer science and strong competencies in data structures, algorithms, distributed computing, and scalable software design.
- Hands-on expertise with architectural and design patterns, open-source platforms and frameworks, technologies, and software engineering methodologies.
- Hands-on experience developing applications with Spring/Spring Boot, REST, GraphQL, Java, JEE, and Spring Batch.
- Hands-on experience building data models in Oracle/MySQL/RDBMS and NoSQL databases, e.g., key-value stores and document stores such as MongoDB, Couchbase, and Cassandra.
- Hands-on experience building tools and user experiences using HTML5, Node.js, ReactJS, Arco Design, and Material Design.
- Hands-on experience fine-tuning performance bottlenecks in Java, Node.js, and JavaScript.
- Proficiency in data and streaming technologies such as Hadoop, Spark, Kafka, and Apache Flink (see the consumer sketch after this posting).
- Practiced agile development and the ability to adapt to changing business priorities.
- Experience building sophisticated integration solutions for internet-scale traffic is a major plus.
- Risk-domain and rule-engine expertise is a major plus.
- Excellent decision-making, communication, and collaboration skills.
- Familiarity with prompt engineering and AI tools is a major plus.
- Experience with Prometheus, Grafana, OLAP, and DevOps tools for observability is required.

Behaviors
- Innovates effectively in a dynamic, fast-changing environment and challenges convention.
- Develops solutions that deliver tangible results.
- Strong execution, alignment with timelines, and timely resolution of blocking issues when risks arise.
- Practices continuous learning and collaborates effectively in a multi-functional team.

Education
Degree in computer science or an equivalent discipline with 8+ years of software application development experience.

Please see the Talent Privacy Notice for information regarding how eBay handles your personal data collected when you use the eBay Careers website or apply for a job with eBay. eBay is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, national origin, sex, sexual orientation, gender identity, veteran status, disability, or other legally protected status. If you have a need that requires accommodation, please contact us at talent@ebay.com. We will make every effort to respond to your request for accommodation as soon as possible. View our accessibility statement to learn more about eBay's commitment to ensuring digital accessibility for people with disabilities.
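As a rough illustration of the streaming skills the posting lists (Kafka in particular), here is a minimal consumer sketch in Python using the confluent-kafka client. The broker address, consumer group, and topic name are invented for the example, and any real risk-scoring logic is elided.

    from confluent_kafka import Consumer

    consumer = Consumer({
        "bootstrap.servers": "localhost:9092",  # placeholder broker
        "group.id": "risk-scoring",             # placeholder group
        "auto.offset.reset": "earliest",
    })
    consumer.subscribe(["payment-events"])      # placeholder topic

    try:
        while True:
            msg = consumer.poll(timeout=1.0)
            if msg is None:
                continue
            if msg.error():
                print(f"consumer error: {msg.error()}")
                continue
            # A real pipeline would run risk checks on each payment event here.
            print(msg.key(), msg.value())
    finally:
        consumer.close()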
Posted 3 weeks ago
10.0 years
4 - 8 Lacs
Madurai
On-site
Job Location: Madurai
Job Experience: 10-20 Years
Model of Work: Work From Office
Technologies: GCP
Functional Area: Software Development

Job Summary:
Job Title: GCP Data Architect
Location: Madurai
Experience: 12+ Years
Notice Period: Immediate

About TechMango
TechMango is a rapidly growing IT Services and SaaS Product company that helps global businesses with digital transformation, modern data platforms, product engineering, and cloud-first initiatives. We are seeking a GCP Data Architect to lead data modernization efforts for our prestigious client in the USA; this is a highly strategic project.

Role Summary
As a GCP Data Architect, you will be responsible for designing and implementing scalable, high-performance data solutions on Google Cloud Platform. You will work closely with stakeholders to define data architecture, implement data pipelines, modernize legacy data systems, and guide data strategy aligned with enterprise goals.

Key Responsibilities:
- Lead end-to-end design and implementation of scalable data architecture on Google Cloud Platform (GCP)
- Define data strategy, standards, and best practices for cloud data engineering and analytics
- Develop data ingestion pipelines using Dataflow, Pub/Sub, Apache Beam, Cloud Composer (Airflow), and BigQuery
- Migrate on-premises or legacy systems to GCP (e.g., from Hadoop, Teradata, or Oracle to BigQuery)
- Architect data lakes, warehouses, and real-time data platforms
- Ensure data governance, security, lineage, and compliance (using tools such as Data Catalog, IAM, and DLP)
- Guide a team of data engineers and collaborate with business stakeholders, data scientists, and product managers
- Create documentation, including high-level design (HLD) and low-level design (LLD), and oversee development standards
- Provide technical leadership in architectural decisions and future-proofing of the data ecosystem (a BigQuery modeling sketch follows this posting)

Required Skills & Qualifications:
- 10+ years of experience in data architecture, data engineering, or enterprise data platforms
- Minimum 3-5 years of hands-on experience with GCP data services
- Proficiency in BigQuery, Cloud Storage, Dataflow, Pub/Sub, Composer, and Cloud SQL/Spanner
- Python / Java / SQL
- Data modeling (OLTP, OLAP, star/snowflake schema)
- Experience with real-time data processing, streaming architectures, and batch ETL pipelines
- Good understanding of IAM, networking, security models, and cost optimization on GCP
- Prior experience leading cloud data transformation projects
- Excellent communication and stakeholder management skills

Preferred Qualifications:
- GCP Professional Data Engineer / Architect Certification
- Experience with Terraform, CI/CD, GitOps, and Looker / Data Studio / Tableau for analytics
- Exposure to AI/ML use cases and MLOps on GCP
- Experience working in agile environments and client-facing roles

About our Talent Acquisition Team:
Arumugam Veera leads the Talent Acquisition function for both TechMango and Bautomate (our SaaS platform), driving our mission to build high-performing teams and connect top talent with exciting career opportunities. Feel free to connect with him on LinkedIn: https://www.linkedin.com/in/arumugamv/

Follow our official TechMango LinkedIn page for the latest job updates and career opportunities: https://www.linkedin.com/company/techmango-technology-services-private-limited/

Looking forward to connecting and helping you explore your next great opportunity with us!
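To make the BigQuery modeling and cost-optimization points concrete, here is a hedged sketch using the google-cloud-bigquery Python client to create a date-partitioned, clustered fact table and query one month of it. The project, dataset, and column names are illustrative assumptions.

    from google.cloud import bigquery

    client = bigquery.Client(project="my-project")  # placeholder project

    # Partitioning plus clustering lets BigQuery prune data at query time,
    # which directly reduces bytes scanned (and therefore cost).
    client.query("""
        CREATE TABLE IF NOT EXISTS analytics.fact_orders (
            order_id    INT64,
            customer_id INT64,
            order_date  DATE,
            amount      NUMERIC
        )
        PARTITION BY order_date
        CLUSTER BY customer_id
    """).result()

    # Filtering on the partition column scans only the matching partitions.
    rows = client.query("""
        SELECT customer_id, SUM(amount) AS total
        FROM analytics.fact_orders
        WHERE order_date BETWEEN '2024-01-01' AND '2024-01-31'
        GROUP BY customer_id
    """).result()
    for row in rows:
        print(row.customer_id, row.total)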
Posted 3 weeks ago
2.0 - 5.0 years
2 - 6 Lacs
Hyderabad
Work from Office
Detailed job description - Skill Set:
- Technically strong and hands-on
- Self-driven
- Good client communication skills
- Able to work independently; a good team player
- Flexible to work PST hours (overlap for some hours)
- Past development experience for the Cisco client is preferred
Posted 3 weeks ago
6.0 - 11.0 years
10 - 15 Lacs
Bengaluru
Work from Office
Project description
You will be working in a cutting-edge banking environment that is undergoing a thorough upgrade program. You will be responsible for translating business data into reusable and adjustable dashboards used by senior business managers.

Responsibilities
- Design and develop complex T-SQL queries and stored procedures for data extraction, transformation, and reporting (see the sketch after this posting).
- Build and manage ETL workflows using SSIS to support data integration across multiple sources.
- Create interactive dashboards and reports using Tableau and SSRS for business performance monitoring.
- Develop and maintain OLAP cubes using SSAS for multidimensional data analysis.
- Collaborate with business and data teams to understand reporting requirements and deliver scalable BI solutions.
- Apply strong data warehouse architecture and modeling concepts to build efficient data storage and retrieval systems.
- Perform performance tuning and query optimization for large datasets and improve system responsiveness.
- Ensure data quality, consistency, and integrity through robust validation and testing processes.
- Maintain documentation for data pipelines, ETL jobs, and reporting structures.
- Stay updated with emerging Microsoft BI technologies and best practices to continuously improve solutions.

Skills
Must have
- At least 6 years of experience with T-SQL and SQL Server (SSIS and SSRS exposure is a must).
- Proficient with Tableau, preferably with at least 4 years of experience creating dashboards.
- Experience working with businesses and delivering dashboards to senior management.
- Experience working within a data warehouse architecture is a must.
- Exposure to Microsoft BI.

Nice to have
N/A
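As a small illustration of the T-SQL reporting work described above, here is a hedged sketch that runs a month-over-month windowed aggregation from Python via pyodbc; the server, database, and table names are invented for the example.

    import pyodbc

    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 18 for SQL Server};SERVER=bi-sql;"
        "DATABASE=dwh;Trusted_Connection=yes;TrustServerCertificate=yes"
    )
    cur = conn.cursor()

    # Monthly revenue per branch with the prior month alongside it:
    # the kind of aggregate a senior-management dashboard sits on top of.
    cur.execute("""
        SELECT branch_id,
               DATEFROMPARTS(YEAR(order_date), MONTH(order_date), 1) AS month_start,
               SUM(amount) AS revenue,
               LAG(SUM(amount)) OVER (
                   PARTITION BY branch_id
                   ORDER BY DATEFROMPARTS(YEAR(order_date), MONTH(order_date), 1)
               ) AS prev_month_revenue
        FROM dbo.fact_orders
        GROUP BY branch_id, DATEFROMPARTS(YEAR(order_date), MONTH(order_date), 1)
    """)
    for row in cur.fetchall():
        print(row)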
Posted 3 weeks ago
4.0 - 8.0 years
4 - 8 Lacs
Hyderabad
Work from Office
Detailed job description - Skill Set:
- Technically strong and hands-on
- Self-driven
- Good client communication skills
- Able to work independently; a good team player
- Flexible to work PST hours (overlap for some hours)
- Past development experience for the Cisco client is preferred
Posted 3 weeks ago
5.0 - 9.0 years
9 - 13 Lacs
Bengaluru
Work from Office
Job Overview:
We are looking for a BI & Visualization Developer who will be part of our Analytics Practice and will be expected to actively work in a multi-disciplinary, fast-paced environment. This role requires a broad range of skills and the ability to step into different roles depending on the size and scope of the project; its primary responsibility is to support the design, development, and maintenance of business intelligence and analytics solutions.

Responsibilities:
- Develop reports, dashboards, and advanced visualizations. Work closely with product managers, business analysts, clients, etc. to understand the needs/requirements and develop the visualizations needed.
- Provide support to new or existing applications while recommending best practices and leading projects to implement new functionality.
- Learn and develop new visualization techniques as required to keep up with contemporary visualization design and presentation.
- Review solution requirements and architecture to ensure selection of appropriate technology, efficient use of resources, and integration of multiple systems and technologies.
- Collaborate in design reviews and code reviews to ensure standards are met. Recommend new standards for visualizations.
- Build and reuse templates/components/web services across multiple dashboards.
- Support presentations to customers and partners.
- Advise on new technology trends and possible adoption to maintain competitive advantage.
- Mentor associates.

Experience Needed:
- 8+ years of related experience is required.
- A Bachelor's or Master's degree in Computer Science or a related technical discipline is required.
- Highly skilled in data visualization tools such as Power BI, Tableau, QlikView, etc.
- Very good understanding of the Power BI Tabular Model/Azure Analysis Services using large datasets.
- Strong SQL coding experience, with performance optimization experience for data queries.
- Understands different data models: normalized, de-normalized, star, and snowflake models.
- Has worked in big data environments, cloud data stores, different RDBMSs, and OLAP solutions.
- Experience in the design, development, and deployment of BI systems.
- Candidates with ETL experience preferred.
- Familiar with the principles and practices involved in the development and maintenance of software solutions and architectures and in service delivery.
- Has a strong technical background and stays current with technology and industry developments.
Additional Requirements
- Demonstrated ability to successfully complete multiple, complex technical projects.
- Prior experience with application delivery using an onshore/offshore model.
- Experience with business processes across multiple master data domains in a services-based company.
- Demonstrates a rational and organized approach to the tasks undertaken and an awareness of the need to achieve quality.
- Demonstrates high standards of professional behavior in dealings with clients, colleagues, and staff.
- Strong written communication skills; effective and persuasive in both written and oral communication.
- Experience gathering end-user requirements and writing technical documentation.
- Time management and multitasking skills to effectively meet deadlines under time-to-market pressure.
- May require occasional travel.

Conduent is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, creed, religion, ancestry, national origin, age, gender identity, gender expression, sex/gender, marital status, sexual orientation, physical or mental disability, medical condition, use of a guide dog or service animal, military/veteran status, citizenship status, basis of genetic information, or any other group protected by law. People with disabilities who need a reasonable accommodation to apply for or compete for employment with Conduent may request such accommodation(s) by downloading and completing the accommodation request form, then emailing it as an attachment to FTADAAA@conduent.com. You may also access Conduent's ADAAA Accommodation Policy.

At Conduent we value the health and safety of our associates, their families, and our community. For US applicants: while we DO NOT require vaccination for most of our jobs, we DO require that you provide us with your vaccination status, where legally permissible. Providing this information is a requirement of your employment at Conduent.
Posted 3 weeks ago