8.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
Role Description

Role Proficiency: This role requires proficiency in data pipeline development, including coding and testing data pipelines for ingesting, wrangling, transforming, and joining data from various sources. Must be skilled in ETL tools such as Informatica, Glue, Databricks, and DataProc, with coding expertise in Python, PySpark, and SQL. Works independently and has a deep understanding of data warehousing solutions including Snowflake, BigQuery, Lakehouse, and Delta Lake. Capable of calculating costs and understanding performance issues related to data solutions.

Outcomes: Act creatively to develop pipelines and applications by selecting appropriate technical options, optimizing application development, maintenance, and performance using design patterns and reusing proven solutions. Interpret requirements to create optimal architecture and design, developing solutions in accordance with specifications. Document and communicate milestones/stages for end-to-end delivery. Code adhering to best coding standards; debug and test solutions to deliver best-in-class quality. Perform performance tuning of code and align it with the appropriate infrastructure to optimize efficiency. Validate results with user representatives, integrating the overall solution seamlessly. Develop and manage data storage solutions including relational databases, NoSQL databases, and data lakes. Stay updated on the latest trends and best practices in data engineering, cloud technologies, and big data tools. Influence and improve customer satisfaction through effective data solutions.

Measures of Outcomes: Adherence to engineering processes and standards; adherence to schedule/timelines; adherence to SLAs where applicable; number of defects post delivery; number of non-compliance issues; reduction of recurrence of known defects; quick turnaround of production bugs; completion of applicable technical/domain certifications; completion of all mandatory training requirements; efficiency improvements in data pipelines (e.g. reduced resource consumption, faster run times); average time to detect, respond to, and resolve pipeline failures or data issues; number of data security incidents or compliance breaches.

Outputs Expected:
Code Development: Develop data processing code independently, ensuring it meets performance and scalability requirements. Define coding standards, templates, and checklists. Review code for team members and peers.
Documentation: Create and review templates, checklists, guidelines, and standards for design processes and development. Create and review deliverable documents including design documents, architecture documents, infrastructure costing, business requirements, source-target mappings, test cases, and results.
Configuration: Define and govern the configuration management plan. Ensure compliance within the team.
Testing: Review and create unit test cases, scenarios, and execution plans. Review the test plan and test strategy developed by the testing team. Provide clarifications and support to the testing team as needed.
Domain Relevance: Advise data engineers on the design and development of features and components, demonstrating a deeper understanding of business needs. Learn about customer domains to identify opportunities for value addition. Complete relevant domain certifications to enhance expertise.
Project Management: Manage the delivery of modules effectively.
Defect Management: Perform root cause analysis (RCA) and mitigation of defects. Identify defect trends and take proactive measures to improve quality.
Estimation: Create and provide input for effort and size estimation for projects.
Knowledge Management: Consume and contribute to project-related documents, SharePoint libraries, and client universities. Review reusable documents created by the team.
Release Management: Execute and monitor the release process to ensure smooth transitions.
Design Contribution: Contribute to the creation of high-level design (HLD), low-level design (LLD), and system architecture for applications, business components, and data models.
Customer Interface: Clarify requirements and provide guidance to the development team. Present design options to customers and conduct product demonstrations.
Team Management: Set FAST goals and provide constructive feedback. Understand team members' aspirations and provide guidance and opportunities for growth. Ensure team engagement in projects and initiatives.
Certifications: Obtain relevant domain and technology certifications to stay competitive and informed.

Skill Examples: Proficiency in SQL, Python, or other programming languages used for data manipulation. Experience with ETL tools such as Apache Airflow, Talend, Informatica, AWS Glue, Dataproc, and Azure ADF. Hands-on experience with cloud platforms like AWS, Azure, or Google Cloud, particularly with data-related services (e.g. AWS Glue, BigQuery). Conduct tests on data pipelines and evaluate results against data quality and performance specifications. Experience in performance tuning of data processes. Expertise in designing and optimizing data warehouses for cost efficiency. Ability to apply and optimize data models for efficient storage, retrieval, and processing of large datasets. Capacity to clearly explain and communicate design and development aspects to customers. Ability to estimate time and resource requirements for developing and debugging features or components.

Knowledge Examples: Knowledge of various ETL services offered by cloud providers, including Apache PySpark, AWS Glue, GCP DataProc/DataFlow, Azure ADF, and ADLF. Proficiency in SQL for analytics, including windowing functions. Understanding of data schemas and models relevant to various business contexts. Familiarity with domain-related data and its implications. Expertise in data warehousing optimization techniques. Knowledge of data security concepts and best practices. Familiarity with design patterns and frameworks in data engineering.

Additional Comments: UST is seeking a highly skilled and motivated Lead Data Engineer to join our Telecommunications vertical, leading impactful data engineering initiatives for US-based Telco clients. The ideal candidate will have 6–8 years of experience in designing and developing scalable data pipelines using Snowflake, Azure Data Factory, and Azure Databricks. Proficiency in Python, PySpark, and advanced SQL is essential, with a strong focus on query optimization, performance tuning, and cost-effective architecture. A solid understanding of data integration, real-time and batch processing, and metadata management is required, along with experience in building robust ETL/ELT workflows. Candidates should demonstrate a strong commitment to data quality, validation, and consistency; working knowledge of data governance, RBAC, encryption, and compliance frameworks is considered a plus. Familiarity with Power BI or similar BI tools is also advantageous, enabling effective data visualization and storytelling.
The role demands the ability to work in a dynamic, fast-paced environment, collaborating closely with stakeholders and cross-functional teams while also being capable of working independently. Strong communication skills and the ability to coordinate across multiple teams and stakeholders are critical for success. In addition to technical expertise, the candidate should bring experience in solution design and architecture planning, contributing to scalable and future-ready data platforms. A proactive mindset, eagerness to learn, and adaptability to the rapidly evolving data engineering landscape—including AI integration into data workflows—are highly valued. This is a leadership role that involves mentoring junior engineers, fostering innovation, and driving continuous improvement in data engineering practices. Skills: Azure Databricks, Snowflake, Python, Data Engineering
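For illustration only, here is a minimal sketch of the kind of PySpark and SQL window-function work this role calls for. The SparkSession, column names, and sample rows are placeholder assumptions, not part of the job description; the pattern shown (keep the latest record per key with row_number over a partitioned window) is a common deduplication step in pipelines like those described above.

```python
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.window import Window

spark = SparkSession.builder.appName("window-example").getOrCreate()

# Hypothetical events table: one row per customer event.
events = spark.createDataFrame(
    [("c1", "2024-01-01", 120.0), ("c1", "2024-01-03", 80.0), ("c2", "2024-01-02", 50.0)],
    ["customer_id", "event_date", "amount"],
)

# Rank events per customer by recency, then keep only the most recent one.
w = Window.partitionBy("customer_id").orderBy(F.col("event_date").desc())
latest = (
    events.withColumn("rn", F.row_number().over(w))
          .filter(F.col("rn") == 1)
          .drop("rn")
)
latest.show()
```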
Posted 3 weeks ago
5.0 - 7.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
Equifax is where you can power your possible. If you want to achieve your true potential, chart new paths, develop new skills, collaborate with bright minds, and make a meaningful impact, we want to hear from you. You are passionate about quality and how customers experience the products you test. You have the ability to create, maintain and execute test plans in order to verify requirements. As a Quality Engineer at Equifax, you will be a catalyst in both the development and the testing of high priority initiatives. You will develop and test new products to support technology operations while maintaining exemplary standards. As a collaborative member of the team, you will deliver QA services (code quality, testing services, performance engineering, development collaboration and continuous integration). You will conduct quality control tests in order to ensure full compliance with specified standards and end user requirements. You will execute tests using established plans and scripts, document problems in an issues log, and retest to ensure problems are resolved. You will create test files to thoroughly test program logic and verify system flow. You will identify, recommend and implement changes to enhance effectiveness of QA strategies.

What You Will Do: Independently develop scalable and reliable automated tests and frameworks for testing software solutions. Specify and automate test scenarios and test data for a highly complex business by analyzing integration points, data flows, personas, authorization schemes and environments. Develop regression suites, develop automation scenarios, and move automation to an agile continuous testing model. Proactively and collaboratively take part in all testing related activities while establishing partnerships with key stakeholders in Product, Development/Engineering, and Technology Operations.

What Experience You Need: Bachelor's degree in a STEM major or equivalent experience. 5-7 years of software testing experience. Able to create and review test automation according to specifications. Ability to write, debug, and troubleshoot code in Java, Springboot, TypeScript/JavaScript, HTML, CSS. Creation and use of big data processing solutions using Dataflow/Apache Beam, Bigtable, BigQuery, PubSub, GCS, Composer/Airflow, and others with respect to software validation. Created test strategies and plans. Led complex testing efforts or projects. Participated in Sprint Planning as the Test Lead. Collaborated with Product Owners, SREs, Technical Architects to define testing strategies and plans. Design and development of microservices using Java, Springboot, GCP SDKs, GKE/Kubernetes. Deploy and release software using Jenkins CI/CD pipelines, understand infrastructure-as-code concepts, Helm Charts, and Terraform constructs. Cloud Certification Strongly Preferred.

What Could Set You Apart: An ability to demonstrate successful performance of our Success Profile skills, including: Attention to Detail - Define test case candidates for automation that are outside of product specifications, i.e.
Negative Testing; Create thorough and accurate documentation of all work including status updates to summarize project highlights; validating that processes operate properly and conform to standards.
Automation - Automate defined test cases and test suites per project.
Collaboration - Collaborate with Product Owners and development team to plan and assist with user acceptance testing; Collaborate with product owners, development leads and architects on functional and non-functional test strategies and plans.
Execution - Develop scalable and reliable automated tests; Develop performance testing scripts to assure products are adhering to the documented SLO/SLI/SLAs; Specify the need for Test Data types for automated testing; Create automated tests and test data for projects; Develop automated regression suites; Integrate automated regression tests into the CI/CD pipeline; Work with teams on E2E testing strategies and plans against multiple product integration points.
Quality Control - Perform defect analysis, in-depth technical root cause analysis, identifying trends and recommendations to resolve complex functional issues and process improvements; Analyze results of functional and non-functional tests and make recommendations for improvements.
Performance / Resilience - Understanding application and network architecture as inputs to create performance and resilience test strategies and plans for each product and platform. Conducting the performance and resilience testing to ensure the products meet SLAs / SLOs.
Quality Focus - Review test cases for complete functional coverage; Review quality section of Production Readiness Review for completeness; Recommend changes to existing testing methodologies for effectiveness and efficiency of product validation; Ensure communications are thorough and accurate for all work documentation including status and project updates.
Risk Mitigation - Work with Product Owners, QE and development team leads to track and determine prioritization of defect fixes.

We offer a hybrid work setting, comprehensive compensation and healthcare packages, attractive paid time off, and organizational growth potential through our online learning platform with guided career tracks. Are you ready to power your possible? Apply today, and get started on a path toward an exciting new career at Equifax, where you can make a difference!

Who is Equifax? At Equifax, we believe knowledge drives progress. As a global data, analytics and technology company, we play an essential role in the global economy by helping employers, employees, financial institutions and government agencies make critical decisions with greater confidence. We work to help create seamless and positive experiences during life’s pivotal moments: applying for jobs or a mortgage, financing an education or buying a car. Our impact is real and to accomplish our goals we focus on nurturing our people for career advancement and their learning and development, supporting our next generation of leaders, maintaining an inclusive and diverse work environment, and regularly engaging and recognizing our employees. Regardless of location or role, the individual and collective work of our employees makes a difference and we are looking for talented team players to join us as we help people live their financial best. Equifax is an Equal Opportunity Employer.
All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran.
Posted 3 weeks ago
3.0 - 7.0 years
0 Lacs
tamil nadu
On-site
As a data engineer, you will be expected to be proficient in Python, SQL, and either Java or Scala, especially for Spark/Beam pipelines. Experience with BigQuery, Dataflow, Apache Beam, Airflow, and Kafka will be beneficial for this role. You will be responsible for building scalable batch and streaming pipelines to support machine learning or campaign analytics. Familiarity with ad tech, bid logs, or event tracking pipelines is considered a plus. Your primary role will involve constructing the foundational data infrastructure to handle the ingestion, processing, and serving of bid logs, user events, and attribution data from various sources. Key responsibilities include building scalable data pipelines for real-time and batch ingestion from DSPs, attribution tools, and order management systems. You will need to design clean and queryable data models to facilitate machine learning training and campaign optimization. Additionally, you will be required to enable data joins across 1st, 2nd, and 3rd-party data sets such as device, app, geo, and segment information. Optimizing pipelines for freshness, reliability, and cost efficiency is crucial, along with supporting event-level logging of auction wins, impressions, conversions, and click paths. The ideal candidate for this role should possess skills in Apache Beam, Airflow, Kafka, Scala, SQL, BigQuery, attribution, Java, Dataflow, Spark, machine learning, and Python. If you are enthusiastic about data engineering and have a background in building scalable data pipelines, this position could be a great fit for you.
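As a rough, hedged sketch of the streaming ingestion described above: a small Apache Beam pipeline in Python that reads bid-log events from Pub/Sub and appends them to BigQuery. The project, topic, table, and field names are illustrative assumptions only; a real pipeline would add error handling, schema management, and dead-letter output.

```python
import json
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions(streaming=True)  # streaming mode for continuous ingestion

def parse_bid_log(message: bytes) -> dict:
    # Decode a JSON bid-log message into a flat row for BigQuery.
    record = json.loads(message.decode("utf-8"))
    return {
        "auction_id": record.get("auction_id"),
        "campaign_id": record.get("campaign_id"),
        "win_price": float(record.get("win_price", 0.0)),
        "event_time": record.get("timestamp"),
    }

with beam.Pipeline(options=options) as p:
    (
        p
        | "ReadBidLogs" >> beam.io.ReadFromPubSub(topic="projects/example-project/topics/bid-logs")
        | "Parse" >> beam.Map(parse_bid_log)
        | "WriteToBQ" >> beam.io.WriteToBigQuery(
            "example-project:adtech.bid_logs",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
        )
    )
```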
Posted 3 weeks ago
0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Company Overview: Maxima Tek is a well-funded and rapidly growing company specializing in software solutions for software-defined vehicles, with over four million vehicles already on the road with top global OEM brands. They are at the forefront of automotive digital transformation and have a diverse team across various international locations. This is a hybrid position based in Sunnyvale, CA, requiring 3 days in the office. The Opportunity: Sonatus is seeking a highly motivated AI Engineer to drive software innovations for next-generation software-defined vehicles. The role focuses on customer-first product development and solving real-world problems. Key Responsibilities: Lead efforts to leverage existing AI models and frameworks to solve complex business challenges. Conduct the full data modeling and algorithm development lifecycle (modeling, training, tuning, validating, deploying, maintaining). Demonstrate strong domain expertise in various AI areas including LLM, Computer Vision, Time Series, RAG, fine-tuning large models, and traditional ML models. Stay current with industry trends and advancements in data science and AI. Perform data analysis and provide insights for business decisions. Ensure adherence to data privacy and security protocols. Collaborate with cross-functional teams to translate requirements into AI/data science solutions. Document and communicate technical designs, processes, and best practices. Manage projects to ensure timely completion in a dynamic environment. Requirements: Master’s or PhD in Computer Science, Engineering, Mathematics, Applied Sciences, or related field. Strong programming skills in Python, Java, or C++, with experience in TensorFlow, PyTorch, or scikit-learn. In-depth knowledge of current machine learning algorithms, AI technologies, and platforms. Experience with data engineering/processing frameworks (e.g., Databricks, Spark, Dataflow) and SQL proficiency. Solid experience in data preprocessing, feature engineering, and model evaluation. Familiarity with cloud platforms (AWS, Azure, Google Cloud) and containerization (Docker, Kubernetes) is a plus. Strong knowledge of software development best practices, version control systems, and agile methodologies. Results-driven with excellent problem-solving, communication (verbal and written), and collaboration skills. Experience in the automotive industry is highly desirable
Posted 3 weeks ago
6.0 - 11.0 years
15 - 30 Lacs
Pune
Hybrid
Software Engineer - Specialist

What you'll do: Demonstrate a deep understanding of cloud-native, distributed microservice-based architectures. Deliver solutions for complex business problems through standard Software Development Life Cycle (SDLC) practices. Build strong relationships with both internal and external stakeholders, including product, business, and sales partners. Demonstrate excellent communication skills with the ability to simplify complex problems and to dive deeper when needed. Lead strong technical teams that deliver complex software solutions that scale. Work across teams to integrate our systems with existing internal systems. Participate in a tight-knit, globally distributed engineering team. Provide deep troubleshooting skills with the ability to lead and solve production and customer issues under pressure. Leverage strong experience in full-stack software development and public cloud platforms like GCP and AWS. Mentor, coach, and develop junior and senior software, quality, and reliability engineers. Ensure compliance with secure software development guidelines and best practices. Define, maintain, and report SLAs, SLOs, and SLIs meeting EFX engineering standards in partnership with the product, engineering, and architecture teams. Collaborate with architects, SRE leads, and other technical leadership on strategic technical direction, guidelines, and best practices. Drive up-to-date technical documentation including support, end-user documentation, and run books. Responsible for implementation architecture decision-making associated with product features/stories and refactoring work decisions. Create and deliver technical presentations to internal and external technical and non-technical stakeholders, communicating with clarity and precision, and presenting complex information in a concise, audience-appropriate format.

What experience you need: Bachelor's degree or equivalent experience. 5+ years of software engineering experience. 5+ years experience writing, debugging, and troubleshooting code in mainstream Java and SpringBoot. 5+ years experience with Cloud technology: GCP, AWS, or Azure. 5+ years experience designing and developing cloud-native solutions. 5+ years experience designing and developing microservices using Java, SpringBoot, GCP SDKs, GKE/Kubernetes. 5+ years experience designing and developing big data processing solutions using Dataflow/Apache Beam, Bigtable, BigQuery, PubSub, GCS, Composer/Airflow, and others. 5+ years experience deploying and releasing software using Jenkins CI/CD pipelines, understanding infrastructure-as-code concepts, Helm Charts, and Terraform constructs.

What could set you apart: Self-starter that identifies/responds to priority shifts with minimal supervision. Strong communication and presentation skills. Strong leadership qualities. Demonstrated problem-solving skills and the ability to resolve conflicts. Experience creating and maintaining product and software roadmaps. Working in a highly regulated environment. Experience on GCP in Big data and distributed systems - Dataflow, Apache Beam, Pub/Sub, BigTable, BigQuery, GCS. Experience with backend technologies such as JAVA/J2EE, SpringBoot, Golang, gRPC, SOA, and Microservices. Source code control management systems (e.g., SVN/Git, Github, Gitlab), build tools like Maven & Gradle, and CI/CD like Jenkins or Gitlab. Agile environments (e.g., Scrum, XP). Relational databases (e.g., SQL Server, MySQL). Atlassian tooling (e.g., JIRA, Confluence, and Github).
Developing with modern JDK (v1.7+). Automated Testing: JUnit, Selenium, LoadRunner, SoapUI.
Posted 3 weeks ago
0 years
0 Lacs
Mumbai Metropolitan Region
On-site
Job Description

Are You Ready to Make It Happen at Mondelēz International? Join our Mission to Lead the Future of Snacking. Make It With Pride. You will provide technical contributions to the data science process. In this role, you are the internally recognized expert in data, building infrastructure and data pipelines/retrieval mechanisms to support our data needs.

How You Will Contribute

You will: Operationalize and automate activities for efficiency and timely production of data visuals. Assist in providing accessibility, retrievability, security and protection of data in an ethical manner. Search for ways to get new data sources and assess their accuracy. Build and maintain the transports/data pipelines and retrieve applicable data sets for specific use cases. Understand data and metadata to support consistency of information retrieval, combination, analysis, pattern recognition and interpretation. Validate information from multiple sources. Assess issues that might prevent the organization from making maximum use of its information assets.

What You Will Bring

A desire to drive your future and accelerate your career, and the following experience and knowledge: Extensive experience in data engineering in a large, complex business with multiple systems such as SAP, internal and external data, etc., and experience setting up, testing and maintaining new systems. Experience with a wide variety of languages and tools (e.g. scripting languages) to retrieve, merge and combine data. Ability to simplify complex problems and communicate to a broad audience.

In This Role

As a Senior Data Engineer, you will have the opportunity to design and build scalable, secure, and cost-effective cloud-based data solutions. You will develop and maintain data pipelines to extract, transform, and load data into data warehouses or data lakes, ensuring data quality and validation processes to maintain data accuracy and integrity. You will ensure efficient data storage and retrieval for optimal performance, and collaborate closely with data teams, product owners, and other stakeholders to stay updated with the latest cloud technologies and best practices.

Role & Responsibilities:
Design and Build: Develop and implement scalable, secure, and cost-effective cloud-based data solutions.
Manage Data Pipelines: Develop and maintain data pipelines to extract, transform, and load data into data warehouses or data lakes.
Ensure Data Quality: Implement data quality and validation processes to ensure data accuracy and integrity.
Optimize Data Storage: Ensure efficient data storage and retrieval for optimal performance.
Collaborate and Innovate: Work closely with data teams and product owners, and stay updated with the latest cloud technologies and best practices.

Technical Requirements:
Programming: Python, PySpark, Go/Java.
Database: SQL, PL/SQL.
ETL & Integration: DBT, Databricks + DLT, AecorSoft, Talend, Informatica/Pentaho/Ab-Initio, Fivetran.
Data Warehousing: SCD, Schema Types, Data Mart.
Visualization: Databricks Notebook, PowerBI (Optional), Tableau (Optional), Looker.
GCP Cloud Services: BigQuery, GCS, Cloud Function, PubSub, Dataflow, DataProc, Dataplex.
AWS Cloud Services: S3, Redshift, Lambda, Glue, CloudWatch, EMR, SNS, Kinesis.
Azure Cloud Services: Azure Datalake Gen2, Azure Databricks, Azure Synapse Analytics, Azure Data Factory, Azure Stream Analytics.
Supporting Technologies: Graph Database/Neo4j, Erwin, Collibra, Ataccama DQ, Kafka, Airflow.
Soft Skills:
Problem-Solving: The ability to identify and solve complex data-related challenges.
Communication: Effective communication skills to collaborate with Product Owners, analysts, and stakeholders.
Analytical Thinking: The capacity to analyze data and draw meaningful insights.
Attention to Detail: Meticulousness in data preparation and pipeline development.
Adaptability: The ability to stay updated with emerging technologies and trends in the data engineering field.

Within-country relocation support is available, and for candidates voluntarily moving internationally some minimal support is offered through our Volunteer International Transfer Policy.

Business Unit Summary

At Mondelēz International, our purpose is to empower people to snack right by offering the right snack, for the right moment, made the right way. That means delivering a broad range of delicious, high-quality snacks that nourish life's moments, made with sustainable ingredients and packaging that consumers can feel good about. We have a rich portfolio of strong brands globally and locally, including many household names such as Oreo, belVita and LU biscuits; Cadbury Dairy Milk, Milka and Toblerone chocolate; Sour Patch Kids candy and Trident gum. We are proud to hold the top position globally in biscuits, chocolate and candy and the second top position in gum. Our 80,000 makers and bakers are located in more than 80 countries and we sell our products in over 150 countries around the world. Our people are energized for growth and critical to us living our purpose and values. We are a diverse community that can make things happen, and happen fast. Mondelēz International is an equal opportunity employer and all qualified applicants will receive consideration for employment without regard to race, color, religion, gender, sexual orientation or preference, gender identity, national origin, disability status, protected veteran status, or any other characteristic protected by law. Job Type: Regular. Data Science - Analytics & Data Science
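To make the orchestration side of the technical requirements above more concrete, here is a minimal, hedged Airflow DAG sketch that loads files from GCS into BigQuery and then builds a small mart table. It assumes Airflow with the Google provider installed; the DAG id, bucket, dataset, and table names are placeholders, not details from the posting.

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="daily_sales_etl",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Load the day's raw JSON files from GCS into a staging table.
    load_raw = GCSToBigQueryOperator(
        task_id="load_raw_sales",
        bucket="example-landing-bucket",
        source_objects=["sales/{{ ds }}/*.json"],
        destination_project_dataset_table="analytics.raw_sales",
        source_format="NEWLINE_DELIMITED_JSON",
        write_disposition="WRITE_TRUNCATE",
    )

    # Aggregate the staging table into a simple reporting mart.
    build_mart = BigQueryInsertJobOperator(
        task_id="build_sales_mart",
        configuration={
            "query": {
                "query": "SELECT region, SUM(amount) AS total FROM analytics.raw_sales GROUP BY region",
                "useLegacySql": False,
            }
        },
    )

    load_raw >> build_mart
```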
Posted 3 weeks ago
5.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Description The Strategy and Enterprise Analytics Team provides insights and decision support tools to a broad range of business partners in sustainability and electric vehicles. We drive to deliver the most value to Ford through critical thinking, AI/ML, big data analytics and optimization techniques on Google Cloud Platform (GCP). The team leverages advanced analytics and domain knowledge to develop analytics models, transformational decision-support tools, and services to: increase profit while meeting North American regulatory compliance; support strategic initiatives for electrification, especially in public charging, home power management, and energy services; and design and build solutions that support holistic decision making at the enterprise level (OGC). We are looking for a manager / product owner to lead a multi-skill team of data scientists, data engineers and software engineers in all phases of ongoing and future analytics services and product development, including problem formulation, data identification, model development, validation, and product launch. The candidate should have great independence, exceptional collaboration and leadership skills, and self-discipline to guide original applied research and choose appropriate methodologies to solve related problems. We are especially excited about candidates with supervisory experience, passion for hands-on work, strong technical skills and a growth mindset who demonstrate a passion for developing talent and applying state-of-the-art solutions to novel and challenging problems. Our team utilizes a diverse set of tools and methodologies from different technical fields including Machine Learning, Statistical Analysis, Simulation, Big Data platforms and more. We understand that you cannot be an expert in everything, and the set of techniques and technologies we are using today may change over time. Given the required qualifications, we are looking for candidates who are lifelong learners, driven, and curious. Responsibilities Lead a group of data scientists, data engineers, and software engineers to solve exciting and meaningful problems. Translate business needs into analytical problems, work hands-on along with the team, judge among candidate ML models, contribute towards best practices in model development, conduct code reviews, and research state-of-the-art techniques and apply them for the team and business needs.
Develop and apply analytical solutions to address real-world automotive and quality challenges. Initiate and manage cross-functional projects, building relationships with business partners and influencing decision makers. Ability to work well under limited supervision and use good judgment to know when to update and seek guidance from leadership. Communicate and present insights to business customers and executives. Collaborate internally and externally to identify new and novel data sources and explore their potential use in developing actionable business results. Explore emerging technologies and analytic solutions for use in quantitative model development. Develop and sustain a highly performing team. Ability to collaborate, negotiate, and work effectively with coworkers at all levels. Work with the Product Line Owner on creating demand and aligning the requirements to the business demands.

Qualifications Master’s degree in Engineering, Data Science, Computer Science, Statistics, Industrial Engineering, or other data-related fields. 5+ years of domain experience in delivering analytics solutions in any of these areas (Sustainability / Regulatory, Electrification, Legal, Cycle Planning). 10+ years of experience in the analytics domain with 3+ years of supervisory experience. Familiarity with SQL, Spark, Hive, and other big data technologies. Familiarity with Google Cloud Platform, Python, Spark, Dataflow, BigQuery, GitHub, Qlik Sense, CPLEX, Mach1ML, Power BI. Demonstrated performance in developing analytical models and deploying them in GCP. Strong drive for results, sense of urgency, and attention to detail. Strong verbal and written communication skills with the ability to present to cross-functional levels of management. Ability to work in a fast-paced environment with global resources under short response times and changing business needs. Familiarity with NLP, Deep Learning, neural network architectures including CNNs, RNNs, Embeddings, Transfer Learning, and Transformers. BEV experience a plus. Deep understanding of Agile & PDO processes. Software delivery experience a plus.
Posted 3 weeks ago
0 years
3 - 7 Lacs
Hyderābād
On-site
Job description Some careers shine brighter than others. If you’re looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further. HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions. We are currently seeking an experienced professional to join our team in the role of Consultant Specialist. In this role, you will: The DevOps Engineering job is responsible for developing automations across the Technology delivery lifecycle including construction, testing, release and ongoing service management, and monitoring of a product or service within a Technology team. They will be required to continually enhance their skills within a number of specialisms which include CI/CD, automation, pipeline development, security, testing, and operational support. This role will carry out some or all of the following activities: The role of the DevOps engineer is to facilitate the application teams across the Bank to deploy their applications across GCP services like GKE Container, BigQuery, Dataflow, PubSub, and Kafka. The DevOps Engineer should be the go-to person in case an application team faces any issue during platform adoption, onboarding, deployment and environment troubleshooting. Ensure service resilience, service sustainability and recovery time objectives are met for all the software solutions delivered. Responsible for automating the continuous integration / continuous delivery pipeline within a DevOps Product/Service team, driving a culture of continuous improvement. Keep up to date and have expertise on current tools, technologies and areas like cyber security and regulations pertaining to aspects like data privacy, consent, data residency etc. that are applicable. End-to-end accountability for a product or service, identifying and developing the most appropriate technology solutions to meet customer needs as part of the Customer Journey. Liaise with other engineers, architects, and business stakeholders to understand and drive the product or service’s direction. Analyze production errors to define and create tools that help mitigate problems in the system design stage, applying user-defined integrations and improving the user experience.
Requirements To be successful in this role, you should meet the following requirements: Bachelor's degree in Computer Science or related disciplines. 6 or more years of hands-on development experience building fully self-serve, observable solutions using Infrastructure and Policy as Code. Proficiency developing with modern programming languages and the ability to rapidly develop proof-of-concepts. Ability to work with geographically distributed and cross-functional teams. Expert in code deployment tools (Jenkins, Puppet, Ansible, Git, Selenium, and Chef). Expert in automation tools (CloudFormation, Terraform, shell script, Helm, Ansible). Familiar with containers (Docker, Docker Compose, Kubernetes, GKE). Familiar with monitoring (Datadog, Grafana, Prometheus, AppDynamics, New Relic, Splunk). The successful candidate will also meet the following requirements: Good understanding of GCP Cloud or hybrid cloud approach implementations. Good understanding and experience of MuleSoft / PCF / any Gateway Server implementations. Hands-on experience with the Kong API Gateway platform. Good understanding and experience of Middleware and MQ areas. Familiar with infrastructure support: Apache Gateway, runtime server configurations, SSL cert setup, etc. You’ll achieve more when you join HSBC. www.hsbc.com/careers HSBC is committed to building a culture where all employees are valued, respected and opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working and opportunities to grow within an inclusive and diverse environment. Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website. Issued by – HSBC Software Development India
Posted 3 weeks ago
5.0 - 8.0 years
22 - 24 Lacs
Hyderābād
On-site
Job Title: Data Engineer – GCP BigQuery, Python, CI/CD Location: Offshore Ascendion location Experience Level: 5-8 years Job Type: Full-time Start date – July 15th Budget - 25 LPA (Including Variable Pay) Job Summary: We are seeking a skilled and passionate Data Engineer to join our growing team. The ideal candidate will have strong experience in developing scalable data pipelines using Google Cloud Platform (GCP), BigQuery, Python, and CI/CD tools. You will be responsible for designing, building, and maintaining data solutions that enable advanced analytics and business insights. Key Responsibilities: Design, develop, and optimize scalable data pipelines and ETL processes using GCP services, primarily BigQuery. Write efficient and reusable Python scripts to transform, clean, and enrich large datasets. Implement CI/CD pipelines for data workflows to ensure robust and reliable deployments. Collaborate with data scientists, analysts, and business stakeholders to gather requirements and deliver data solutions. Monitor and troubleshoot data workflows and pipelines in production environments. Apply best practices in data modeling, performance tuning, and cost optimization in BigQuery. Ensure data quality, governance, and security throughout the data lifecycle. Required Skills and Qualifications: Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field. 5-8 years of experience in Data Engineering roles. Hands-on experience with Google Cloud Platform (GCP), especially BigQuery, Cloud Storage, Dataflow, or Pub/Sub. Strong programming skills in Python. Experience in building CI/CD pipelines using tools such as Git, Jenkins, Cloud Build, or Terraform. Solid understanding of data warehousing, data modeling (star/snowflake schema), and performance optimization. Familiarity with Agile/Scrum development methodologies. Excellent problem-solving skills and the ability to work independently. Job Type: Full-time Pay: ₹2,200,000.00 - ₹2,400,000.00 per year Schedule: Day shift, fixed shift Work Location: In person
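For illustration of the BigQuery cost-optimization and reusable-Python-script work mentioned above, here is a minimal sketch using the google-cloud-bigquery client. The project, dataset, table, and date values are placeholder assumptions; the maximum_bytes_billed setting is one simple way to add a cost guardrail to ad-hoc queries.

```python
from google.cloud import bigquery

client = bigquery.Client(project="example-project")

# Parameterized query keeps the SQL reusable and avoids string concatenation.
query = """
    SELECT customer_id, SUM(amount) AS total_spend
    FROM `example-project.analytics.orders`
    WHERE order_date >= @start_date
    GROUP BY customer_id
"""
job_config = bigquery.QueryJobConfig(
    query_parameters=[
        bigquery.ScalarQueryParameter("start_date", "DATE", "2024-01-01")
    ],
    # Fail fast if the query would scan more than ~1 GB: a simple cost guardrail.
    maximum_bytes_billed=10**9,
)

results = client.query(query, job_config=job_config).result()
for row in results:
    print(row.customer_id, row.total_spend)
```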
Posted 3 weeks ago
0 years
3 - 5 Lacs
Chennai
On-site
Key Responsibilities: Design and develop robust ETL pipelines using Python, PySpark, and GCP services. Build and optimize data models and queries in BigQuery for analytics and reporting. Ingest, transform, and load structured and semi-structured data from various sources. Collaborate with data analysts, scientists, and business teams to understand data requirements. Ensure data quality, integrity, and security across cloud-based data platforms. Monitor and troubleshoot data workflows and performance issues. Automate data validation and transformation processes using scripting and orchestration tools. Required Skills & Qualifications: Hands-on experience with Google Cloud Platform (GCP), especially BigQuery. Strong programming skills in Python and/or PySpark. Experience in designing and implementing ETL workflows and data pipelines. Proficiency in SQL and data modeling for analytics. Familiarity with GCP services such as Cloud Storage, Dataflow, Pub/Sub, and Composer. Understanding of data governance, security, and compliance in cloud environments. Experience with version control (Git) and agile development practices. About Virtusa Teamwork, quality of life, professional and personal development: values that Virtusa is proud to embody. When you join us, you join a team of 27,000 people globally that cares about your growth — one that seeks to provide you with exciting projects, opportunities and work with state of the art technologies throughout your career with us. Great minds, great potential: it all comes together at Virtusa. We value collaboration and the team environment of our company, and seek to provide great minds with a dynamic place to nurture new ideas and foster excellence. Virtusa was founded on principles of equal opportunity for all, and so does not discriminate on the basis of race, religion, color, sex, gender identity, sexual orientation, age, non-disqualifying physical or mental disability, national origin, veteran status or any other basis covered by appropriate law. All employment is decided on the basis of qualifications, merit, and business need.
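As a rough sketch of the PySpark-on-GCP ETL work described in this posting, the snippet below reads semi-structured JSON from Cloud Storage, applies light cleaning, and writes to BigQuery. It assumes a Dataproc cluster with the Spark BigQuery connector available; the bucket, dataset, and column names are illustrative assumptions only.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("gcs-to-bq-etl").getOrCreate()

# Ingest semi-structured JSON landed in a GCS bucket.
raw = spark.read.json("gs://example-landing-bucket/events/2024-07-01/*.json")

# Basic quality steps: drop null event types, parse timestamps, deduplicate.
cleaned = (
    raw.filter(F.col("event_type").isNotNull())
       .withColumn("event_ts", F.to_timestamp("event_time"))
       .dropDuplicates(["event_id"])
)

# Write to BigQuery via the Spark BigQuery connector (requires a temp GCS bucket).
(
    cleaned.write.format("bigquery")
    .option("table", "example-project.analytics.events")
    .option("temporaryGcsBucket", "example-temp-bucket")
    .mode("append")
    .save()
)
```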
Posted 3 weeks ago
0 years
0 Lacs
Andhra Pradesh
On-site
Design and develop robust ETL pipelines using Python, PySpark, and GCP services. Build and optimize data models and queries in BigQuery for analytics and reporting. Ingest, transform, and load structured and semi-structured data from various sources. Collaborate with data analysts, scientists, and business teams to understand data requirements. Ensure data quality, integrity, and security across cloud-based data platforms. Monitor and troubleshoot data workflows and performance issues. Automate data validation and transformation processes using scripting and orchestration tools. Required Skills & Qualifications: Hands-on experience with Google Cloud Platform (GCP), especially BigQuery. Strong programming skills in Python and/or PySpark. Experience in designing and implementing ETL workflows and data pipelines. Proficiency in SQL and data modeling for analytics. Familiarity with GCP services such as Cloud Storage, Dataflow, Pub/Sub, and Composer. Understanding of data governance, security, and compliance in cloud environments. Experience with version control (Git) and agile development practices. About Virtusa Teamwork, quality of life, professional and personal development: values that Virtusa is proud to embody. When you join us, you join a team of 27,000 people globally that cares about your growth — one that seeks to provide you with exciting projects, opportunities and work with state of the art technologies throughout your career with us. Great minds, great potential: it all comes together at Virtusa. We value collaboration and the team environment of our company, and seek to provide great minds with a dynamic place to nurture new ideas and foster excellence. Virtusa was founded on principles of equal opportunity for all, and so does not discriminate on the basis of race, religion, color, sex, gender identity, sexual orientation, age, non-disqualifying physical or mental disability, national origin, veteran status or any other basis covered by appropriate law. All employment is decided on the basis of qualifications, merit, and business need.
Posted 3 weeks ago
6.0 - 9.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
Role Description Location: Any UST Location Experience: 6 to 9 years Mandatory Skills: PySpark, GCP, Hadoop, Hive, AWS Good to Have: CI/CD and DevOps experience Job Description We are seeking a highly skilled Senior Big Data Engineer to join our team at UST. The ideal candidate will have solid experience in Big Data technologies, cloud platforms, and data processing frameworks with a strong focus on PySpark and Google Cloud Platform (GCP). Key Responsibilities Design, develop, and maintain scalable data pipelines and ETL workflows using PySpark, Hadoop, and Hive. Deploy and manage big data workloads on cloud platforms like GCP and AWS. Work closely with cross-functional teams to understand data requirements and deliver high-quality solutions. Optimize data processing jobs for performance and cost-efficiency on cloud infrastructure. Implement automation and CI/CD pipelines to streamline deployment and monitoring of data workflows. Ensure data security, governance, and compliance in cloud environments. Troubleshoot and resolve data issues, monitoring job executions and system health. Mandatory Skills PySpark: Strong experience in developing data processing jobs and ETL pipelines. Google Cloud Platform (GCP): Hands-on experience with BigQuery, Dataflow, Dataproc, or similar services. Hadoop Ecosystem: Expertise with Hadoop, Hive, and related big data tools. AWS: Familiarity with AWS data services like S3, EMR, Glue, or Redshift. Strong SQL and data modeling skills. Good To Have Experience with CI/CD tools and DevOps practices (Jenkins, GitLab, Terraform, etc.). Containerization and orchestration knowledge (Docker, Kubernetes). Experience with Infrastructure as Code (IaC). Knowledge of data governance and data security best practices. Skills: Spark, Hadoop, Hive, GCP
Posted 3 weeks ago
6.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
Role Description Job Summary: We are seeking a Senior Data Engineer with strong hands-on experience in PySpark, Big Data technologies, and cloud platforms (preferably GCP). The ideal candidate will design, implement, and optimize scalable data pipelines while driving technical excellence and process improvements. You will collaborate with cross-functional teams to solve complex data problems and ensure delivery of high-quality, cost-effective data solutions. Roles & Responsibilities Design & Development: Develop scalable and efficient data pipelines using PySpark, Hive, SQL, Spark, and Hadoop. Translate high-level business requirements and design documents (HLD/LLD) into technical specifications and implementation. Create and maintain architecture and design documentation. Performance Optimization & Quality Monitor, troubleshoot, and optimize data workflows for cost, performance, and reliability. Perform root cause analysis (RCA) on defects and implement mitigation strategies. Ensure adherence to coding standards, version control practices, and testing protocols. Collaboration & Stakeholder Engagement Interface with product managers, data stewards, and customers to clarify requirements. Conduct technical presentations, design walkthroughs, and product demos. Provide timely updates, escalations, and support during UAT and production rollouts. Project & People Management Manage delivery of data modules/user stories with a focus on timelines and quality. Set and review FAST goals for self and team; provide mentorship and technical guidance. Maintain team engagement and manage team member aspirations through regular feedback and career support. Compliance & Knowledge Management Ensure compliance with mandatory trainings and engineering processes. Contribute to and consume project documentation, templates, checklists, and domain-specific knowledge. Review and approve reusable assets developed by the team. Must-Have Skills 6+ years of experience in Data Engineering or related roles. Strong proficiency in PySpark, SQL, Spark, Hive, and the Hadoop ecosystem. Hands-on experience with Google Cloud Platform (GCP) or equivalent cloud services (e.g., AWS, Azure). Expertise in designing, building, testing, and deploying large-scale data processing systems. Sound understanding of data architecture, ETL frameworks, and batch/streaming data pipelines. Strong knowledge of Agile methodologies (Scrum/Kanban). Experience with code reviews, version control (Git), and CI/CD tools. Excellent communication skills – both verbal and written. Good-to-Have Skills GCP Professional Data Engineer Certification or equivalent. Experience with Airflow, Dataflow, BigQuery, or similar GCP-native tools. Knowledge of data modeling techniques and data governance. Exposure to domain-specific projects (e.g., BFSI, Healthcare, Retail). Experience with Docker, Kubernetes, or other containerization tools. Working knowledge of test automation and performance testing frameworks. Skills: Spark, Hadoop, Hive, GCP
Posted 3 weeks ago
5.0 - 10.0 years
15 - 30 Lacs
Pune, Bengaluru, Delhi / NCR
Hybrid
Mandatory Skills: Apache Beam, BigQuery, Dataflow, Dataproc, Composer, Airflow, PySpark, Python, SQL.
Posted 3 weeks ago
8.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Who we are? Searce means ‘a fine sieve’ & indicates ‘to refine, to analyze, to improve’. It signifies our way of working: To improve to the finest degree of excellence, ‘solving for better’ every time. Searcians are passionate improvers & solvers who love to question the status quo. The primary purpose of all of us, at Searce, is driving intelligent, impactful & futuristic business outcomes using new-age technology. This purpose is driven passionately by HAPPIER people who aim to become better, everyday.

What are we looking for? Are you a keen learner? Excellent mentor? Passionate coach? We’re looking for someone who’s all three! We’re on the lookout for someone who can design and implement our data processing pipelines for all kinds of data sources.

What you'll do as a Manager - Data Engineering with us?
1. You have worked in environments of different shapes and sizes. On-premise, private cloud, public cloud, hybrid, all Windows / Linux / healthy mix. Thanks to this experience, you can connect the dots quickly and understand client pain points.
2. You are curious. You keep up with the breakneck speed of innovation on public cloud. When something new gets released or an existing service changes - you try it out and you learn.
3. You have a strong database background - relational and non-relational alike. a. MySQL, PostgreSQL, SQL Server, Oracle. b. MongoDB, Cassandra and other NoSQL databases. c. Strong SQL query writing experience. d. HA, DR, performance tuning, migrations. e. Experience with the cloud offerings - RDS, Aurora, CloudSQL, Azure SQL.
4. You have hands-on experience with designing, deploying, migrating enterprise data warehouses and data lakes. a. Familiarity with migrations from the likes of Netezza, Greenplum, Oracle to BigQuery/RedShift/Azure Data Warehouse. b. Dimensional data modelling, reporting & analytics. c. Designing ETL pipelines.
5. You have experience with Advanced Analytics - ability to work with the Applied AI team and assist in delivering predictive analytics, ML models etc.
6. You have experience with the BigData ecosystem. a. Self-managed Hadoop clusters, distributions like Hortonworks and the cloud equivalents like EMR, Dataproc, HDInsight. b. Apache Hudi, Hive, Presto, Spark, Flink, Kafka etc.
7. You have hands-on experience with tools: Apache Airflow, Talend, Tableau, Pandas, DataFlow, Kinesis, Stream Analytics etc.

What are the must-haves to join us?
1. Is Education overrated? Yes. We believe so. However there is no way to locate you otherwise. So we might have to look for a Bachelor's or Master's Degree in engineering from a reputed institute, or you should have been coding since your 6th grade. And the latter is better. We will find you faster if you specify the latter in some manner. :)
2. 8-10+ years of overall IT experience with a strong data engineering and business intelligence background.
3. Minimum 3 years of experience on projects with GCP / AWS / Azure.
4. Minimum 3+ years of experience in data & analytics delivery and management consulting working with Data Migration, ETL, Business Intelligence, Data Quality, Data Analytics and AI tools.
5. 4+ years of hands-on experience with Python & SQL.
6. Experience across data solutions including data lake, warehousing, ETL, streaming, reporting and analytics tools.
7. Prior experience in recruitment, training & grooming of geeks.
8. Great to have certifications: a. GCP and/or AWS, professional level. b. Your contributions to the community - tech blogs, Stack Overflow etc.
9. Strong communication skills to communicate across a diverse audience with varying levels of business and technical expertise. So, if you are passionate about tech, the future & what you read above (we really are!), apply here to experience the ‘Art of Possible’.
Posted 3 weeks ago
10.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Role: GCP Application & Infra Tech SME Exp: 10+ Years Location: Hyderabad Notice: Immediate Knowledge/Experience: Knowledge of the banking domain. GCP Solution Architect certification would be desirable. Skills Required: 10-12+ years of experience in cloud architecture or systems engineering. Experience in driving GCP application support and data analytics projects/ecosystems independently. Experience in GCP IaaS such as GCE, GAE, GKE, VPC, DNS, Interconnect VPN, CDN, Cloud Storage, FileStore, Firebase, Deployment Manager, Stackdriver. Experience in GCP services such as Cloud Endpoints, Dataflow, Dataproc, Datalab, Dataprep, Cloud Composer, Pub/Sub, Cloud Functions. Experience with Terraform and DevOps (CI/CD pipelines). Experience establishing technical strategy and architecture at the enterprise level. Experience leading GCP Cloud project delivery. Experience working with infrastructure-as-code tools such as Ansible, Chef, or Terraform. Experience in publishing GCP cost dashboards, alerting and monitoring. Experience creating apps utilizing Kubernetes and containers, particularly on the Google Cloud Platform. Should have experience working in an agile and DevOps environment using team collaboration tools such as Confluence and JIRA. Programming skills and hands-on experience in Python desirable. Proficiency in working with cloud-based native data stores/databases. Knowledge of design patterns for GCP third-party tools setup and native tools usage. Experience and ability to manage a small team of tech specialists. Excellent multitasking ability - must have the ability to track multiple issues, effectively manage time and competing priorities, and drive results through partner organizations. Strong communication skills (verbal, written, and presentation of complex information and data). Experience of planning and prioritizing their own time effectively, aware of their responsibilities and committed to delivering these efficiently. Coordinate activities of technical specialists to automate the setup and configuration of environments. Monitor and guarantee uptime of prod environments. Provide ongoing support for environments: 24/7 shift model support & Service Management (Production Support).
Posted 3 weeks ago
15.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Experience - 15+ years to 19 years (Hands-on Role) Reporting to - Director Location - Pune Skills - Java/Scala/Python, GCP/AWS/Azure, BigQuery, Data Engineering, Apache Spark or Beam In this role, you will: Engineer the data transformations and analysis. Be the technology SME on the real-time stream processing paradigm. Bring your experience in low-latency, high-throughput, auto-scaling platform design and implementation. Implement an end-to-end platform service, assessing the operations and non-functional needs clearly. Mentor and coach the engineering and SME talent to realize their potential and build a high-performance team. Manage complex end-to-end functional transformation modules from planning and estimation to execution. Improve the platform standards by bringing new ideas and solutions to the table. Qualifications To be successful in this role, you should meet the following requirements: 15+ years of experience in data engineering technology and tools. Must have experience with Java / Scala based implementations for enterprise-wide platforms. Experience with Apache Beam, Google Dataflow, and Apache Kafka for the real-time stream processing technology stack. Complex stateful processing of events with partitioning for higher throughput. Have dealt with fine-tuning throughput and improving the performance aspects of data pipelines. Experience with analytical data store optimizations, querying and managing them. Experience with alternate data engineering tools (Apache Beam, Apache Flink, Apache Spark, etc.). Reason about your decisions and have the ability to convince the stakeholders and wider technology team. Set the highest standards of integrity and ethics and lead by example on technology implementations.
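As a toy illustration of the partitioned, stateful stream consumption this role describes, here is a hedged kafka-python sketch. The topic name, brokers, and message schema are assumptions for illustration; in a production low-latency platform the keyed state would live in the stream processor (e.g. Beam or Flink keyed state), not in local memory.

```python
import json
from collections import defaultdict

from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "trade-events",                          # hypothetical topic
    bootstrap_servers=["localhost:9092"],    # placeholder brokers
    group_id="throughput-demo",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    enable_auto_commit=False,
)

# Per-key running totals: a stand-in for keyed state in a real stream processor.
totals = defaultdict(float)

for message in consumer:
    event = message.value
    totals[event["account_id"]] += float(event.get("amount", 0.0))
    # Explicit commit after processing gives at-least-once semantics
    # (committing per message is slow; real pipelines batch commits).
    consumer.commit()
```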
Posted 3 weeks ago
3.0 - 8.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Notice Period - 0-30 days Location - Gurgaon Experience - 3-8 years Responsibilities - The candidate should have extensive production experience (1-2 years) in GCP; other cloud experience would be a strong bonus. Strong background in data engineering, with 2-3 years of experience in Big Data technologies including Hadoop, NoSQL, Spark, Kafka, etc. Exposure to production applications is a must, along with operating knowledge of cloud computing platforms (GCP, especially BigQuery, Dataflow, Dataproc, Storage, VMs, Networking, Pub/Sub, Cloud Functions, and Composer services). Qualifications - BE / B.Tech / MCA / M.Tech / M.Com Required Skills - Experience in the Big Data space (Hadoop stack like Spark, M/R, HDFS, Hive, HBase, etc.) and contributions to open-source Big Data technologies. Must have: Operating knowledge of cloud computing platforms (GCP, especially BigQuery, Dataflow, Dataproc, Storage, VMs, Networking, Pub/Sub, Cloud Functions, and Composer services).
Posted 3 weeks ago
10.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
About The Organisation
DataFlow Group is a pioneering global provider of specialized Primary Source Verification (PSV) solutions and background screening and immigration compliance services that assist public and private organizations in mitigating risks to make informed, cost-effective decisions regarding their Applicants and Registrants.

About The Role
We are looking for a highly skilled and experienced Senior ETL & Data Streaming Engineer with over 10 years of experience to play a pivotal role in designing, developing, and maintaining our robust data pipelines. The ideal candidate will have deep expertise in both batch ETL processes and real-time data streaming technologies, coupled with extensive hands-on experience with AWS data services. A proven track record of working with Data Lake architectures and traditional Data Warehousing environments is essential.

Duties And Responsibilities
Design, develop, and implement highly scalable, fault-tolerant, and performant ETL processes using industry-leading ETL tools to extract, transform, and load data from various source systems into our Data Lake and Data Warehouse.
Architect and build batch and real-time data streaming solutions using technologies such as Talend, Informatica, Apache Kafka, or AWS Kinesis to support immediate data ingestion and processing requirements (see the sketch below).
Utilize and optimize a wide array of AWS data services.
Collaborate with data architects, data scientists, and business stakeholders to understand data requirements and translate them into efficient data pipeline solutions.
Ensure data quality, integrity, and security across all data pipelines and storage solutions.
Monitor, troubleshoot, and optimize existing data pipelines for performance, cost-efficiency, and reliability.
Develop and maintain comprehensive documentation for all ETL and streaming processes, data flows, and architectural designs.
Implement data governance policies and best practices within the Data Lake and Data Warehouse environments.
Mentor junior engineers and contribute to fostering a culture of technical excellence and continuous improvement.
Stay abreast of emerging technologies and industry best practices in data engineering, ETL, and streaming.

Qualifications
10+ years of progressive experience in data engineering, with a strong focus on ETL, ELT, and data pipeline development.
Deep expertise in ETL tools: extensive hands-on experience with commercial ETL tools (Talend).
Strong proficiency in data streaming technologies: proven experience with real-time data ingestion and processing using platforms such as AWS Glue, Apache Kafka, AWS Kinesis, or similar.
Extensive AWS data services experience: proficiency with AWS S3 for data storage and management; hands-on experience with AWS Glue for ETL orchestration and data cataloging; familiarity with AWS Lake Formation for building secure data lakes; experience with AWS EMR for big data processing is good to have.
Data Warehouse (DWH) knowledge: strong background in traditional data warehousing concepts, dimensional modeling (star schema, snowflake schema), and DWH design principles.
Programming languages: proficient in SQL and at least one scripting language (e.g., Python, Scala) for data manipulation and automation.
Database skills: strong understanding of relational databases and NoSQL databases.
Version control: experience with version control systems (e.g., Git).
Problem-solving: excellent analytical and problem-solving skills with a keen eye for detail.
Communication: strong verbal and written communication skills, with the ability to articulate complex technical concepts to both technical and non-technical audiences. (ref:hirist.tech)
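As a minimal, illustrative sketch of the real-time ingestion path described above (an event pushed to AWS Kinesis and a raw copy landed in an S3-based Data Lake) using boto3; the stream name, bucket, and key layout are placeholders, and a production pipeline would add batching, retries, and schema management.

```python
# Minimal sketch: push an event to Kinesis and land a raw copy in the S3 data lake.
# Stream name, bucket, and key layout are placeholders.
import json
from datetime import datetime, timezone

import boto3

kinesis = boto3.client("kinesis")
s3 = boto3.client("s3")

event = {"order_id": "A-1001", "amount": 42.50, "ts": datetime.now(timezone.utc).isoformat()}
payload = json.dumps(event).encode("utf-8")

# Real-time path: the partition key controls shard distribution.
kinesis.put_record(StreamName="orders-stream", Data=payload, PartitionKey=event["order_id"])

# Raw path: write the same record to the data lake's raw zone, partitioned by date.
s3.put_object(
    Bucket="my-data-lake-raw",
    Key=f"orders/dt={event['ts'][:10]}/{event['order_id']}.json",
    Body=payload,
)
```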
Posted 3 weeks ago
10.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
About The Organisation
DataFlow Group is a pioneering global provider of specialized Primary Source Verification (PSV) solutions and background screening and immigration compliance services that assist public and private organizations in mitigating risks to make informed, cost-effective decisions regarding their Applicants and Registrants.

About The Role
We're currently searching for an experienced business analyst to help guide our organization into the future. From researching progressive systems solutions to evaluating their impacts, the ideal candidate will be a detailed planner, expert communicator, and top-notch analyst. This person should also be wholly committed to the discovery and development of innovative solutions in an ever-changing digital landscape.

Duties And Responsibilities
Strategic Alignment: Collaborate closely with senior leadership (e.g., C-suite executives, Directors) to understand their strategic goals, key performance indicators (KPIs), and critical information needs.
Requirements Elicitation & Analysis: Facilitate workshops, interviews, and other elicitation techniques to gather detailed business requirements for corporate analytics dashboards. Analyze and document these requirements clearly, concisely, and unambiguously, ensuring alignment with overall business strategy.
User Story & Acceptance Criteria Definition: Translate high-level business requirements into detailed user stories with clear and measurable acceptance criteria for the development team.
Data Understanding & Mapping: Work with data owners and subject matter experts to understand underlying data sources, data quality, and data governance policies relevant to the dashboards. Collaborate with the development team on data mapping and transformation logic.
Dashboard Design & Prototyping Collaboration: Partner with UI/UX designers and the development team to conceptualize and prototype dashboard layouts, visualizations, and user interactions that effectively communicate key insights to senior stakeholders. Provide feedback and ensure designs meet business requirements and usability standards.
Stakeholder Communication & Management: Act as the central point of contact between senior leadership and the development team. Proactively communicate progress, challenges, and key decisions to all stakeholders. Manage expectations and ensure alignment throughout the project lifecycle.
Prioritization & Backlog Management: Work with stakeholders to prioritize dashboard development based on business value and strategic importance. Maintain and groom the product backlog, ensuring it reflects current priorities and requirements.
Testing & Validation Support: Support the testing phase by reviewing test plans, participating in user acceptance testing (UAT), and ensuring the delivered dashboards meet the defined requirements and acceptance criteria.
Training & Documentation: Develop and deliver training materials and documentation for senior users on how to effectively utilize the new dashboards and interpret the presented data.
Continuous Improvement: Gather feedback from users post-implementation and work with the development team to identify areas for improvement and future enhancements to the corporate analytics platform.
Industry Best Practices: Stay abreast of the latest trends and best practices in business intelligence, data visualization, and analytics.
Project Management: Develop and maintain project plans for agreed initiatives in collaboration with stakeholders. Monitor project progress against defined timelines; prepare and present regular project status reports to stakeholders.

Qualifications
Bachelor's degree in Business Administration, Computer Science, Information Systems, Economics, Finance, or a related field.
Minimum of 10+ years of experience as a Business Analyst, with a significant focus on business intelligence, data analytics, and dashboard development projects.
Proven experience leading requirements gathering and analysis efforts with senior leadership and executive stakeholders, and the ability to translate complex business requirements into clear and actionable technical specifications.
Demonstrable experience managing BI and dashboarding projects, including project planning, risk management, and stakeholder communication.
Strong understanding of reporting, data warehousing concepts, ETL processes, and data modeling principles.
Excellent knowledge of data visualization best practices and principles of effective dashboard design.
Experience working with common business intelligence and data visualization tools (e.g., Tableau, Power BI, Qlik Sense).
Exceptional communication (written and verbal), presentation, and interpersonal skills, with the ability to effectively communicate with both business and technical audiences.
Strong facilitation and negotiation skills to lead workshops and drive consensus among diverse stakeholder groups.
Excellent analytical and problem-solving skills with keen attention to detail.
Ability to work independently and manage multiple priorities in a fast-paced environment.
Experience with Agile methodologies (e.g., Scrum, Kanban). (ref:hirist.tech)
Posted 3 weeks ago
0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Roles And Responsibilities
Contribute to designing cloud architectures and integration modules for enterprise-level systems.
Manage and mentor the technical team and ensure quality delivery of solutions as per the process.
Lead engagements with partners and customers, including stakeholder management, requirements gathering, and solution design, along with development and delivery.
Collaborate with Program Management, Engineering, User Experience, and Product teams to identify gaps and work with cross-functional teams to design solutions.

Essential Skills
Experience developing and managing scalable, high-performance production systems.
Some experience working in Artificial Intelligence (AI) or RPA.
Developing, designing, and maintaining high-quality production applications written in NodeJS or Python, with a strong grounding in data structures, ML algorithms, and software design.
Experience with complex API integrations and application development modules.
Strong skills in designing database schemas, for both SQL and NoSQL databases.
Experience with cloud technologies such as GCP/AWS/Azure, leveraging serverless architectures and technologies like Cloud Functions, AWS Lambda, Google Dataflow, and Google Pub/Sub (see the sketch below).
Design, build, manage, and operate the continuous delivery framework and tools, and act as a subject matter expert on CI/CD for developer teams.
Full-stack development background with front-end experience in Angular, jQuery, or other JS frameworks.
Strong problem-solving ability.
The ability to multitask and prioritize.
Experience with Test-Driven Development and Agile methodologies.
Good communication skills.
Experience using tools like Git and Jira; Confluence is a plus.
A team player who can collaborate with all stakeholders, with strong interpersonal skills.
Self-starter with a drive to technically mentor your cohort of developers.

Good To Have
Experience designing reusable and scalable architecture for cloud applications.
Experience managing the design and production implementation of chat and voice bots.
Exposure to developing, maintaining, and monitoring microservices.
Experience in one or more of: chat/voice bot development, machine learning, Natural Language Processing (NLP), and contact center technologies.
Application security.
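A minimal sketch of the serverless pattern referenced above: a Pub/Sub-triggered Cloud Function written against the 1st-gen Python runtime. The message payload shape and the "type"-based routing are assumptions for illustration only.

```python
# Minimal sketch of a Pub/Sub-triggered Cloud Function (1st-gen Python runtime).
# The payload shape and routing logic are illustrative assumptions.
import base64
import json


def handle_event(event, context):
    """Triggered by a Pub/Sub message; decodes the payload and routes it."""
    raw = base64.b64decode(event["data"]).decode("utf-8")
    message = json.loads(raw)

    # Route by a hypothetical "type" field; real logic would call downstream services.
    if message.get("type") == "chat":
        print(f"Routing chat event {message.get('id')} to the bot pipeline")
    else:
        print(f"Unhandled event type: {message.get('type')}")
```

The same handler can be deployed with `gcloud functions deploy` and bound to a Pub/Sub topic trigger, keeping the event routing fully serverless.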
Posted 3 weeks ago
5.0 years
0 Lacs
India
Remote
Client Type: US
Client Location: Remote
The hourly rate is negotiable.

About the Role
We're creating a new certification: Google AI Ecosystem Architect (Gemini & DeepMind) - Subject Matter Expert. This course is designed for technical learners who want to understand and apply the capabilities of Google's Gemini models and DeepMind technologies to build powerful, multimodal AI applications.
We're looking for a Subject Matter Expert (SME) who can help shape this course from the ground up. You'll work closely with a team of learning experience designers, writers, and other collaborators to ensure the course is technically accurate, industry-relevant, and instructionally sound.

Responsibilities
As the SME, you'll partner with learning experience designers and content developers to:
Translate real-world Gemini and DeepMind applications into accessible, hands-on learning for technical professionals.
Guide the creation of labs and projects that allow learners to build pipelines for image-text fusion, deploy Gemini APIs, and experiment with DeepMind's reinforcement learning libraries.
Contribute technical depth across activities, from high-level course structure down to example code, diagrams, voiceover scripts, and data pipelines.
Ensure all content reflects current, accurate usage of Google's multimodal tools and services.
Be available during U.S. business hours to support project milestones, reviews, and content feedback.
This role is an excellent fit for professionals with deep experience in AI/ML, Google Cloud, and a strong familiarity with multimodal systems and the DeepMind ecosystem.

Essential Tools & Platforms
A successful SME in this role will demonstrate fluency and hands-on experience with the following:
Google Cloud Platform (GCP): Vertex AI (particularly Gemini integration, model tuning, and multimodal deployment); Cloud Functions and Cloud Run (for inference endpoints); BigQuery and Cloud Storage (for handling large image-text datasets); AI Platform Notebooks or Colab Pro.
Google DeepMind Technologies: JAX and Haiku (for neural network modeling and research-grade experimentation); DeepMind Control Suite or DeepMind Lab (for reinforcement learning demonstrations); RLax or TF-Agents (for building and modifying RL pipelines).
AI/ML & Multimodal Tooling: Gemini APIs and SDKs (image-text fusion, prompt engineering, output formatting); TensorFlow 2.x and PyTorch (for model interoperability); Label Studio and Cloud Vision API (for annotation and image-text preprocessing).
Data Science & MLOps: DVC or MLflow (for dataset and model versioning); Apache Beam or Dataflow (for processing multimodal input streams); TensorBoard or Weights & Biases (for visualization).
Content Authoring & Collaboration: GitHub or Cloud Source Repositories; Google Docs, Sheets, and Slides; screen recording tools like Loom or OBS Studio.

Required skills and experience
Demonstrated hands-on experience building, deploying, and maintaining sophisticated AI-powered applications using Gemini APIs/SDKs within the Google Cloud ecosystem, especially in Firebase Studio and VS Code.
Proficiency in designing and implementing agent-like application patterns, including multi-turn conversational flows, state management, and complex prompting strategies (e.g., Chain-of-Thought, few-shot, zero-shot).
Experience integrating Gemini with Google Cloud services (Firestore, Cloud Functions, App Hosting) and external APIs for robust, production-ready solutions.
Proven ability to engineer applications that process, integrate, and generate content across multiple modalities (text, images, audio, video, code) using Gemini's native multimodal capabilities.
Skilled in building and orchestrating pipelines for multimodal data handling, synchronization, and complex interaction patterns within application logic.
Experience designing and implementing production-grade RAG systems, including integration with vector databases (e.g., Pinecone, ChromaDB) and engineering data pipelines for indexing and retrieval (a minimal sketch follows this list).
Ability to manage agent state, memory, and persistence for multi-turn and long-running interactions.
Proficiency leveraging AI-assisted coding features in Firebase Studio (chat, inline code, command execution) and using App Prototyping agents or frameworks like Genkit for rapid prototyping and structuring agentic logic.
Strong command of modern development workflows, including Git/GitHub, code reviews, and collaborative development practices.
Experience designing scalable, fault-tolerant deployment architectures for multimodal and agentic AI applications using Firebase App Hosting, Cloud Run, or similar serverless/cloud platforms.
Advanced MLOps skills, including monitoring, logging, alerting, and versioning for generative AI systems and agents.
Deep understanding of security best practices: prompt injection mitigation (across modalities), secure API key management, authentication/authorization, and data privacy.
Demonstrated ability to engineer for responsible AI, including bias detection, fairness, transparency, and implementation of safety mechanisms in agentic and multimodal applications.
Experience addressing ethical challenges in the deployment and operation of advanced AI systems.
Proven success designing, reviewing, and delivering advanced, project-based curriculum and hands-on labs for experienced software developers and engineers.
Ability to translate complex engineering concepts (RAG, multimodal integration, agentic patterns, MLOps, security, responsible AI) into clear, actionable learning materials and real-world projects.
5+ years of professional experience in AI-powered application development, with a focus on generative and multimodal AI.
Strong programming skills in Python and JavaScript/TypeScript; experience with modern frameworks and cloud-native development.
Bachelor's or Master's degree in Computer Science, Data Engineering, AI, or a related technical field.
Ability to explain advanced technical concepts (e.g., fusion transformers, multimodal embeddings, RAG workflows) to learners in an accessible way.
Strong programming experience in Python and experience deploying machine learning pipelines.
Ability to work independently, take ownership of deliverables, and collaborate closely with designers and project managers.

Preferred
Experience with Google DeepMind tools (JAX, Haiku, RLax, DeepMind Control Suite/Lab) and reinforcement learning pipelines.
Familiarity with open data formats (Delta, Parquet, Iceberg) and scalable data engineering practices.
Prior contributions to open-source AI projects or technical community engagement.
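To make the RAG requirement above concrete, here is a minimal retrieval sketch using the google-generativeai Python SDK: documents are embedded, the closest one is retrieved by cosine similarity, and it is passed to Gemini as context. The model names, document contents, and in-memory index are assumptions for illustration; a production system would use a proper vector database (e.g., Pinecone or ChromaDB) as noted in the requirements.

```python
# Minimal RAG sketch with the google-generativeai SDK (model names are assumptions).
import os

import numpy as np
import google.generativeai as genai

genai.configure(api_key=os.environ["GOOGLE_API_KEY"])

documents = [
    "Gemini APIs accept text, images, audio, and video as input.",
    "Firebase App Hosting deploys full-stack web apps with CDN and SSL built in.",
]

def embed(text: str) -> np.ndarray:
    # text-embedding-004 is used here as an assumed embedding model name.
    result = genai.embed_content(model="models/text-embedding-004", content=text)
    return np.array(result["embedding"])

doc_vectors = [embed(d) for d in documents]

def retrieve(query: str) -> str:
    # Cosine similarity against the tiny in-memory index; a vector DB would replace this.
    q = embed(query)
    scores = [float(q @ v / (np.linalg.norm(q) * np.linalg.norm(v))) for v in doc_vectors]
    return documents[int(np.argmax(scores))]

question = "What modalities can Gemini handle?"
context = retrieve(question)

model = genai.GenerativeModel("gemini-1.5-flash")
response = model.generate_content(
    f"Answer using only this context:\n{context}\n\nQuestion: {question}"
)
print(response.text)
```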
Posted 3 weeks ago
7.0 years
4 - 7 Lacs
Thiruvananthapuram
On-site
Equifax is seeking creative, high-energy, and driven software engineers with hands-on development skills to work on a variety of meaningful projects. Our software engineering positions provide you the opportunity to join a team of talented engineers working with leading-edge technology. You are ideal for this position if you are a forward-thinking, committed, and enthusiastic software engineer who is passionate about technology.

What you'll do
Design, develop, and operate high-scale applications across the full engineering stack.
Design, develop, test, deploy, maintain, and improve software.
Apply modern software development practices (serverless computing, microservices architecture, CI/CD, infrastructure-as-code, etc.).
Work across teams to integrate our systems with existing internal systems, Data Fabric, and the CSA Toolset.
Participate in technology roadmap and architecture discussions to turn business requirements and vision into reality.
Participate in a tight-knit, globally distributed engineering team.
Triage product or system issues and debug/track/resolve them by analyzing the sources of issues and their impact on network or service operations and quality.
Manage your own project priorities, deadlines, and deliverables.
Research, create, and develop software applications to extend and improve on Equifax solutions.
Collaborate on scalability issues involving access to data and information.
Actively participate in Sprint planning, Sprint retrospectives, and other team activities.

What experience you need
Bachelor's degree or equivalent experience
7+ years of software engineering experience
7+ years of experience writing, debugging, and troubleshooting code in mainstream Java, SpringBoot, and TypeScript/JavaScript
5+ years of experience with cloud technology: GCP, AWS, or Azure
5+ years of experience designing and developing cloud-native solutions
5+ years of experience designing and developing microservices using Java, SpringBoot, GCP SDKs, and GKE/Kubernetes
5+ years of experience deploying and releasing software using Jenkins CI/CD pipelines, with an understanding of infrastructure-as-code concepts, Helm charts, and Terraform constructs

What could set you apart
A self-starter who identifies and responds to priority shifts with minimal supervision.
Experience designing and developing big data processing solutions using Dataflow/Apache Beam, Bigtable, BigQuery, Pub/Sub, GCS, Composer/Airflow, and others.
UI development (e.g., HTML, JavaScript, Angular, and Bootstrap).
Experience with backend technologies such as Java/J2EE, SpringBoot, SOA, and microservices.
Source code control management systems (e.g., SVN/Git, GitHub) and build tools like Maven and Gradle.
Agile environments (e.g., Scrum, XP).
Relational databases (e.g., SQL Server, MySQL).
Atlassian tooling (e.g., JIRA, Confluence, and GitHub).
Developing with a modern JDK (v1.7+).
Automated testing: JUnit, Selenium, LoadRunner, SoapUI.
Posted 3 weeks ago
0 years
13 - 18 Lacs
India
On-site
Job Title: Pharmacist – Hospital (Oman)
Location: Oman (Hospital)
Job Type: Full-Time, Permanent
Contact: hr@meridiantradelinksuae.com | +971 50 663 0283
Salary: OMR 500 (INR 115,000) for Pharmacists; OMR 550-650 (INR 115,000-150,000) for Licensed Pharmacists (based on qualifications and experience)

About the Role
We are urgently hiring Pharmacists for a reputable hospital in Oman. The role involves dispensing medications accurately, counseling patients on medicine usage, and ensuring compliance with hospital and pharmacy standards.

Key Responsibilities
Dispense prescribed medications with accuracy.
Counsel patients regarding safe medication usage and side effects.
Maintain accurate records and inventory of medications.
Collaborate with doctors and nurses for optimal patient care.

Requirements
Bachelor's or Master's in Pharmacy.
Preferably candidates with an Oman Pharmacist License.
Candidates who have completed Prometric and DataFlow verification are highly preferred.
Ability to work efficiently in a hospital setting.
Strong communication and patient counseling skills.

Benefits
Competitive tax-free salary based on license and experience.
Opportunity to work in a reputable healthcare environment in Oman.
Professional development opportunities.

How to Apply
Interested candidates are invited to send their updated CV to hr@meridiantradelinksuae.com with the subject line "Pharmacist – Oman Application."

Pay: ₹115,000.00 - ₹150,000.00 per month
Posted 3 weeks ago
0 years
0 Lacs
Hyderābād
On-site
Job description
Some careers shine brighter than others. If you're looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further.
HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions.
We are currently seeking an experienced professional to join our team in the role of Senior Consultant Specialist.

In this role, you will:
Be responsible, as a DevOps Engineer, for developing automations across the technology delivery lifecycle, including construction, testing, release, ongoing service management, and monitoring of a product or service within a Technology team. You will be required to continually enhance your skills within a number of specialisms, which include CI/CD, automation, pipeline development, security, testing, and operational support.
This role will carry out some or all of the following activities:
Facilitate application teams across the Bank in deploying their applications across GCP services such as GKE containers, BigQuery, Dataflow, Pub/Sub, and Kafka.
Be the go-to person whenever an application team faces an issue during platform adoption, onboarding, deployment, or environment troubleshooting.
Ensure service resilience, service sustainability, and recovery time objectives are met for all the software solutions delivered.
Be responsible for automating the continuous integration / continuous delivery pipeline within a DevOps product/service team, driving a culture of continuous improvement.
Keep up to date and maintain expertise on current tools, technologies, and areas such as cyber security and regulations pertaining to aspects like data privacy, consent, and data residency.
Take end-to-end accountability for a product or service, identifying and developing the most appropriate technology solutions to meet customer needs as part of the customer journey.
Liaise with other engineers, architects, and business stakeholders to understand and drive the product or service's direction.
Analyze production errors to define and create tools that help mitigate problems at the system design stage, and apply user-defined integrations, improving the user experience.

Requirements
To be successful in this role, you should meet the following requirements:
Bachelor's degree in Computer Science or related disciplines
6 or more years of hands-on development experience building fully self-serve, observable solutions using Infrastructure and Policy as Code
Proficiency developing with modern programming languages and the ability to rapidly develop proofs of concept
Ability to work with geographically distributed and cross-functional teams
Expert in code deployment tools (Jenkins, Puppet, Ansible, Git, Selenium, and Chef)
Expert in automation tools (CloudFormation, Terraform, shell script, Helm, Ansible)
Familiar with containers (Docker, Docker Compose, Kubernetes, GKE)
Familiar with monitoring tools (Datadog, Grafana, Prometheus, AppDynamics, New Relic, Splunk)

The successful candidate will also meet the following requirements:
Good understanding of GCP Cloud or hybrid cloud implementations
Good understanding of and experience with MuleSoft / PCF / any gateway server implementations
Hands-on experience with the Kong API Gateway platform
Good understanding of and experience with middleware and MQ
Familiar with infrastructure support: Apache gateway, runtime server configurations, SSL certificate setup, etc.

You'll achieve more when you join HSBC. www.hsbc.com/careers
HSBC is committed to building a culture where all employees are valued, respected and opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working and opportunities to grow within an inclusive and diverse environment. Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website.
Issued by – HSBC Software Development India
Posted 3 weeks ago