3.0 - 5.0 years
10 - 15 Lacs
Ahmedabad
Work from Office
Design, deliver, and maintain appropriate data solutions that provide the correct data for analytical development to address key issues within the organization. Gather detailed data requirements with a cross-functional team to deliver quality results. Required Candidate profile: Strong experience with cloud services within Azure, AWS, or GCP platforms (preferably Azure). Strong experience with analytical tools (preferably SQL, dbt, Snowflake, BigQuery, Tableau).
Posted 4 days ago
7.0 - 12.0 years
30 - 40 Lacs
Bengaluru
Work from Office
Design, develop, and deploy AI/ML models; build scalable, low-latency ML infrastructure; run experiments; optimize algorithms; collaborate with data scientists, engineers, and architects; integrate models into production to drive business value. Required Candidate profile: 5–10 yrs in AI/ML, strong in model development, optimization, and deployment. Skilled in Azure, ML pipelines, data science tools, and collaboration with cross-functional teams.
Posted 5 days ago
5.0 - 10.0 years
15 - 27 Lacs
Bengaluru
Work from Office
Develop digital reservoir modeling tools and Petrel plugins. Integrate geological and geophysical data, apply ML and data engineering, and support forecasting through advanced cloud-based workflows. Required Candidate profile: Earth scientist with experience in Petrel, Python, and Ocean plugin development. Strong background in reservoir modeling, digital workflows, and cloud-based tools (Azure, Power BI).
Posted 5 days ago
9.0 - 14.0 years
8 - 18 Lacs
Pune
Work from Office
Immediate opening for Lead Data Engineer @ Pune. Exp: 8+ yrs. CTC: ECTC: NP: Immediate to 1 week. Location: Pune (Hinjawadi). JD: Lead Data Engineer - SQL, Snowflake, Power BI. If interested, kindly share your resume with srinivasan.jayaraman@servion.com
Posted 5 days ago
3.0 - 5.0 years
10 - 12 Lacs
Bengaluru
Hybrid
Notice Period: Immediate Key Responsibilities: Design and implement data models and schemas to support business intelligence and analytics. Perform data engineering tasks as needed to support analytical activities. Develop clear, concise, and insightful data visualizations and dashboards. Interpret complex datasets and communicate findings through compelling visualizations and storytelling. Work closely with stakeholders to understand data requirements and deliver actionable insights. Maintain documentation and ensure data quality and integrity. Reporting Structure: Direct reporting to Senior Data Engineer Dotted-line reporting to CTO Required Skills & Qualifications: Proficiency in Power BI, Microsoft Fabric, and Power Query. Experience designing and implementing data models and schemas. Familiarity with basic data engineering tasks. Advanced SQL skills for querying and analyzing data. Exceptional ability to translate complex data into clear, actionable insights. Strong ability to communicate complex data insights effectively to technical and non-technical audiences. Preferred Skills: Experience with Python for data manipulation and analysis. Experience in the finance, tax, or professional services industries. Familiarity with Salesforce data models and integrations.
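The data-modelling work this role describes typically takes the form of a star schema: one fact table joined to conformed dimension tables. The sketch below illustrates the idea with SQLite; every table and column name is hypothetical, not from any real engagement.

```python
import sqlite3

# Minimal star schema: one fact table keyed to two dimension tables.
# All names here are illustrative only.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_client   (client_id INTEGER PRIMARY KEY, client_name TEXT);
CREATE TABLE dim_date     (date_id INTEGER PRIMARY KEY, iso_date TEXT);
CREATE TABLE fact_billing (client_id INTEGER REFERENCES dim_client(client_id),
                           date_id   INTEGER REFERENCES dim_date(date_id),
                           amount    REAL);
""")
conn.executemany("INSERT INTO dim_client VALUES (?, ?)", [(1, "Acme"), (2, "Globex")])
conn.executemany("INSERT INTO dim_date VALUES (?, ?)", [(10, "2024-01-01"), (11, "2024-01-02")])
conn.executemany("INSERT INTO fact_billing VALUES (?, ?, ?)",
                 [(1, 10, 120.0), (1, 11, 80.0), (2, 10, 50.0)])

# A typical BI query: revenue per client, resolved through a dimension join.
rows = conn.execute("""
SELECT c.client_name, SUM(f.amount)
FROM fact_billing f JOIN dim_client c USING (client_id)
GROUP BY c.client_name ORDER BY c.client_name
""").fetchall()
print(rows)  # [('Acme', 200.0), ('Globex', 50.0)]
```

In Power BI or Microsoft Fabric the same shape would be expressed as fact and dimension tables linked by relationships, but the underlying join logic is identical.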
Posted 5 days ago
4.0 - 6.0 years
10 - 14 Lacs
Kolkata, Pune, Chennai
Work from Office
Location: Remote / Pan India - Delhi/NCR, Bangalore/Bengaluru, Hyderabad/Secunderabad, Chennai, Pune, Kolkata, Ahmedabad, Mumbai. Job Responsibilities: Transition legacy rules written in Python using the Polars library to SparkSQL. Create new rules using SparkSQL based on written requirements. Must Have Skills: Understanding of the Polars library. Understanding of SparkSQL (this is more important than Polars). Good English communication. Ability to work in a collaborative environment. Experience with US healthcare data preferred.
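The rule migration this posting describes ends with rules expressed as SparkSQL queries. SparkSQL's core is ANSI SQL, so the sketch below runs a target-form rule on SQLite as a stand-in engine; on Spark the same string would be passed to spark.sql(). The rule, table, and column names are hypothetical, loosely themed on healthcare claims data.

```python
import sqlite3

# Hypothetical validation rule in the SparkSQL-style form the role targets.
# SQLite stands in for Spark here since the ANSI core runs unchanged;
# on Spark this string would go through spark.sql(RULE_SQL).
RULE_SQL = """
SELECT claim_id,
       CASE WHEN billed_amount > 10000 AND prior_auth = 0
            THEN 'FAIL' ELSE 'PASS' END AS rule_status
FROM claims
"""

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE claims (claim_id INTEGER, billed_amount REAL, prior_auth INTEGER)")
conn.executemany("INSERT INTO claims VALUES (?, ?, ?)",
                 [(1, 500.0, 0), (2, 25000.0, 0), (3, 25000.0, 1)])
results = dict(conn.execute(RULE_SQL).fetchall())
print(results)  # {1: 'PASS', 2: 'FAIL', 3: 'PASS'}
```

A legacy Polars version of the same rule would express the CASE branch as a `when/then/otherwise` expression; the migration task is translating that expression logic into the SQL form above.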
Posted 5 days ago
5.0 - 10.0 years
8 - 18 Lacs
Bengaluru
Work from Office
Data Engineers/Analysts who can create the data models for Apromore ingestion.
Posted 5 days ago
8.0 - 14.0 years
25 - 30 Lacs
Bengaluru
Work from Office
At Juniper, we believe the network is the single greatest vehicle for knowledge, understanding, and human advancement the world has ever known. To achieve real outcomes, we know that experience is the most important requirement for networking teams and the people they serve. Delivering an experience-first, AI-Native Network pivots on the creativity and commitment of our people. It requires a consistent and committed practice, something we call the Juniper Way.
Future of X: The Future of X (FoX) team is dedicated to transforming and creating a digital-first experience for support and services at Juniper. We are investing in a state-of-the-art digital technology stack, leveraging AI and automation to simplify customer journeys, enhance support experiences, and accelerate processes, often by enabling self-service capabilities. Key components of our solution include omnichannel platforms, portals, and a digital suite of automation tools, providing customers with seamless, low-effort engagement options and enabling faster issue resolution, often without the need to open a case. To build these complex solutions, our team consists of domain experts, architects, data scientists, data engineers, and MLOps professionals, managing the entire solution lifecycle, from concept to deployment. On the GenAI/AI front, we develop solutions tailored to Juniper's business objectives. Finally, every solution we bring to market is fully integrated into Juniper's customer support and services technology stack, ensuring a seamless and impactful digital experience.
Responsibilities: We are seeking an experienced and strategic leader to own and lead the AI Center of Excellence for Future of X, with a charter to transform the digital customer experience. Develop solutions that align with organizational goals and objectives using traditional AI/ML, Generative AI, and Large Language Models.
Lead the evolution of the Data Engineering, Machine Learning, and AI capabilities through the solution lifecycle. Collaborate with project teams, data science teams, and other development teams to drive the technical roadmap and guide development and implementation of new data-driven business solutions. Create playbooks, frameworks, and IP to strengthen delivery. Enable customers to adopt solutions to achieve improved business outcomes. Attract, develop, and retain talent with relevant skills. Develop technical governance for data solution design and implementation, leveraging leading industry practices. Drive the latest innovations and market trends in AI/ML, NLP, information retrieval, Generative AI, and Large Language Models to influence technology decisions and strategy, and incorporate emerging practices into our solutions.
Qualifications and Desired Experience: 15+ years in data and analytics with expertise across AI, ML, data platforms, BI tools, and data engineering. 12+ years of experience in a technical leadership role; exceptional people management, supervising strategic projects. Bachelor's or Master's degree in Data Science, Computer Science, or related disciplines. Experience leading, architecting, and building infrastructure to manage the Data/AI model lifecycle. Deep understanding of technology trends, architectures, and integrations related to Generative AI. Hands-on experience with advanced analytics, predictive modelling, NLP, information retrieval, deep learning, etc. Strong track record of managing high-performing teams and shaping data practices. Demonstrated experience successfully delivering complex initiatives on tight timelines.
Tech skills: AWS, Databricks, Snowflake, Python, PySpark, Docker, Kubernetes, Terraform, Ansible, Prometheus, Grafana, ELK, Hadoop, Spark, Kafka, Elasticsearch, SQL, NoSQL databases, Postgres, Cassandra, Salesforce.
Personal Skills: Strong acumen with the ability to translate data insights into business impact.
Entrepreneurial mindset combined with strong problem-solving, execution, and solution design skills. Build an open, authentic, positive working culture for the team. Ability to engage and communicate technical subjects to both technical and business audiences. Self-motivated and innovative; confident when working independently and an excellent team player with a growth-oriented personality. Maturity to deal with the ambiguity and uncertainty of change. Effective time management skills that enable you to work successfully across functions in a dynamic and solution-oriented environment while meeting deadlines. Ability to collaborate cross-functionally in a fast-paced environment and build sound working relationships at all levels of the organization. Ability to handle sensitive information with keen attention to detail and accuracy. Passion for data-handling ethics and for solving complex problems with creative solutions. Perseverance and resilience to overcome obstacles when presented with a complex problem.
Juniper is an Equal Opportunity workplace and Affirmative Action employer. We do not discriminate in employment decisions on the basis of race, color, religion, gender (including pregnancy), national origin, political affiliation, sexual orientation, gender identity or expression, marital status, disability, genetic information, age, veteran status, or any other applicable legally protected characteristic. All employment decisions are made on the basis of individual qualifications, merit, and business need. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
About Juniper Networks: Juniper Networks challenges the inherent complexity that comes with networking and security in the multicloud era.
We do this with products, solutions, and services that transform the way people connect, work, and live. We simplify the process of transitioning to a secure and automated multicloud environment to enable secure, AI-driven networks that connect the world. Additional information can be found at Juniper Networks (www.juniper.net) or connect with Juniper on Twitter, LinkedIn, and Facebook.
WHERE WILL YOU DO YOUR BEST WORK? Wherever you are in the world, whether it's downtown Sunnyvale or London, Westford or Bengaluru, Juniper is a place that was founded on disruptive thinking, where colleague innovation is not only valued but expected. We believe that the great task of delivering a new network for the next decade is delivered through the creativity and commitment of our people. The Juniper Way is the commitment to all our colleagues that the culture and company inspire their best work, their life's work. At Juniper we believe this is more than a job; it's an opportunity to help change the world. At Juniper Networks, we are committed to elevating talent by creating a trust-based environment where we can all thrive together. If you think you have what it takes, but do not necessarily check every single box, please consider applying. We'd love to speak with you.
Additional Information for United States jobs: ELIGIBILITY TO WORK AND E-VERIFY. In compliance with federal law, all persons hired will be required to verify identity and eligibility to work in the United States and to complete the required employment eligibility verification form upon hire. Juniper Networks participates in the E-Verify program. E-Verify is an Internet-based system operated by the Department of Homeland Security (DHS) in partnership with the Social Security Administration (SSA) that allows participating employers to electronically verify the employment eligibility of new hires and the validity of their Social Security Numbers.
Information for applicants about E-Verify / E-Verify Información en español: This Company Participates in E-Verify / Este Empleador Participa en E-Verify. Immigrant and Employee Rights Section (IER): The Right to Work / El Derecho a Trabajar. E-Verify® is a registered trademark of the U.S. Department of Homeland Security.
Posted 5 days ago
5.0 - 7.0 years
7 - 9 Lacs
Bengaluru
Work from Office
Company Overview: Docusign brings agreements to life. Over 1.5 million customers and more than a billion people in over 180 countries use Docusign solutions to accelerate the process of doing business and simplify people's lives. With intelligent agreement management, Docusign unleashes business-critical data that is trapped inside of documents. Until now, these were disconnected from business systems of record, costing businesses time, money, and opportunity. Using Docusign's Intelligent Agreement Management platform, companies can create, commit, and manage agreements with solutions created by the #1 company in e-signature and contract lifecycle management (CLM).
What you'll do: Docusign is seeking a talented and results-oriented Director, Data Engineering to lead and grow the Bengaluru hub of our Global Data & Analytics Organisation. Your mission is to build the modern, trusted data foundation that powers enterprise analytics, internal decision-making, and customer-facing insights. As a key leader, you will partner closely with Engineering, Product, GTM, Finance, and Customer Success to set the data strategy for Docusign India, focusing on data architecture, the enterprise data foundation, and data ingestion. During a typical day, you will drive the development of governed data products, operationalize quality and observability, and lead a high-performing team of data engineers and architects. The ideal candidate will demonstrate strong leadership, a passion for innovation in AI & data technologies, and the drive to achieve "five-nines" reliability for our data platforms. This position is a people manager role reporting to the Senior Director, Global Data & Analytics.
Responsibilities: Own Snowflake architecture, performance, and cost governance; define modelling standards (star, data vault, or ELT-first) and enforce security/RBAC best practices.
Scale our dbt codebase: design project structure, modular macros, and end-to-end CI/CD and automated testing so every pull request ships with quality gates and lineage metadata. Drive ingestion excellence via Fivetran (50+ connectors today, growing fast); establish SLAs for freshness, completeness, and incident response. Embed with business stakeholders (Finance, Sales Ops, Product, Legal & Compliance) to translate ambiguous questions into governed data products and trusted KPIs, especially SaaS ARR, churn, agreement throughput, and IAM adoption metrics. Lead and inspire a high-performing team of data engineers and architects; hire, coach, and set OKRs. Operationalise quality and observability using dbt tests, Great Expectations, lineage graphs, and alerting so we achieve "five-nines" reliability. Partner on AI initiatives: deliver well-modelled features to data scientists. Evaluate semantic layers, data contracts, and cost-optimisation techniques.
Job Designation: Hybrid: employees divide their time between in-office and remote work; access to an office location is required (frequency: minimum 2 days per week; may vary by team, but there will be a weekly in-office expectation). Positions at Docusign are assigned a job designation of either In Office, Hybrid, or Remote and are specific to the role/job. Preferred job designations are not guaranteed when changing positions within Docusign. Docusign reserves the right to change a position's job designation depending on business needs and as permitted by local law.
What you bring. Basic: Advanced SQL and Python programming skills. 12+ years in data engineering with demonstrable success in an enterprise environment. 5+ years of experience as a people manager. Comfortable with Git-based DevOps.
Preferred: Expert in Snowflake, dbt (production CI/CD, macros, tests), and Fivetran (setup, monitoring, log-based replication). Proven ability to model SaaS business processes: ARR/ACV, funnel, usage, billing, marketing, security, and compliance.
Track record of building inclusive, globally distributed teams. Adept at executive communication and prioritisation. Experience operationalizing data quality and observability using tools like dbt tests, data lineage, and alerting. Experience partnering on AI initiatives and delivering features for data scientists. Experience evaluating and implementing semantic layers, data contracts, and cost-optimization techniques.
Life at Docusign. Working here: Docusign is committed to building trust and making the world more agreeable for our employees, customers, and the communities in which we live and work. You can count on us to listen, be honest, and try our best to do what's right, every day. At Docusign, everything is equal. We each have a responsibility to ensure every team member has an equal opportunity to succeed, to be heard, to exchange ideas openly, to build lasting relationships, and to do the work of their life. Best of all, you will be able to feel deep pride in the work you do, because your contribution helps us make the world better than we found it. And for that, you'll be loved by us, our customers, and the world in which we live.
Accommodation: Docusign is committed to providing reasonable accommodations for qualified individuals with disabilities in our job application procedures. If you need such an accommodation, or a religious accommodation, during the application process, please contact us at accommodations@docusign.com. If you experience any issues, concerns, or technical difficulties during the application process, please get in touch with our Talent organization at taops@docusign.com for assistance.
Applicant and Candidate Privacy Notice.
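The freshness and completeness SLAs and the dbt-test-based observability described in the posting above boil down to assertions like the ones below; dbt's not_null and source-freshness tests express the same checks declaratively. The sample rows, field names, and 24-hour threshold here are illustrative only.

```python
from datetime import datetime, timedelta, timezone

# Hand-rolled versions of two SLA checks the posting mentions. In practice
# dbt tests (not_null, source freshness) declare the same assertions.
def check_freshness(rows, ts_field, max_age, now):
    """Pass only if the newest row is no older than max_age."""
    newest = max(r[ts_field] for r in rows)
    return now - newest <= max_age

def check_completeness(rows, field):
    """Pass only if every row has the required field (a not_null test)."""
    return all(r.get(field) is not None for r in rows)

now = datetime(2024, 1, 2, 12, 0, tzinfo=timezone.utc)
rows = [
    {"account_id": "A1", "arr": 1200.0, "loaded_at": now - timedelta(hours=3)},
    {"account_id": "A2", "arr": None,   "loaded_at": now - timedelta(hours=30)},
]
print(check_freshness(rows, "loaded_at", timedelta(hours=24), now))  # True
print(check_completeness(rows, "arr"))                               # False
```

Wiring checks like these into alerting is what turns them into the observability and incident-response SLAs the role describes.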
Posted 5 days ago
8.0 - 13.0 years
25 - 30 Lacs
Chennai
Work from Office
Join us in bringing joy to customer experience. Five9 is a leading provider of cloud contact center software, bringing the power of cloud innovation to customers worldwide. Living our values every day results in our team-first culture and enables us to innovate, grow, and thrive while enjoying the journey together. We celebrate diversity and foster an inclusive environment, empowering our employees to be their authentic selves. The Data Engineer will help design and implement a Google Cloud Platform (GCP) Data Lake, build scalable data pipelines, and ensure seamless access to data for business intelligence and data science tools. They will support a wide range of projects while collaborating closely with management teams and business leaders. The ideal candidate will have a strong understanding of data engineering principles and data warehousing concepts, and the ability to document technical knowledge into clear processes and procedures. This position is based out of one of the offices of our affiliate Acqueon Technologies in India, and will adopt the hybrid work arrangements of that location. You will be a member of the Acqueon team, with responsibilities supporting Five9 products and collaborating with global teammates based primarily in the United States.
Responsibilities: Design, implement, and maintain a scalable Data Lake on GCP to centralize structured and unstructured data from various sources (databases, APIs, cloud storage). Utilize GCP services including BigQuery, Dataflow, Pub/Sub, and Cloud Storage to optimize and manage data workflows, ensuring scalability, performance, and security. Collaborate closely with data analytics and data science teams to understand data needs, ensuring data is properly prepared for consumption by various systems (e.g. DOMO, Looker, Databricks).
Implement best practices for data quality, consistency, and governance across all data pipelines and systems, ensuring compliance with internal and external standards. Continuously monitor, test, and optimize data workflows to improve performance, cost efficiency, and reliability. Maintain comprehensive technical documentation of data pipelines, systems, and architecture for knowledge sharing and future development.
Requirements: Bachelor's degree in Computer Science, Data Engineering, Data Science, or a related quantitative field (e.g. Mathematics, Statistics, Engineering). 3+ years of experience using GCP Data Lake and Storage Services. Certifications in GCP are preferred (e.g. Professional Cloud Developer, Professional Cloud Database Engineer). Advanced proficiency with SQL, with experience writing complex queries, optimizing for performance, and using SQL in large-scale data processing workflows. Proficiency in programming languages such as Python, Java, or Scala, with practical experience building data pipelines, automating data workflows, and integrating APIs for data ingestion.
Five9 embraces diversity and is committed to building a team that represents a variety of backgrounds, perspectives, and skills. The more inclusive we are, the better we are. Five9 is an equal opportunity employer. View our privacy policy, including our privacy notice to California residents, here: https://www.five9.com/pt-pt/legal. Note: Five9 will never request that an applicant send money as a prerequisite for commencing employment with Five9.
Posted 5 days ago
4.0 - 6.0 years
15 - 18 Lacs
Bengaluru
Hybrid
We are looking for a Junior Machine Learning Engineer with 4 to 6 years of experience to join our AI/ML team. The candidate will work on reinforcement learning agents, end-to-end ML workflows, and MLOps practices to build, deploy, and monitor scalable AI solutions. Key Responsibilities: Develop and deploy ML models and RL agents (e.g., Q-learning, policy gradients). Preprocess data, perform feature engineering, and tune models. Operationalize models using Docker and CI/CD (GitHub Actions, Jenkins). Monitor model performance and system health in production. Collaborate with data scientists, engineers, and DevOps teams. Stay updated on trends in RL, AI, and MLOps. Required Skills: Bachelor's/Master's in Computer Science or a related field. Proficiency in Python and ML libraries (TensorFlow, PyTorch, Scikit-learn). Hands-on experience with RL frameworks (OpenAI Gym, Stable Baselines). Familiarity with Docker, Git, CI/CD tools, and REST APIs. Understanding of the ML lifecycle and cloud deployment (AWS/GCP/Azure). Preferred Skills: Experience with SageMaker, Vertex AI, or similar tools. Knowledge of deep learning, NLP, or microservices. Strong analytical and collaborative skills.
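Q-learning, named in the responsibilities above, fits in a few lines in its tabular form. The toy 5-state chain environment and hyperparameters below are illustrative only; real work would use frameworks like OpenAI Gym and Stable Baselines, as the posting notes.

```python
import random

# Minimal tabular Q-learning on a toy 5-state chain: actions move left (0)
# or right (1); reward 1 only on reaching the goal state 4. Off-policy:
# behaviour is uniformly random while the update bootstraps on the greedy
# next-state value. Environment and hyperparameters are illustrative.
N_STATES, GOAL = 5, 4
ALPHA, GAMMA = 0.5, 0.9

def step(state, action):
    nxt = max(0, state - 1) if action == 0 else min(GOAL, state + 1)
    return nxt, float(nxt == GOAL), nxt == GOAL  # next state, reward, done

random.seed(0)
Q = [[0.0, 0.0] for _ in range(N_STATES)]
for _ in range(300):                 # training episodes
    s, done = 0, False
    while not done:
        a = random.randrange(2)      # random exploration policy
        s2, r, done = step(s, a)
        # Q-learning update: the target bootstraps on the best action
        # available in the next state, not the action actually taken.
        Q[s][a] += ALPHA * (r + GAMMA * max(Q[s2]) - Q[s][a])
        s = s2

# Greedy policy extracted from the learned table (1 = move right).
greedy = [int(Q[s][1] > Q[s][0]) for s in range(GOAL)]
print(greedy)  # [1, 1, 1, 1] -> always move right, toward the goal
```

The same update rule underlies DQN-style agents; the Q table is simply replaced by a function approximator.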
Posted 5 days ago
10.0 - 15.0 years
0 - 3 Lacs
Pune
Work from Office
Designation: Senior Recruitment Manager / Recruitment Manager. Core responsibilities: Drive sourcing capability and capacity across a geographically dispersed team of partners to proactively build diverse candidate pools while leveraging all available talent channels. Work closely with business leaders to influence and deliver quality assessment and a high-touch candidate experience through all aspects of the recruitment funnel. Partner with key stakeholders (Business Leaders, Hiring Managers, HR Business Partners, and Recruiting Managers) to determine future talent needs and set and drive enabling sourcing strategies; this requires a deep understanding, through extensive market research, of the channels where we can find the best, diverse talent who fit our technical and cultural demands. Build, engage, manage, and develop a team of high-performing staff in an extremely fast-paced and ambiguous environment. Set team performance goals and metrics, timelines, and a formal tracking process to measure and manage progress. Keep track of recruiting metrics (e.g. time-to-hire and cost-per-hire). Develop and execute plans to identify and drive productivity improvements that enable the team to deliver to hiring goals without having to scale deployed resources at a rate faster than the business is growing. Periodically lead and/or participate in cross-business/cross-company special projects and initiatives related to talent acquisition. Basic Qualifications: 10+ years of recruitment/HR experience, with a minimum of 5 years' experience managing a large, multi-site recruiting team. Track record of success in owning and executing the process to identify and attract talent for immediate business needs, as well as for critical long-term talent pipelines. Experience working with essential tools of the trade, including ATS, resume databases, and internet sourcing tools, along with coaching a recruiting team to deliver results across all channels.
Experience creating, measuring, and scaling workflow between candidates, hiring managers, and the recruiting team. Knowledge of labor legislation. Demonstrated business acumen and experience working with large, complex organizations during periods of growth and change. Proven written and verbal communication, as well as influencing skills. Strong decision-making skills. Demonstrated experience in building recruiting and business teams from the ground up.
Posted 5 days ago
1.0 - 3.0 years
3 - 6 Lacs
Mangaluru
Work from Office
Job Description. Job Title: Sr. Data Analyst. Location: Remote. Department: Data & Analytics.
Who You Are: Bachelor's degree in a relevant field such as computer science, mathematics, statistics, or a related discipline; a master's degree is a plus. 5+ years of experience in data analysis or a similar role, preferably in the technology, marketing, or SaaS industry. Proficiency in SQL and experience with data manipulation, extraction, and analysis using SQL-based tools. Strong programming skills in Python, R, or other relevant languages for data analysis and statistical modeling. Experience with data visualization tools such as Tableau, Power BI, or similar platforms. Solid understanding of statistical analysis techniques and their practical application. Familiarity with cloud-based data platforms (e.g., AWS, Google Cloud Platform) and their associated services. Excellent problem-solving skills and ability to work with large and complex datasets. Strong communication and presentation skills, with the ability to convey complex ideas to both technical and non-technical audiences. Proven ability to work collaboratively in cross-functional teams and manage multiple projects simultaneously. Experience with email marketing, digital marketing analytics, or customer analytics is a plus. Knowledge of data privacy and security regulations (e.g., GDPR, CCPA) is desirable.
What You'll Do: Utilize advanced data analytics techniques to extract actionable insights from large and complex datasets. Collaborate with stakeholders to identify and define key business questions and objectives. Design and develop data models, algorithms, and visualizations to effectively communicate findings and recommendations to both technical and non-technical stakeholders. Conduct thorough analysis of customer and market data to identify trends, patterns, and opportunities for business growth. Collaborate with product teams to develop and optimize data-driven product features and enhancements. Work closely with data engineering teams to ensure data integrity, quality, and availability for analysis. Provide mentorship and guidance to junior analysts, fostering their professional growth and development. Stay up to date with industry trends and emerging technologies in data analytics, and contribute innovative ideas to enhance our analytical capabilities. Support the development and implementation of data governance practices and policies. Collaborate with internal teams to define and track key performance indicators (KPIs) and metrics.
Posted 5 days ago
1.0 - 3.0 years
3 - 6 Lacs
Bengaluru
Work from Office
We are seeking a highly skilled and motivated System Ops Engineer with a strong background in computer science or statistics and at least 5 years of professional experience. The ideal candidate will possess deep expertise in cloud computing (AWS), data engineering, Big Data applications, AI/ML, and SysOps. A strong technical foundation, a proactive mindset, and the ability to work in a fast-paced environment are essential.

Key Responsibilities

Cloud Expertise:
Proficient in AWS services including EC2, VPC, Lambda, DynamoDB, API Gateway, EBS, S3, and IAM.
Design, implement, and maintain scalable, secure, and efficient cloud-based solutions.
Execute optimized configurations for cloud infrastructure and services.

Data Engineering:
Develop, construct, test, and maintain data architectures, such as databases and processing systems.
Write efficient Spark and Python code for data processing and manipulation.
Administer and manage multiple ETL applications, ensuring seamless data flow.

Big Data Applications:
Lead end-to-end Big Data projects, from design to deployment.
Monitor and optimize Big Data systems for performance, reliability, and scalability.

AI/ML Applications:
Hands-on experience developing and deploying AI/ML models, especially in Natural Language Processing (NLP), Computer Vision (CV), and Generative AI (GenAI).
Collaborate with data scientists to productionize ML models and support ongoing model performance tuning.

DevOps and IaaS:
Utilize DevOps tools for continuous integration and deployment (CI/CD).
Design and maintain Infrastructure as a Service (IaaS), ensuring scalability, fault tolerance, and automation.

SysOps Responsibilities:
Manage servers and network infrastructure to ensure system availability and security.
Configure and maintain virtual machines and cloud-based system environments.
Monitor system logs, alerts, and performance metrics.
Install and update software packages and apply security patches.
Troubleshoot network connectivity issues and resolve infrastructure problems.
Implement and enforce security policies, protocols, and procedures.
Conduct regular data backups and disaster recovery tests.
Optimize systems for speed, efficiency, and reliability.
Collaborate with IT and development teams to support the integration of new systems and applications.

Qualifications
Bachelor's degree in Computer Science, Statistics, or a related field.
5+ years of experience in cloud computing, data engineering, and related technologies.
In-depth knowledge of AWS services and cloud architecture.
Strong programming experience in Spark and Python.
Proven track record in Big Data applications and pipelines.
Applied experience with AI/ML models, particularly in the NLP, CV, and GenAI domains.
Skilled in managing and administering ETL tools and workflows.
Experience with DevOps pipelines, CI/CD tools, and cloud automation.
Demonstrated experience with SysOps or cloud infrastructure/system operations.
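The ETL administration duties above typically include a record-level quality gate between pipeline stages. A minimal, library-free sketch of that idea — the field names and batch data are hypothetical, not from any specific pipeline:

```python
def validate_batch(records, required_fields=("id", "ts", "value")):
    """Split a batch into valid rows and rejects, a common ETL quality gate."""
    valid, rejects = [], []
    for row in records:
        missing = [f for f in required_fields if row.get(f) in (None, "")]
        if missing:
            rejects.append({"row": row, "missing": missing})
        else:
            valid.append(row)
    return valid, rejects

batch = [
    {"id": 1, "ts": "2024-01-01", "value": 10.5},
    {"id": 2, "ts": "", "value": 3.2},  # empty timestamp -> rejected
]
valid, rejects = validate_batch(batch)
```

In a real Spark pipeline the same split would usually be expressed as two filtered DataFrames rather than a Python loop.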
Posted 5 days ago
1.0 - 3.0 years
3 - 7 Lacs
Hyderabad
Work from Office
Job Title: Backend Developer – Python
Job Type: Full-time
Location: On-site, Hyderabad, Telangana, India

Job Summary:
Join one of our top customer's teams as a Backend Developer and help drive scalable, high-performance solutions at the intersection of machine learning and data engineering. You'll collaborate with skilled professionals to design, implement, and maintain backend systems powering advanced AI/ML applications in a dynamic, on-site environment.

Key Responsibilities:
Develop, test, and deploy robust backend components and microservices using Python and PySpark.
Implement and optimize data pipelines leveraging Databricks and distributed computing frameworks.
Design and maintain efficient databases with MySQL, ensuring data integrity and high availability.
Integrate machine learning models into production-ready backend systems supporting AI-driven features.
Collaborate closely with data scientists and engineers to deliver end-to-end solutions aligned with business goals.
Monitor, troubleshoot, and enhance system performance, utilizing Redis for caching and improved scalability.
Write clear and maintainable documentation, and communicate effectively with team members both verbally and in writing.

Required Skills and Qualifications:
Proficiency in Python programming for backend development.
Hands-on experience with Databricks and PySpark in a production environment.
Strong understanding of MySQL database design, querying, and performance tuning.
Practical background in machine learning concepts and deploying ML models.
Experience with Redis for caching and state management.
Excellent written and verbal communication skills, with keen attention to detail.
Demonstrated ability to work effectively in an on-site, collaborative setting in Hyderabad.

Preferred Qualifications:
Previous experience in high-growth AI/ML or data engineering projects.
Familiarity with additional backend technologies or cloud platforms.
Demonstrated leadership or mentorship in technical teams.
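The Redis caching responsibility above usually follows the cache-aside pattern. The sketch below uses a plain dict as a stand-in for a Redis client (a real implementation would call `get`/`set` on a `redis.Redis` connection); `load_user` and the TTL are hypothetical:

```python
import time

cache = {}  # stand-in for a Redis client; entries are (value, expiry)
TTL_SECONDS = 300

def load_user(user_id):
    # Hypothetical expensive lookup (e.g. a MySQL query in the real system).
    return {"id": user_id, "name": f"user-{user_id}"}

def get_user(user_id):
    """Cache-aside: check the cache first, fall back to the source, then populate."""
    entry = cache.get(user_id)
    if entry is not None and entry[1] > time.monotonic():
        return entry[0]                        # cache hit
    value = load_user(user_id)                 # cache miss: hit the database
    cache[user_id] = (value, time.monotonic() + TTL_SECONDS)
    return value
```

With a real Redis client the expiry would be handled server-side via the `ex` argument to `set`, and the value would be serialized (e.g. JSON) before storage.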
Posted 5 days ago
2.0 - 6.0 years
5 - 9 Lacs
Noida
Work from Office
About Us:
At TELUS Digital, we enable customer experience innovation through spirited teamwork, agile thinking, and a caring culture that puts customers first. TELUS Digital is the global arm of TELUS Corporation, one of the largest telecommunications service providers in Canada. We deliver contact center and business process outsourcing (BPO) solutions to some of the world's largest corporations in the consumer electronics, finance, telecommunications and utilities sectors. With global call center delivery capabilities, our multi-shore, multi-language programs offer safe, secure infrastructure, value-based pricing, skills-based resources and exceptional customer service, all backed by TELUS, our multi-billion dollar telecommunications parent.

Required Skills:
Minimum 6 years of experience in architecting, designing, and building data extraction, loading, and transformation pipelines and data products across on-prem and cloud platforms.
Perform application impact assessments, requirements reviews, and develop work estimates.
Develop test strategies and site reliability engineering measures for data products and solutions.
Lead agile development "scrums" and solution reviews.
Mentor junior Data Engineering Specialists.
Lead the resolution of critical operations issues, including post-implementation reviews.
Perform technical data stewardship tasks, including metadata management, security, and privacy by design.
Demonstrate expertise in SQL and database proficiency in various data engineering tasks.
Automate complex data workflows by setting up DAGs in tools like Control-M, Apache Airflow, and Prefect.
Develop and manage Unix scripts for data engineering tasks.
Intermediate proficiency in infrastructure-as-code tools like Terraform, Puppet, and Ansible to automate infrastructure deployment.
Proficiency in data modeling to support analytics and business intelligence.
Working knowledge of MLOps to integrate machine learning workflows with data pipelines.
Extensive expertise in GCP technologies, including BigQuery, Cloud SQL, Dataflow, Data Catalog, Cloud Composer, Google Cloud Storage, IAM, Compute Engine, Cloud Data Fusion, Dataproc (good to have), and BigTable.
Advanced proficiency in programming languages (Python).

Qualifications:
Bachelor's degree in Software Engineering, Computer Science, Business, Mathematics, or a related field.
Analytics certification in BI or AI/ML.
6+ years of data engineering experience.
4 years of data platform solution architecture and design experience.
GCP Certified Data Engineer (preferred).
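The DAG-based workflow automation named above (Control-M, Airflow, Prefect) ultimately comes down to running tasks in dependency order. A minimal stdlib sketch of that scheduling idea, with hypothetical task names standing in for real pipeline steps:

```python
from graphlib import TopologicalSorter

# Task -> set of upstream tasks it depends on (hypothetical pipeline).
dag = {
    "extract": set(),
    "transform": {"extract"},
    "load": {"transform"},
    "report": {"load"},
}

# static_order() yields tasks so every task appears after its dependencies,
# which is the execution guarantee an orchestrator like Airflow provides.
order = list(TopologicalSorter(dag).static_order())
```

In Airflow itself the same shape would be declared with operators and the `>>` dependency syntax rather than computed by hand.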
Posted 5 days ago
1.0 - 3.0 years
1 - 5 Lacs
Chennai
Work from Office
hackajob is collaborating with Comcast to connect them with exceptional tech professionals for this role.

Comcast brings together the best in media and technology. We drive innovation to create the world's best entertainment and online experiences. As a Fortune 50 leader, we set the pace in a variety of innovative and fascinating businesses and create career opportunities across a wide range of locations and disciplines. We are at the forefront of change and move at an amazing pace, thanks to our remarkable people, who bring cutting-edge products and services to life for millions of customers every day. If you share in our passion for teamwork, our vision to revolutionize industries and our goal to lead the future in media and technology, we want you to fast-forward your career at Comcast.

Job Summary
We are looking for an experienced and proactive ETL Lead to oversee and guide our ETL testing and data validation efforts. This role requires a deep understanding of ETL processes; strong technical expertise in tools such as SQL, Oracle, MongoDB, AWS, and Python/PySpark; and proven leadership capabilities. The ETL Lead will be responsible for ensuring the quality, accuracy, and performance of our data pipelines while mentoring a team of testers and collaborating with cross-functional stakeholders.

Job Description
Key Responsibilities:
Lead the planning, design, and execution of ETL testing strategies across multiple projects.
Oversee the development and maintenance of test plans, test cases, and test data for ETL processes.
Ensure data integrity, consistency, and accuracy across all data sources and destinations.
Collaborate with data engineers, developers, business analysts, and project managers to define ETL requirements and testing scope.
Mentor and guide a team of ETL testers, providing technical direction and support.
Review and approve test deliverables and ensure adherence to best practices and quality standards.
Identify and resolve complex data issues, bottlenecks, and performance challenges.
Drive continuous improvement in ETL testing processes, tools, and methodologies.
Provide regular status updates, test metrics, and risk assessments to stakeholders.
Stay current with emerging trends and technologies in data engineering and ETL testing.

Requirements
6+ years of experience in ETL testing, with at least 2 years in a lead or senior role.
Strong expertise in ETL concepts, data warehousing, and data validation techniques.
Hands-on experience with Oracle, MongoDB, AWS services (e.g., S3, Redshift, Glue), and Python/PySpark scripting.
Advanced proficiency in SQL and other query languages.
Proven ability to lead and mentor a team of testers.
Excellent problem-solving, analytical, and debugging skills.
Strong communication and stakeholder management abilities.
Experience with Agile/Scrum methodologies is a plus.
Ability to manage multiple priorities and deliver high-quality results under tight deadlines.

Disclaimer
This information has been designed to indicate the general nature and level of work performed by employees in this role. It is not designed to contain or be interpreted as a comprehensive inventory of all duties, responsibilities and qualifications.

Comcast is proud to be an equal opportunity workplace. We will consider all qualified applicants for employment without regard to race, color, religion, age, sex, sexual orientation, gender identity, national origin, disability, veteran status, genetic information, or any other basis protected by applicable law.

Base pay is one part of the Total Rewards that Comcast provides to compensate and recognize employees for their work. Most sales positions are eligible for a Commission under the terms of an applicable plan, while most non-sales positions are eligible for a Bonus. Additionally, Comcast provides best-in-class Benefits to eligible employees. We believe that benefits should connect you to the support you need when it matters most, and should help you care for those who matter most. That's why we provide an array of options, expert guidance and always-on tools, personalized to meet the needs of your reality, to help support you physically, financially and emotionally through the big milestones and in your everyday life.

Education
Bachelor's Degree. While possessing the stated degree is preferred, Comcast also may consider applicants who hold some combination of coursework and experience, or who have extensive related professional experience.

Relevant Work Experience
7-10 Years
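A core ETL test implied by the data-validation duties above is source-to-target reconciliation: comparing row counts, checksums, and missing keys between the staged source and the loaded target. A minimal sqlite3 sketch of that check — table names and data are hypothetical:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src (id INTEGER, amount REAL);
    CREATE TABLE tgt (id INTEGER, amount REAL);
    INSERT INTO src VALUES (1, 10.0), (2, 20.0), (3, 30.0);
    INSERT INTO tgt VALUES (1, 10.0), (2, 20.0);  -- row 3 dropped by the load
""")

# Row-count and checksum reconciliation between source and target.
src_count, src_sum = conn.execute("SELECT COUNT(*), SUM(amount) FROM src").fetchone()
tgt_count, tgt_sum = conn.execute("SELECT COUNT(*), SUM(amount) FROM tgt").fetchone()

# Keys present in the source but missing from the target.
missing = conn.execute(
    "SELECT id FROM src WHERE id NOT IN (SELECT id FROM tgt)"
).fetchall()
```

Against Oracle or Redshift the same queries apply; only the connection layer changes.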
Posted 5 days ago
8.0 - 13.0 years
25 - 30 Lacs
Hyderabad
Work from Office
Bachelor's degree in computer science, engineering, or a related field; master's degree preferred.
Data: 6+ years of experience with data analytics and data warehousing, with sound knowledge of data warehousing concepts.
SQL: 6+ years of hands-on experience with SQL and query optimization for data pipelines.
ELT/ETL: 6+ years of experience in Informatica; 3+ years of experience in IICS/IDMC.
Migration experience: Experience with Informatica on-prem to IICS/IDMC migration.
Cloud: 5+ years of experience working in an AWS cloud environment.
Python: 5+ years of hands-on development experience with Python.
Workflow: 4+ years of experience in orchestration and scheduling tools (e.g. Apache Airflow).
Advanced data processing: Experience using data processing technologies such as Apache Spark or Kafka.
Troubleshooting: Experience with troubleshooting and root cause analysis to determine and remediate potential issues.
Communication: Excellent communication, problem-solving, organizational, and analytical skills; able to work independently and to provide leadership to small teams of developers.
Reporting: Experience with data reporting tools (e.g. MicroStrategy, Tableau, Looker) and data cataloging tools (e.g. Alation).
Experience in the design and implementation of ETL solutions with effective design and optimized performance, including industry-standard mechanisms for job recovery, failover, logging, and alerting.
Posted 5 days ago
3.0 - 6.0 years
9 - 13 Lacs
Gurugram
Work from Office
Dentsply Sirona is the world's largest manufacturer of professional dental products and technologies, with a 130-year history of innovation and service to the dental industry and patients worldwide. Dentsply Sirona develops, manufactures, and markets a comprehensive solutions offering including dental and oral health products as well as other consumable medical devices under a strong portfolio of world-class brands. Dentsply Sirona's products provide innovative, high-quality and effective solutions to advance patient care and deliver better and safer dentistry. Dentsply Sirona's global headquarters is located in Charlotte, North Carolina, USA. The company's shares are listed in the United States on NASDAQ under the symbol XRAY.

Bringing out the best in people
As advanced as dentistry is today, we are dedicated to making it even better. Our people have a passion for innovation and are committed to applying it to improve dental care. We live and breathe high performance, working as one global team, bringing out the best in each other for the benefit of dental patients, and the professionals who serve them. If you want to grow and develop as a part of a team that is shaping an industry, then we're looking for the best to join us.

Working at Dentsply Sirona you are able to:
Develop faster with our commitment to the best professional development.
Perform better as part of a high-performance, empowering culture.
Shape an industry with a market leader that continues to drive innovation.
Make a difference by helping improve oral health worldwide.

Scope
The Global Data, Business Intelligence and Analytics department is responsible for the development and implementation of global data, BI tools, and reports at Dentsply Sirona, working cross-functionally across the enterprise.

As a technical expert, you will work with data modelers, data engineers, and the project manager to define and deliver optimum solutions, participate in data profiling and integration, and support the project manager in stakeholder meetings where required. This role provides second-line support for BI solutions, BI platforms, and data loading when required. The role is also expected to keep abreast of advancements in the BI tools landscape and works with the BI Development Manager to evaluate their usefulness in the Dentsply Sirona environment.

Key Responsibilities
Development of BI solutions delivered through Power BI and SSAS Tabular.
Education and training of internal users on BI solutions.
Technical user support and updating user documentation.
Participation in the data modelling process for BI deliverables.
Participation in the delivery of a new BI self-service strategy and its roll-out to the different global functions.
Leading part in the design and development of new BI solutions, primarily using Power BI and SSAS Tabular.
Evaluation and improvement of existing BI solutions and applications.
Technical implementation of BI solutions within assigned projects.
Sizing of work items within assigned projects.
Additional responsibilities as assigned.

Education
An academic background, with a relevant university degree in Management Information Systems or similar.

Years and Type of Experience
Minimum 5 years of relevant work experience.
Extensive experience working in a BI environment, preferably using Microsoft BI components.

Key Skills, Knowledge & Capabilities
Demonstrates proactive and collaborative relationships with team members and stakeholders.
Demonstrates the Dentsply Sirona Core Values.
Communicates clearly and concisely to both technical and non-technical audiences.
Strong analytical skills.
Strong presentation skills.
English language proficiency in verbal and written communication.

How We Lead the DS Way
Actively articulates and promotes Dentsply Sirona's vision, mission and values.
Advocates on behalf of the customer.
Promotes high performance, innovation and continual improvement.
Consistently meets Company standards, ethics and compliance requirements.
Clear and effective communication with stakeholders, spanning multiple levels, socio-geographic areas and functional expertise.

Dentsply Sirona is an Equal Opportunity/Affirmative Action employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, age, sexual orientation, disability, or protected Veteran status. We appreciate your interest in Dentsply Sirona. If you need assistance with completing the online application due to a disability, please send an accommodation request to careers@dentsplysirona.com. Please be sure to include "Accommodation Request" in the subject.
Posted 5 days ago
5.0 - 9.0 years
9 - 13 Lacs
Hyderabad
Work from Office
Job Summary
ServCrust is a rapidly growing technology startup with the vision to revolutionize India's infrastructure by integrating digitization and technology throughout the lifecycle of infrastructure projects.

About the Role
As a Data Science Engineer, you will lead data-driven decision-making across the organization. Your responsibilities will include designing and implementing advanced machine learning models, analyzing complex datasets, and delivering actionable insights to various stakeholders. You will work closely with cross-functional teams to tackle challenging business problems and drive innovation using advanced analytics techniques.

Responsibilities
Collaborate with strategy, data engineering, and marketing teams to understand and address business requirements through advanced machine learning and statistical models.
Analyze large spatiotemporal datasets to identify patterns and trends, providing insights for business decision-making.
Design and implement algorithms for predictive and causal modeling.
Evaluate and fine-tune model performance.
Communicate recommendations based on insights to both technical and non-technical stakeholders.

Requirements
A Ph.D. in computer science, statistics, or a related field.
5+ years of experience in data science.
Experience in geospatial data science is an added advantage.
Proficiency in Python (pandas, NumPy, scikit-learn, PyTorch, StatsModels, Matplotlib, and Seaborn); experience with GeoPandas and Shapely is an added advantage.
Strong communication and presentation skills.
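A recurring primitive in the geospatial and spatiotemporal analysis described above is great-circle distance between coordinate pairs. A stdlib haversine sketch — the coordinates below are illustrative (roughly Hyderabad and Bengaluru), and libraries like GeoPandas provide this via projected geometry operations instead:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two (lat, lon) points."""
    R = 6371.0  # mean Earth radius, km
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * R * asin(sqrt(a))

# Hyderabad to Bengaluru, roughly 500 km apart.
d = haversine_km(17.385, 78.4867, 12.9716, 77.5946)
```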
Posted 5 days ago
3.0 - 5.0 years
12 - 16 Lacs
Bengaluru
Work from Office
Job Description
We are seeking a highly skilled and motivated Cloud Data Engineer with a strong background in computer science or statistics, coupled with at least 5 years of professional experience. The ideal candidate will possess a deep understanding of cloud computing, particularly AWS, and should have a proven track record in data engineering, Big Data applications, and AI/ML applications.

Responsibilities
Cloud Expertise:
Proficient in AWS services such as EC2, VPC, Lambda, DynamoDB, API Gateway, EBS, S3, and IAM.
Design, implement, and maintain scalable cloud-based solutions.
Execute efficient and secure cloud infrastructure configurations.

Data Engineering:
Develop, construct, test, and maintain architectures, such as databases and processing systems.
Utilize coding skills in Spark and Python for data processing and manipulation.
Administer multiple ETL applications to ensure seamless data flow.

Big Data Applications:
Work on end-to-end Big Data application projects, from conception to deployment.
Optimize and troubleshoot Big Data solutions to ensure high performance.

AI/ML Applications:
Experience in developing and deploying AI/ML applications based on NLP, CV, and GenAI.
Collaborate with data scientists to implement machine learning models in production environments.

DevOps and Infrastructure as a Service (IaaS):
Possess knowledge and experience with DevOps applications for continuous integration and deployment.
Set up and maintain infrastructure as a service, ensuring scalability and reliability.

Qualifications
Bachelor's degree in Computer Science, Statistics, or a related field.
5+ years of professional experience in cloud computing, data engineering, and related fields.
Proven expertise in AWS services, with a focus on EC2, VPC, Lambda, DynamoDB, API Gateway, EBS, S3, IAM, etc.
Proficient coding skills in Spark and Python for data processing.
Hands-on experience with Big Data application projects.
Experience in AI/ML applications, particularly in NLP, CV, and GenAI.
Administration experience with multiple ETL applications.
Knowledge and experience with DevOps tools and processes.
Ability to set up and maintain infrastructure as a service.

Soft Skills
Strong analytical and problem-solving skills.
Excellent communication and collaboration abilities.
Ability to work effectively in a fast-paced and dynamic team environment.
Proactive mindset with a commitment to continuous learning and improvement.
Posted 5 days ago
2.0 - 6.0 years
5 - 9 Lacs
Noida
Work from Office
About Us:
At TELUS Digital, we enable customer experience innovation through spirited teamwork, agile thinking, and a caring culture that puts customers first. TELUS Digital is the global arm of TELUS Corporation, one of the largest telecommunications service providers in Canada. We deliver contact center and business process outsourcing (BPO) solutions to some of the world's largest corporations in the consumer electronics, finance, telecommunications and utilities sectors. With global call center delivery capabilities, our multi-shore, multi-language programs offer safe, secure infrastructure, value-based pricing, skills-based resources and exceptional customer service, all backed by TELUS, our multi-billion dollar telecommunications parent.

Required Skills:
Design, develop, and support data pipelines and related data products and platforms.
Design and build data extraction, loading, and transformation pipelines and data products across on-prem and cloud platforms.
Perform application impact assessments, requirements reviews, and develop work estimates.
Develop test strategies and site reliability engineering measures for data products and solutions.
Participate in agile development "scrums" and solution reviews.
Mentor junior Data Engineers.
Lead the resolution of critical operations issues, including post-implementation reviews.
Perform technical data stewardship tasks, including metadata management, security, and privacy by design.
Design and build data extraction, loading, and transformation pipelines using Python and other GCP data technologies.
Demonstrate SQL and database proficiency in various data engineering tasks.
Automate data workflows by setting up DAGs in tools like Control-M, Apache Airflow, and Prefect.
Develop Unix scripts to support various data operations.
Model data to support business intelligence and analytics initiatives.
Utilize infrastructure-as-code tools such as Terraform, Puppet, and Ansible for deployment automation.
Expertise in GCP data warehousing technologies, including BigQuery, Cloud SQL, Dataflow, Data Catalog, Cloud Composer, Google Cloud Storage, IAM, Compute Engine, Cloud Data Fusion, and Dataproc (good to have).

Qualifications:
Bachelor's degree in Software Engineering, Computer Science, Business, Mathematics, or a related field.
4+ years of data engineering experience.
2 years of data solution architecture and design experience.
GCP Certified Data Engineer (preferred).
Posted 5 days ago
2.0 - 4.0 years
8 - 12 Lacs
Pune
Work from Office
FlexTrade Systems is a provider of customized multi-asset execution and order management trading solutions for buy- and sell-side financial institutions. Through deep client partnerships with some of the world's largest, most complex and demanding capital markets firms, we develop the flexible tools, technology and innovation that deliver our clients a competitive edge. Our globally distributed engineering teams focus on adaptable technology and open architecture to develop highly sophisticated trading solutions that can automate and scale with your business strategies.

At FlexTrade, we hold our values close to heart, with pride and gratitude, as they guide us in everything that we do. We are dedicated to giving our clients a competitive edge, taking ownership of our responsibilities, being flexible to adapt to an ever-changing environment and technology, and bringing integrity to every interaction, and we continue to improve, grow together and collaborate as one team. All of this, while having fun, truly makes FlexTrade a wonderful place to work.

About you:
Data Engineer (Python / SQL) – FlexTCA. Join a dynamic FlexTCA (FlexTrade Transaction Cost Analysis) development team. FlexTrade is looking for a Data Engineer to be part of a rapidly evolving technology group supplying top-quality solutions to our growing top-tier global client base. This role offers the opportunity to work closely with data scientists and data engineers within FlexTrade's Quantitative Solutions team, as well as exposure to cutting-edge analytics and machine learning technologies.

The Product:
FlexTCA is our post- and pre-trade transaction cost analysis and execution quality management solution, offering historical and real-time analytics for trading portfolios and single securities across global equities, FX, futures and fixed income. FlexTCA is used by investment managers and brokerages to analyze, evaluate, and improve trader, algo, broker, and venue performance. The product includes an intuitive and flexible web interface for data visualization, exploration, and analysis.

Core Responsibilities:
The candidate will be responsible for the design, implementation, and maintenance of FlexTCA's data processing, data management, and BI tooling.
They will also be asked to contribute to original research on FlexTrade's proprietary cost models.
The product exposure offers the opportunity for in-depth learning as it relates to both trading and analytics technologies, as well as the associated development life cycle.

Key Skills:
Should be able to write queries to facilitate data warehouse integrity checks.
The individual will contribute to the design, development, and management of order, execution, and market data, along with bucketed timeseries ticks for various asset classes.
Create and maintain Extract, Transform, and Load (ETL) workflows for a range of datasets.
Write and maintain API interfaces with third-party data vendors.
Will be responsible for all Production, QA and Dev ETL and data integrity checks in a Linux environment.
Provide automation support and backup to the DBA team.
Help automate Business Intelligence environment maintenance and visualization roll-out using a native Python wrapper.
Bachelor's degree in Computer Science, or equivalent industry experience; 2+ years of experience required.
Excellent SQL skills are required.
Strong experience with Python and UNIX/shell scripting.
2+ years of experience with any business intelligence tool (ideally Sisense) strongly preferred.
2+ years of experience with NumPy/pandas.
2+ years of experience conducting trading research, with an understanding of sampling, validation, and statistics.
Excellent communication and problem-solving skills.
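The "bucketed timeseries ticks" mentioned above are typically built by aggregating raw ticks into fixed time intervals, e.g. one-minute OHLC bars. A minimal pure-Python sketch of that bucketing — the tick data is illustrative, and production systems would do this in the database or a vectorized library:

```python
def bucket_ohlc(ticks, bucket_secs=60):
    """Aggregate (epoch_seconds, price) ticks into OHLC bars per time bucket."""
    bars = {}
    for ts, price in ticks:
        key = ts - (ts % bucket_secs)  # start of the bucket this tick falls in
        bar = bars.get(key)
        if bar is None:
            bars[key] = {"open": price, "high": price, "low": price, "close": price}
        else:
            bar["high"] = max(bar["high"], price)
            bar["low"] = min(bar["low"], price)
            bar["close"] = price  # ticks arrive in time order, so last wins

    return bars

# Three ticks in the first minute, one in the second.
ticks = [(0, 100.0), (10, 101.5), (59, 99.0), (61, 99.5)]
bars = bucket_ohlc(ticks)
```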
Posted 5 days ago