0 years
2 - 2 Lacs
Hyderābād
On-site
Role Summary & Role Description: Technical Manager with specific Oracle and PL/SQL expertise to design, develop, and optimize data workflows on the Databricks platform. The ideal candidate will have deep expertise in Apache Spark, PySpark, Python, job orchestration, and CI/CD integration to support scalable data engineering and analytics solutions. Analyzes, designs, develops and maintains software applications to support business units. Expected to spend 80% of the time on hands-on development, design and architecture, and the remaining 20% on guiding the team on technology and removing other impediments. Capital Markets project experience preferred. Provides advanced technical expertise in analyzing, designing, estimating, and developing software applications to project schedule. Oversees systems design and implementation of the most complex design components. Creates project plans and deliverables and monitors task deadlines. Oversees, maintains and supports existing software applications. Provides subject matter expertise in reviewing, analyzing, and resolving complex issues. Designs and executes end-to-end system tests of new installations and/or software prior to release to minimize failures and impact to business and end users. Responsible for resolution, communication, and escalation of critical technical issues. Prepares user and systems documentation as needed. Identifies and recommends industry best practices. Serves as a mentor to junior staff and acts as a technical lead/mentor for developers in day-to-day and overall project areas. Ability to lead a team of agile developers. Experience working on complex, deadline-driven projects with minimal supervision. Ability to architect, design, and develop from minimal requirements by effectively coordinating activities between business analysts, scrum leads, developers and managers. Ability to provide agile status notes on day-to-day project tasks.

Technical Skills: Design and implement robust ETL pipelines using Databricks notebooks and workflows (see the sketch following this posting). Proficiency in Python, Scala, Apache Spark, SQL, and Spark DataFrames. Experience with job orchestration tools and scheduling frameworks. Optimize Spark jobs for performance and cost-efficiency. Develop and manage job orchestration strategies using Databricks Jobs and Workflows. Familiarity with CI/CD practices and tools. Monitor and troubleshoot production jobs, ensuring reliability and data quality. Implement security and governance best practices, including access control and encryption. Strong practical experience using Scrum, agile modelling and adaptive software development. Ability to understand and grasp the big picture of system components. Experience producing environment builds, architecture and design guides, and application blueprints. Strong understanding of data modeling, warehousing, and performance tuning. Excellent problem-solving and communication skills.

Core/Must-have skills: Oracle, SQL, PL/SQL, Python, Scala, Apache Spark, Spark Streaming, CI/CD pipelines, AWS cloud experience. Good-to-have skills: Airflow. Work Schedule: 12 PM to 9 PM IST.

About State Street: What we do. State Street is one of the largest custodian banks, asset managers and asset intelligence companies in the world. From technology to product innovation, we're making our mark on the financial services industry. For more than two centuries, we've been helping our clients safeguard and steward the investments of millions of people.
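For illustration only: a minimal PySpark sketch of the kind of Databricks ETL workflow this posting describes (read, clean, write a partitioned output). The paths and column names are hypothetical assumptions, not part of the posting.

```python
from pyspark.sql import SparkSession, functions as F

# On Databricks a SparkSession is provided as `spark`; building one here
# keeps the sketch self-contained.
spark = SparkSession.builder.appName("trades-etl").getOrCreate()

# Hypothetical source: raw trade records landed as Parquet.
raw = spark.read.parquet("/mnt/raw/trades")

# Typical cleanup/transform step: deduplicate and derive a business column.
cleaned = (
    raw.dropDuplicates(["trade_id"])
       .withColumn("trade_date", F.to_date("trade_ts"))
       .filter(F.col("quantity") > 0)
)

# Write partitioned by date so downstream jobs can prune efficiently.
cleaned.write.mode("overwrite").partitionBy("trade_date").parquet("/mnt/curated/trades")
```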
We provide investment servicing, data & analytics, investment research & trading and investment management to institutional clients. Work, Live and Grow. We make all efforts to create a great work environment. Our benefits packages are competitive and comprehensive. Details vary by location, but you may expect generous medical care, insurance and savings plans, among other perks. You’ll have access to flexible Work Programs to help you match your needs. And our wealth of development programs and educational support will help you reach your full potential. Inclusion, Diversity and Social Responsibility. We truly believe our employees’ diverse backgrounds, experiences and perspectives are a powerful contributor to creating an inclusive environment where everyone can thrive and reach their maximum potential while adding value to both our organization and our clients. We warmly welcome candidates of diverse origin, background, ability, age, sexual orientation, gender identity and personality. Another fundamental value at State Street is active engagement with our communities around the world, both as a partner and a leader. You will have tools to help balance your professional and personal life, paid volunteer days, matching gift programs and access to employee networks that help you stay connected to what matters to you. State Street is an equal opportunity and affirmative action employer. Discover more at StateStreet.com/careers
Posted 5 days ago
5.0 - 10.0 years
4 - 8 Lacs
Pune, Gurugram
Work from Office
What you'll do: Lead the technical team consisting of developers, senior engineers, etc. to solve business problems, and lead end-to-end technical delivery in collaboration with the Project Manager. Take ownership of ensuring the proposed design/architecture and deliverables meet client expectations and solve the business problem with a high degree of quality. Translate complex client expectations and business problems into technical requirements. Own end-to-end responsibility for leading projects across all phases, from discovery/POC through build, SIT and UAT. Lead one or more projects at a time, based on the role and time commitment required for the project. Partner with the senior leadership team and assist in project management responsibilities, i.e. project planning, staffing management, people growth, etc. Work in tandem with global counterparts in planning and accomplishing planned activities, identification of risks and mitigation strategies. Build relationships with client stakeholders and lead presentations related to project deliverables, design brainstorming/discussions, status updates, innovation/improvements, etc. Collaborate with other ZS internal expertise teams (architects, validation/testing, etc.) to ensure a best-in-class technology solution. Maintain an outlook for continuous improvement and innovation, and provide necessary mentorship and guidance to the team. Liaise with staffing partners and HR business partners for team building/planning. Assist senior leadership in building POVs on new technology and problem solving, and innovate to build firm intellectual capital. Lead project deliverables such as business case development, solution vision and design, user requirements, solution mockups, prototypes, technical architecture, test cases, deployment plans, operations strategy and planning, etc. Actively lead unstructured problem solving to design and build complex solutions, tuning them to meet expected performance and functional requirements. Lead appropriate documentation of systems design, procedures, SOPs, etc. Build cloud applications using serverless technologies, such as custom web applications, ETL pipelines, and real-time/stream analytics applications. Leverage expertise/experience in both traditional and modern data architecture and processing concepts, including relational databases.

What you'll bring: Bachelor's/Master's degree with specialization in Computer Science, MIS, IT or other computer-related disciplines. 5+ years of relevant consulting-industry experience working on medium-to-large-scale technology solution delivery engagements. 5+ years of hands-on experience designing and implementing data processing/data management solutions. Strong expertise in creating High Level and Detailed Design documents. Good handle on working with distributed computing and cloud services platforms, including (but not limited to) AWS, Azure and GCP. Experience working in an Agile delivery framework, with the ability to mentor and coach the team to follow agile best practices. Expertise in one of the programming languages like Python or Scala, with the ability to review code created by developers. Expertise in commonly used AWS services (or equivalent services in Azure) is preferred: EMR, Glue, EC2, Glue ETL, Managed Airflow, S3, LakeFormation, SageMaker Studio, Athena, Redshift, RDS, AWS Neptune. Experience in building project plans/sprint plans for the technical team, estimating project timelines and effort, distributing work to junior team members and tracking project progress. Lead project teams in driving end-to-end activities to meet set milestones and provide necessary mentorship/guidance for the team's growth. Comfortable with all the SDLC documentation required to support technical delivery, and able to work with relevant teams to build those SDLC documents.

Additional skills: Capable of managing a virtual global team for the timely delivery of multiple projects. Experienced in analyzing and troubleshooting interactions between databases, operating systems, and applications. Travel to global offices as required to collaborate with clients and internal project teams.
Posted 5 days ago
8.0 years
3 - 9 Lacs
Hyderābād
On-site
Senior Data Scientist Hyderabad, Telangana, India Date posted Jul 08, 2025 Job number 1844017 Work site Microsoft on-site only Travel 0-25 % Role type Individual Contributor Profession Research, Applied, & Data Sciences Discipline Data Science Employment type Full-Time

Overview: Security represents one of the most critical priorities for our customers in a world awash in digital threats, regulatory scrutiny, and estate complexity. Microsoft Security aspires to make the world a safer place for all. We want to reshape security and empower every user, customer, and developer with a security cloud that protects them with end-to-end, simplified solutions. The Microsoft Security organization accelerates Microsoft's mission and bold ambitions to ensure that our company and industry are securing digital technology platforms, devices, and clouds in our customers' heterogeneous environments, as well as ensuring the security of our own internal estate. Our culture is centered on embracing a growth mindset, a theme of inspiring excellence, and encouraging teams and leaders to bring their best each day. In doing so, we create life-changing innovations that impact billions of lives around the world.

We are looking for a Senior Data Scientist to join our Detection Engineering team and lead the development of AI/ML models that enhance the efficiency and impact of Microsoft's Security Operations Center (SOC). In this role, you will drive the design and optimization of scalable, data-driven solutions that transform massive security signal data into actionable intelligence. You'll bring deep technical expertise to guide model architecture, platform integration, and best practices in security engineering, while collaborating closely with analysts, responders, and engineers to align on goals, scope, and strategy. Beyond technical leadership, you'll continuously evolve our advanced detection frameworks to improve accuracy, reduce false positives, and stay ahead of emerging threats. You'll mentor early-in-career engineers, foster a culture of learning and innovation, and contribute to a strong, inclusive team environment grounded in Microsoft's values. If you're passionate about applying data science to real-world security challenges and thrive in a fast-paced, mission-driven space, we'd love to hear from you.

Microsoft's mission is to empower every person and every organization on the planet to achieve more. As employees we come together with a growth mindset, innovate to empower others, and collaborate to realize our shared goals. Each day we build on our values of respect, integrity, and accountability to create a culture of inclusion where everyone can thrive at work and beyond. In alignment with our Microsoft values, we are committed to cultivating an inclusive work environment for all employees to positively impact our culture every day.

Qualifications: 8+ years of experience in data science, machine learning, natural language processing, and deep learning, preferably with a focus on cybersecurity or related fields. Experience in programming languages such as Python, R, or Scala, with hands-on experience in data analysis, experimental design principles and visualization. Experience in translating complex data into actionable insights and recommendations that drive business impact. Excellent technical design skills and proven ability to drive large-scale system designs for complex projects or products. Expertise in machine learning frameworks and libraries (e.g., TensorFlow, PyTorch, scikit-learn and others). In-depth knowledge of cybersecurity principles, threats, and attack vectors. Experience with big data technologies (e.g., Hadoop, Spark, Kafka) and data processing. Strong analytical and problem-solving skills with the ability to think creatively. Excellent communication skills with the ability to explain complex concepts to stakeholders. Master's Degree in Data Science, Mathematics, Statistics, Econometrics, Economics, Operations Research, Computer Science, or a related field AND 5+ years of data-science experience (e.g., managing structured and unstructured data, applying statistical techniques and reporting results), OR Bachelor's Degree in one of those fields AND 8+ years of data-science experience, or equivalent experience.

Preferred Qualifications: Experience in developing and deploying machine learning models for cybersecurity applications. Experience in Big Data, preferably in the cybersecurity or SaaS industry. Experience with data science workloads on the Azure tech stack: Synapse, Azure ML, etc. Knowledge of anomaly detection, fraud detection, and other related areas. Familiarity with security fundamentals and attack vectors. Publications or contributions to the field of data science or cybersecurity. Excellent track record of cross-team collaboration. Ambitious and self-motivated. Agile, can-do attitude and great at dealing with ambiguity.

Responsibilities: Develop and implement machine learning models and algorithms to detect security threats and attacks within Microsoft (a minimal illustration follows this posting). Analyse large and complex datasets to identify patterns and anomalies indicative of security risks. Collaborate with security experts to understand threat landscapes and incorporate domain knowledge into models. Continuously monitor and improve the performance of security models to adapt to evolving threats. Lead the design and implementation of data-driven security solutions and tools. Mentor and guide junior data scientists in best practices and advanced techniques. Communicate findings and insights to stakeholders, including senior leadership and technical teams. Stay up to date with the latest advancements in data science, machine learning, and cybersecurity.

Benefits/perks listed below may vary depending on the nature of your employment with Microsoft and the country where you work: Industry leading healthcare. Educational resources. Discounts on products and services. Savings and investments. Maternity and paternity leave. Generous time away. Giving programs. Opportunities to network and connect.

Microsoft is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to age, ancestry, citizenship, color, family or medical care leave, gender identity or expression, genetic information, immigration status, marital status, medical condition, national origin, physical or mental disability, political affiliation, protected veteran or military status, race, ethnicity, religion, sex (including pregnancy), sexual orientation, or any other characteristic protected by applicable local laws, regulations and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application process, read more about requesting accommodations.
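As context for the anomaly-detection work this posting alludes to, a minimal scikit-learn sketch of unsupervised outlier flagging on synthetic data; the features and contamination rate are illustrative assumptions, not Microsoft's models.

```python
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(42)

# Synthetic stand-in for security telemetry features
# (e.g., logon counts, bytes transferred, distinct hosts touched).
normal = rng.normal(loc=0.0, scale=1.0, size=(1000, 3))
attacks = rng.normal(loc=5.0, scale=1.0, size=(10, 3))
X = np.vstack([normal, attacks])

# contamination is the expected anomaly rate; tuning it trades recall
# against false positives.
model = IsolationForest(contamination=0.01, random_state=0).fit(X)
labels = model.predict(X)  # -1 = anomaly, 1 = normal
print(f"flagged {(labels == -1).sum()} of {len(X)} events as anomalous")
```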
Posted 5 days ago
4.0 years
0 Lacs
Hyderābād
Remote
Software Engineer II Hyderabad, Telangana, India Date posted Jul 28, 2025 Job number 1851616 Work site Up to 50% work from home Travel 0-25 % Role type Individual Contributor Profession Software Engineering Discipline Software Engineering Employment type Full-Time

Overview: The Purview team is dedicated to protecting and governing the enterprise digital estate on a global scale. Our mission involves developing cloud solutions that offer premium features such as security, compliance, data governance, data loss prevention and insider risk management. These solutions are fully integrated across Office 365 services and clients, as well as Windows. We create global-scale services to transport, store, secure, and manage some of the most sensitive data on the planet, leveraging Azure, Exchange, and other cloud platforms, along with Office applications like Outlook. The IDC arm of our team is expanding significantly and seeks talented, highly motivated engineers. This is an excellent opportunity for those looking to build expertise in cloud distributed systems, security, and compliance. Our team will develop cloud solutions that meet the demands of a vast user base, utilizing state-of-the-art technologies to deliver comprehensive protection. Office 365, the industry leader in hosted productivity suites, is the fastest-growing business at Microsoft, with over 100 million seats hosted in multiple data centers worldwide. The Purview Engineering team provides leadership, direction, and accountability for application architecture, cloud design, infrastructure development, and end-to-end implementation. You will independently determine and develop architectural approaches and infrastructure solutions, conduct business reviews, and operate our production services. Strong collaboration skills are essential to work closely with other engineering teams, ensuring our services and systems are highly stable, performant, and meet the expectations of both internal and external customers and users. Microsoft's mission is to empower every person and every organization on the planet to achieve more. As employees we come together with a growth mindset, innovate to empower others, and collaborate to realize our shared goals. Each day we build on our values of respect, integrity, and accountability to create a culture of inclusion where everyone can thrive at work and beyond.

Qualifications - Required: Solid understanding of Object-Oriented Programming (OOP) and common Design Patterns. Minimum of 4+ years of software development experience, with proficiency in C#, Java, or Scala. Hands-on experience with cloud platforms such as Azure, AWS, or Google Cloud; experience with Azure Services is a plus. Familiarity with DevOps practices, CI/CD pipelines, and agile methodologies. Strong skills in distributed systems and data processing. Excellent communication and collaboration abilities, with the capacity to handle ambiguity and prioritize effectively. A BS or MS degree in Computer Science or Engineering, or equivalent work experience.

Qualifications - Other Requirements: Ability to meet Microsoft, customer and/or government security screening requirements is required for this role. These requirements include, but are not limited to, the following specialized security screenings: Microsoft Cloud Background Check: This position will be required to pass the Microsoft background and Microsoft Cloud background checks upon hire/transfer and every two years thereafter.

Responsibilities: Build cloud-scale services that process and analyze massive volumes of organizational signals in real time (see the streaming sketch after this posting). Harness the power of Apache Spark for high-performance data processing and scalable pipelines. Apply machine learning to uncover subtle patterns and anomalies that signal insider threats. Craft intelligent user experiences using React and AI-driven insights to help security analysts act with confidence. Work with a modern tech stack and contribute to a product that's mission-critical for some of the world's largest organizations. Collaborate across disciplines—from data science to UX to cloud infrastructure—in a fast-paced, high-impact environment. Design and deliver end-to-end features including system architecture, coding, deployment, scalability, performance, and quality. Develop large-scale distributed software services and solutions that are modular, secure, reliable, diagnosable, and reusable. Conduct investigations and drive investments in complex technical areas to improve systems and services. Ensure engineering excellence by writing effective code, unit tests, debugging, code reviews, and building CI/CD pipelines. Troubleshoot and optimize Live Site operations, focusing on automation, reliability, and monitoring.

Benefits/perks listed below may vary depending on the nature of your employment with Microsoft and the country where you work: Industry leading healthcare. Educational resources. Discounts on products and services. Savings and investments. Maternity and paternity leave. Generous time away. Giving programs. Opportunities to network and connect.

Microsoft is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to age, ancestry, citizenship, color, family or medical care leave, gender identity or expression, genetic information, immigration status, marital status, medical condition, national origin, physical or mental disability, political affiliation, protected veteran or military status, race, ethnicity, religion, sex (including pregnancy), sexual orientation, or any other characteristic protected by applicable local laws, regulations and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application process, read more about requesting accommodations.
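A minimal PySpark Structured Streaming sketch of the kind of real-time signal processing the responsibilities describe; the Kafka broker, topic, and event schema are hypothetical.

```python
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StructType, StructField, StringType, TimestampType

# Requires the spark-sql-kafka package on the Spark classpath.
spark = SparkSession.builder.appName("signals-stream").getOrCreate()

# Hypothetical schema for an organizational signal event.
schema = StructType([
    StructField("user", StringType()),
    StructField("action", StringType()),
    StructField("ts", TimestampType()),
])

# Read raw events from a (hypothetical) Kafka topic.
events = (
    spark.readStream.format("kafka")
         .option("kafka.bootstrap.servers", "broker:9092")
         .option("subscribe", "org-signals")
         .load()
         .select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
         .select("e.*")
)

# Count actions per user over 5-minute windows, a basic building block
# for surfacing unusual activity downstream.
counts = events.groupBy(F.window("ts", "5 minutes"), "user").count()

query = counts.writeStream.outputMode("update").format("console").start()
query.awaitTermination()
```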
Posted 5 days ago
5.0 years
3 - 5 Lacs
Gurgaon
On-site
With 5 years of experience in Python, PySpark, and SQL, you will have the necessary skills to handle a variety of tasks. You will also have hands-on experience with AWS services, including Glue, EMR, Lambda, S3, EC2, and Redshift (a small PySpark-on-AWS sketch follows this posting). Your work mode will be based out of the Virtusa office, allowing you to collaborate with a team of experts. Your main skills should include Scala, Kafka, PySpark, and AWS Native Data Services, as these are mandatory for the role. Additionally, knowledge of Big Data is a nice-to-have skill that will set you apart from other candidates.

About Virtusa: Teamwork, quality of life, professional and personal development: values that Virtusa is proud to embody. When you join us, you join a team of 27,000 people globally that cares about your growth — one that seeks to provide you with exciting projects, opportunities and work with state-of-the-art technologies throughout your career with us. Great minds, great potential: it all comes together at Virtusa. We value collaboration and the team environment of our company, and seek to provide great minds with a dynamic place to nurture new ideas and foster excellence. Virtusa was founded on principles of equal opportunity for all, and so does not discriminate on the basis of race, religion, color, sex, gender identity, sexual orientation, age, non-disqualifying physical or mental disability, national origin, veteran status or any other basis covered by appropriate law. All employment is decided on the basis of qualifications, merit, and business need.
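A small illustrative sketch of the PySpark-on-AWS work described (e.g., run as a Glue or EMR job); the bucket paths, table names, and columns are placeholders.

```python
from pyspark.sql import SparkSession

# On Glue/EMR the session is typically provided; built here for completeness.
spark = SparkSession.builder.appName("s3-join").getOrCreate()

orders = spark.read.parquet("s3://example-bucket/orders/")        # placeholder bucket
customers = spark.read.parquet("s3://example-bucket/customers/")  # placeholder bucket

orders.createOrReplaceTempView("orders")
customers.createOrReplaceTempView("customers")

# Spark SQL keeps the transformation readable for analysts.
daily = spark.sql("""
    SELECT c.region, o.order_date, SUM(o.amount) AS revenue
    FROM orders o JOIN customers c ON o.customer_id = c.id
    GROUP BY c.region, o.order_date
""")

daily.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://example-bucket/marts/daily_revenue/"
)
```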
Posted 5 days ago
10.0 years
0 Lacs
Kolkata, West Bengal, India
On-site
At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

EY GDS – Data and Analytics (D&A) – Cloud Architect - Manager

As part of our EY-GDS D&A (Data and Analytics) team, we help our clients solve complex business challenges with the help of data and technology. We dive deep into data to extract the greatest value and discover opportunities in key businesses and functions like Banking, Insurance, Healthcare, Retail, Manufacturing and Auto, Supply Chain, and Finance.

The opportunity: We're looking for Senior Managers (GTM + Cloud/Big Data Architects) with strong technology and data understanding and proven capability in delivery and pre-sales. This is a fantastic opportunity to be part of a leading firm as well as a growing Data and Analytics team.

Your Key Responsibilities: Have proven experience in driving Analytics GTM/pre-sales by collaborating with senior stakeholders in the client and partner organization in BCM, WAM, and Insurance. Activities will include pipeline building, RFP responses, creating new solutions and offerings, and conducting workshops, as well as managing in-flight projects focused on cloud and big data. Work with clients to convert business problems/challenges into technical solutions considering security, performance, scalability, etc. [10-15 years]. Understand current and future state enterprise architecture. Contribute to various technical streams during implementation of the project. Provide product- and design-level technical best practices. Interact with senior client technology leaders, understand their business goals, and create, architect, propose, develop and deliver technology solutions. Define and develop client-specific best practices around data management within a Hadoop or cloud environment. Recommend design alternatives for data ingestion, processing and provisioning layers. Design and develop data ingestion programs to process large data sets in batch mode using Hive, Pig, Sqoop and Spark (see the sketch following this posting). Develop data ingestion programs to ingest real-time data from live sources using Apache Kafka, Spark Streaming and related technologies.

Skills And Attributes For Success: Architect experience in designing highly scalable solutions on Azure, AWS and GCP. Strong understanding of and familiarity with all Azure/AWS/GCP/Big Data ecosystem components. Strong understanding of underlying Azure/AWS/GCP architectural concepts and distributed computing paradigms. Hands-on programming experience in Apache Spark using Python/Scala and Spark Streaming. Hands-on experience with major components like cloud ETLs, Spark, and Databricks. Experience working with NoSQL in at least one of the data stores - HBase, Cassandra, MongoDB. Knowledge of Spark and Kafka integration, with multiple Spark jobs consuming messages from multiple Kafka partitions. Solid understanding of ETL methodologies in a multi-tiered stack, integrating with Big Data systems like Cloudera and Databricks. Strong understanding of underlying Hadoop architectural concepts and distributed computing paradigms. Good knowledge of Apache Kafka and Apache Flume. Experience in enterprise-grade solution implementations. Experience in performance benchmarking enterprise applications. Experience in data security [in transit, at rest]. Strong UNIX operating system concepts and shell scripting knowledge.

To qualify for the role, you must have: A flexible and proactive/self-motivated working style with strong personal ownership of problem resolution. Excellent communication skills (written and verbal, formal and informal). Ability to multi-task under pressure and work independently with minimal supervision. Must be a team player who enjoys working in a cooperative and collaborative team environment. Adaptability to new technologies and standards. Participation in all aspects of the Big Data solution delivery life cycle, including analysis, design, development, testing, production deployment, and support. Responsibility for the evaluation of technical risks and mapping out mitigation strategies. Working knowledge of at least one cloud platform: AWS, Azure or GCP. Excellent business communication, consulting and quality process skills. Excellence in leading solution architecture, design, build and execution for leading clients in the Banking, Wealth and Asset Management, or Insurance domains. Minimum 7 years hands-on experience in one or more of the above areas. Minimum 10 years industry experience.

Ideally, you'll also have: Strong project management skills. Client management skills. Solutioning skills.

What We Look For: People with technical experience and enthusiasm to learn new things in this fast-moving environment.

What Working At EY Offers: At EY, we're dedicated to helping our clients, from start-ups to Fortune 500 companies — and the work we do with them is as varied as they are. You get to work with inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer: Support, coaching and feedback from some of the most engaging colleagues around. Opportunities to develop new skills and progress your career. The freedom and flexibility to handle your role in a way that's right for you.

EY | Building a better working world. EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
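To make the batch-ingestion responsibility concrete, a minimal PySpark sketch of a Hive-backed batch load; the landing path, columns, and table name are hypothetical.

```python
from pyspark.sql import SparkSession

# Hive support lets Spark read and write the warehouse's managed tables.
spark = (SparkSession.builder
         .appName("batch-ingest")
         .enableHiveSupport()
         .getOrCreate())

# Hypothetical landing-zone files ingested in batch mode.
landing = spark.read.option("header", True).csv("/data/landing/transactions/")

# Basic conformance before loading the warehouse layer.
conformed = landing.dropna(subset=["txn_id"]).dropDuplicates(["txn_id"])

# Append into a (hypothetical) Hive-managed warehouse table.
conformed.write.mode("append").saveAsTable("warehouse.transactions")
```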
Posted 5 days ago
3.0 years
0 Lacs
Guwahati
On-site
Tags: Scala/Java Development, MySQL and Redis, Git, MongoDB, Backend Developer, Jenkins, Scala, Backend Development.

Job Title: Senior Software Engineer – Backend. Location: Guwahati. Experience: 3–4+ years. Education: BE/B.Tech or higher in Computer Science or a related field.

About Vantage Circle: Vantage Circle is a leading SaaS platform offering AI-powered employee engagement solutions to top organizations worldwide. We're growing fast and looking for passionate technologists to help shape the scalable backend services that power our products.

Role Overview: We are seeking a skilled Senior Software Engineer (Backend) with a strong foundation in building high-performance, scalable backend systems. You will play a key role in designing, developing, and deploying critical backend components while mentoring team members and driving technical excellence.

Key Responsibilities: Technical Excellence: Design and develop robust, scalable backend systems and APIs, delivering high-quality, well-tested code aligned with industry best practices. Architectural Contributions: Take ownership of complex backend architecture and systems design; contribute to technology roadmaps that support business objectives. Project Leadership: Lead end-to-end development of critical features and services with minimal supervision, ensuring timely delivery. Mentorship & Coaching: Support junior and mid-level engineers through code reviews, pair programming, and knowledge sharing to elevate overall team performance. Cross-Functional Collaboration: Work closely with product managers, designers, frontend engineers, and DevOps to build cohesive and impactful features. Problem Solving & Innovation: Proactively identify bottlenecks and architectural challenges, and propose and implement innovative solutions to enhance system performance and maintainability.

Preferred Tech Stack: Programming Languages: Scala (preferred), Java. Frameworks: Play Framework or similar Java-based frameworks. Databases: MySQL, MongoDB. Caching/Data Stores: Redis. Tools: Git, Jenkins CI/CD, Docker (bonus).

What We're Looking For: Strong understanding of object-oriented and functional programming paradigms. Experience in designing RESTful APIs and building scalable microservices. Good understanding of relational and NoSQL databases. Familiarity with performance tuning and distributed system design. Ability to thrive in a fast-paced, collaborative, agile environment. Passion for clean code, testing, and continuous improvement.
Posted 5 days ago
4.0 - 8.0 years
6 - 10 Lacs
Bengaluru
Work from Office
Overview: Join Intuit's Business Intelligence (BI) Platform team as we reimagine the next generation of scalable, intelligent data infrastructure. We serve over 240TB of data and 2 billion records daily, and deliver 200+ million report requests through 20+ complex pipelines, supporting enterprise and mid-market customers on their most critical decisions. We are seeking a Senior Data Engineer to join our Data Platform team, with a focus on designing robust data models, building scalable ETL/ELT pipelines, and enabling trustworthy, high-quality data for analytics, reporting, and intelligent systems. In this role, you will play a critical part in evolving our data architecture, ensuring data quality, and building integrations that power analytics and decision-making across the business.

What you'll bring: 6+ years of hands-on experience in data engineering or data platform development. Strong experience in building and optimizing data pipelines using Spark and Flink. Proficiency with DBT for transformation workflows and Kafka for event-driven ingestion. Solid understanding of data modeling principles and best practices in relational and analytical systems. Proven track record in creating and maintaining historical, delta, and snapshot data structures. Familiarity with data quality frameworks and tools for validation and anomaly detection. Experience working with columnar file formats and scalable data storage systems. Strong coding skills in Python or Scala, and familiarity with SQL at scale. Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.

How you will lead: Design and implement scalable ETL and ELT pipelines using tools like Apache Spark, DBT, and Kafka. Own the development of data models that support reporting, analytics, and machine learning use cases. Build and maintain historical, delta, and snapshot tables optimized for large-scale data processing and access patterns (see the sketch after this posting). Work with columnar storage formats (e.g., Parquet, ORC) to optimize performance and storage efficiency. Integrate and automate data validation and quality checks, ensuring trust and accuracy across pipelines. Partner with data platform and product teams to design and deliver seamless data integrations across systems and domains. Contribute to data governance practices, schema evolution, and performance tuning.
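A minimal sketch of the snapshot-versus-delta table pattern mentioned above, using partitioned Parquet; the paths, columns, and watermark value are placeholder assumptions.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("snapshot-delta").getOrCreate()

accounts = spark.read.parquet("/lake/curated/accounts")  # hypothetical source

# Snapshot: a full copy of the table stamped with the load date, so any
# historical state can be re-read directly.
snapshot = accounts.withColumn("snapshot_date", F.current_date())
snapshot.write.mode("append").partitionBy("snapshot_date").parquet(
    "/lake/snapshots/accounts"
)

# Delta: only rows changed since the last load (approximated here by an
# updated_at watermark); much cheaper to store and to merge downstream.
last_load = "2025-07-01"  # placeholder watermark
delta = accounts.filter(F.col("updated_at") > F.lit(last_load))
delta.write.mode("append").parquet("/lake/deltas/accounts")
```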
Posted 5 days ago
6.0 - 10.0 years
8 - 12 Lacs
Gurugram
Work from Office
Join us as a Data Engineer. We're looking for someone to build effortless, digital-first customer experiences to help simplify our organisation and keep our data safe and secure. Day-to-day, you'll develop innovative, data-driven solutions through data pipelines, modelling and ETL design, while striving to be commercially successful through insights. If you're ready for a new challenge, and want to bring a competitive edge to your career profile by delivering streaming data ingestions, this could be the role for you. We're offering this role at associate vice president level.

What you'll do: Your daily responsibilities will include developing a comprehensive knowledge of our data structures and metrics, advocating for change when needed for product development. You'll also provide transformation solutions and carry out complex data extractions. We'll expect you to develop a clear understanding of data platform cost levels to build cost-effective and strategic solutions. You'll also source new data by using the most appropriate tooling before integrating it into the overall solution to deliver it to our customers.

You'll also be responsible for: Driving customer value by understanding complex business problems and requirements to correctly apply the most appropriate and reusable tools to build data solutions. Participating in the data engineering community to deliver opportunities to support our strategic direction. Carrying out complex data engineering tasks to build a scalable data architecture and the transformation of data to make it usable to analysts and data scientists. Building advanced automation of data engineering pipelines through the removal of manual stages. Leading on the planning and design of complex products and providing guidance to colleagues and the wider team when required.

The skills you'll need: To be successful in this role, you'll have an understanding of data usage and dependencies with wider teams and the end customer. You'll also have experience of extracting value and features from large-scale data. We'll expect you to have experience of ETL technical design, data quality testing, cleansing and monitoring, data sourcing, exploration and analysis, and data warehousing and data modelling capabilities.

You'll also need: Experience of using a programming language such as Python for developing custom operators and sensors in Airflow, improving workflow capabilities and reliability (see the sketch after this posting). Good knowledge of Kafka and Kinesis for effective real-time data processing, and of Scala and Spark to enhance data processing efficiency and scalability. Great communication skills with the ability to proactively engage with a range of stakeholders.
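A minimal sketch of the kind of custom Airflow operator mentioned above; the check it performs is illustrative, and the row count is hard-coded to keep the example self-contained (a real operator would query the warehouse via a hook).

```python
from airflow.models.baseoperator import BaseOperator


class RowCountCheckOperator(BaseOperator):
    """Illustrative operator: fail the task if a feed delivers too few rows."""

    def __init__(self, table: str, min_rows: int, **kwargs):
        super().__init__(**kwargs)
        self.table = table
        self.min_rows = min_rows

    def execute(self, context):
        # Placeholder result; in practice this would come from a database hook.
        observed = 1_000_000
        if observed < self.min_rows:
            raise ValueError(f"{self.table}: {observed} rows < {self.min_rows}")
        self.log.info("%s passed row-count check (%d rows)", self.table, observed)
```

In a DAG this would be wired up like any other task, e.g. `RowCountCheckOperator(task_id="check_feed", table="daily_feed", min_rows=100_000)`.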
Posted 5 days ago
2.0 - 5.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Overview: Come join the "Unified Ingestion Platform (UIP)" team under the A2D org as a "Data Engineer 2". UIP is a designated paved platform at Intuit for data ingestion/movement from one hosting location to another. As a Data Engineer, you will be working on cutting-edge technologies to create a world-class data movement platform. This is the place to be if it is your passion to build highly reliable and scalable ingestion capabilities on cloud and push the boundaries of automation!

What you'll bring: BE/B.Tech/MS in Computer Science (or equivalent). 2 to 5 years of experience in a Data Engineering role with good knowledge of software engineering. Strong CS fundamentals, including data structures, algorithms and distributed systems. Robust problem-solving, decision-making, and analytical skills. Expert-level experience in designing high-throughput data solutions/services. Hands-on experience on AWS (EC2, EMR, S3, Athena, Kinesis, Lambda, etc.). Knowledge of GCP (DataProc, GCS, BigQuery, etc.) is a plus. Strong programming knowledge in one of the languages - Java, Scala or Python. Expert-level experience in developing data pipelines/solutions using processing engines like Hive, Spark, Spark Streaming, Flink, etc. Good knowledge of Lakehouse architecture for data persistence. Delta Lake, Iceberg or Hudi knowledge is a plus. Adequate experience with RESTful web services and microservice architectures.

How you will lead: Design and build capabilities to support batch and real-time ingestion at scale using open-source technologies that are fault tolerant. Design solutions that involve complex, multi-system and multi-cloud integration, possibly across BUs or domains. Own end-to-end engineering - design, development, testing, deployment and operations. Work in a dynamic environment, adapting to business requirements using Agile methodologies and a DevOps culture. Conduct code reviews to ensure code quality, consistency and best-practices adherence. Conduct quick Proofs of Concept (POCs) for feasibility studies and take them to production. Lead by example, demonstrating best practices for unit testing, CI/CD, performance testing, capacity planning, documentation, monitoring, alerting, and incident response.
Posted 5 days ago
4.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
The Purview team is dedicated to protecting and governing the enterprise digital estate on a global scale. Our mission involves developing cloud solutions that offer premium features such as security, compliance, data governance, data loss prevention and insider risk management. These solutions are fully integrated across Office 365 services and clients, as well as Windows. We create global-scale services to transport, store, secure, and manage some of the most sensitive data on the planet, leveraging Azure, Exchange, and other cloud platforms, along with Office applications like Outlook. The IDC arm of our team is expanding significantly and seeks talented, highly motivated engineers. This is an excellent opportunity for those looking to build expertise in cloud distributed systems, security, and compliance. Our team will develop cloud solutions that meet the demands of a vast user base, utilizing state-of-the-art technologies to deliver comprehensive protection. Office 365, the industry leader in hosted productivity suites, is the fastest-growing business at Microsoft, with over 100 million seats hosted in multiple data centers worldwide. The Purview Engineering team provides leadership, direction, and accountability for application architecture, cloud design, infrastructure development, and end-to-end implementation. You will independently determine and develop architectural approaches and infrastructure solutions, conduct business reviews, and operate our production services. Strong collaboration skills are essential to work closely with other engineering teams, ensuring our services and systems are highly stable, performant, and meet the expectations of both internal and external customers and users. Microsoft's mission is to empower every person and every organization on the planet to achieve more. As employees we come together with a growth mindset, innovate to empower others, and collaborate to realize our shared goals. Each day we build on our values of respect, integrity, and accountability to create a culture of inclusion where everyone can thrive at work and beyond.

Responsibilities: Build cloud-scale services that process and analyze massive volumes of organizational signals in real time. Harness the power of Apache Spark for high-performance data processing and scalable pipelines. Apply machine learning to uncover subtle patterns and anomalies that signal insider threats. Craft intelligent user experiences using React and AI-driven insights to help security analysts act with confidence. Work with a modern tech stack and contribute to a product that's mission-critical for some of the world's largest organizations. Collaborate across disciplines—from data science to UX to cloud infrastructure—in a fast-paced, high-impact environment. Design and deliver end-to-end features including system architecture, coding, deployment, scalability, performance, and quality. Develop large-scale distributed software services and solutions that are modular, secure, reliable, diagnosable, and reusable. Conduct investigations and drive investments in complex technical areas to improve systems and services. Ensure engineering excellence by writing effective code, unit tests, debugging, code reviews, and building CI/CD pipelines. Troubleshoot and optimize Live Site operations, focusing on automation, reliability, and monitoring.

Qualifications - Required: Solid understanding of Object-Oriented Programming (OOP) and common Design Patterns. Minimum of 4+ years of software development experience, with proficiency in C#, Java, or Scala. Hands-on experience with cloud platforms such as Azure, AWS, or Google Cloud; experience with Azure Services is a plus. Familiarity with DevOps practices, CI/CD pipelines, and agile methodologies. Strong skills in distributed systems and data processing. Excellent communication and collaboration abilities, with the capacity to handle ambiguity and prioritize effectively. A BS or MS degree in Computer Science or Engineering, or equivalent work experience.

Qualifications - Other Requirements: Ability to meet Microsoft, customer and/or government security screening requirements is required for this role. These requirements include, but are not limited to, the following specialized security screenings: Microsoft Cloud Background Check: This position will be required to pass the Microsoft background and Microsoft Cloud background checks upon hire/transfer and every two years thereafter.

Microsoft is an equal opportunity employer. Consistent with applicable law, all qualified applicants will receive consideration for employment without regard to age, ancestry, citizenship, color, family or medical care leave, gender identity or expression, genetic information, immigration status, marital status, medical condition, national origin, physical or mental disability, political affiliation, protected veteran or military status, race, ethnicity, religion, sex (including pregnancy), sexual orientation, or any other characteristic protected by applicable local laws, regulations and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application process, read more about requesting accommodations.
Posted 5 days ago
10.0 years
0 Lacs
Kanayannur, Kerala, India
On-site
At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

EY GDS – Data and Analytics (D&A) – Cloud Architect - Manager

As part of our EY-GDS D&A (Data and Analytics) team, we help our clients solve complex business challenges with the help of data and technology. We dive deep into data to extract the greatest value and discover opportunities in key businesses and functions like Banking, Insurance, Healthcare, Retail, Manufacturing and Auto, Supply Chain, and Finance.

The opportunity: We're looking for Senior Managers (GTM + Cloud/Big Data Architects) with strong technology and data understanding and proven capability in delivery and pre-sales. This is a fantastic opportunity to be part of a leading firm as well as a growing Data and Analytics team.

Your Key Responsibilities: Have proven experience in driving Analytics GTM/pre-sales by collaborating with senior stakeholders in the client and partner organization in BCM, WAM, and Insurance. Activities will include pipeline building, RFP responses, creating new solutions and offerings, and conducting workshops, as well as managing in-flight projects focused on cloud and big data. Work with clients to convert business problems/challenges into technical solutions considering security, performance, scalability, etc. [10-15 years]. Understand current and future state enterprise architecture. Contribute to various technical streams during implementation of the project. Provide product- and design-level technical best practices. Interact with senior client technology leaders, understand their business goals, and create, architect, propose, develop and deliver technology solutions. Define and develop client-specific best practices around data management within a Hadoop or cloud environment. Recommend design alternatives for data ingestion, processing and provisioning layers. Design and develop data ingestion programs to process large data sets in batch mode using Hive, Pig, Sqoop and Spark. Develop data ingestion programs to ingest real-time data from live sources using Apache Kafka, Spark Streaming and related technologies.

Skills And Attributes For Success: Architect experience in designing highly scalable solutions on Azure, AWS and GCP. Strong understanding of and familiarity with all Azure/AWS/GCP/Big Data ecosystem components. Strong understanding of underlying Azure/AWS/GCP architectural concepts and distributed computing paradigms. Hands-on programming experience in Apache Spark using Python/Scala and Spark Streaming. Hands-on experience with major components like cloud ETLs, Spark, and Databricks. Experience working with NoSQL in at least one of the data stores - HBase, Cassandra, MongoDB. Knowledge of Spark and Kafka integration, with multiple Spark jobs consuming messages from multiple Kafka partitions. Solid understanding of ETL methodologies in a multi-tiered stack, integrating with Big Data systems like Cloudera and Databricks. Strong understanding of underlying Hadoop architectural concepts and distributed computing paradigms. Good knowledge of Apache Kafka and Apache Flume. Experience in enterprise-grade solution implementations. Experience in performance benchmarking enterprise applications. Experience in data security [in transit, at rest]. Strong UNIX operating system concepts and shell scripting knowledge.

To qualify for the role, you must have: A flexible and proactive/self-motivated working style with strong personal ownership of problem resolution. Excellent communication skills (written and verbal, formal and informal). Ability to multi-task under pressure and work independently with minimal supervision. Must be a team player who enjoys working in a cooperative and collaborative team environment. Adaptability to new technologies and standards. Participation in all aspects of the Big Data solution delivery life cycle, including analysis, design, development, testing, production deployment, and support. Responsibility for the evaluation of technical risks and mapping out mitigation strategies. Working knowledge of at least one cloud platform: AWS, Azure or GCP. Excellent business communication, consulting and quality process skills. Excellence in leading solution architecture, design, build and execution for leading clients in the Banking, Wealth and Asset Management, or Insurance domains. Minimum 7 years hands-on experience in one or more of the above areas. Minimum 10 years industry experience.

Ideally, you'll also have: Strong project management skills. Client management skills. Solutioning skills.

What We Look For: People with technical experience and enthusiasm to learn new things in this fast-moving environment.

What Working At EY Offers: At EY, we're dedicated to helping our clients, from start-ups to Fortune 500 companies — and the work we do with them is as varied as they are. You get to work with inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer: Support, coaching and feedback from some of the most engaging colleagues around. Opportunities to develop new skills and progress your career. The freedom and flexibility to handle your role in a way that's right for you.

EY | Building a better working world. EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 5 days ago
3.0 years
15 - 18 Lacs
Mumbai Metropolitan Region
Remote
Job Title: Business Intelligence (BI) Developer 27165. Location: Goregaon East, Mumbai (Nesco). Work Mode: Hybrid (4 days from office, 1 day remote). Work Hours: 12:30 PM – 9:30 PM IST. Interview Process: 2 in-person rounds. Notice Period: Immediate joiners preferred. Travel Time: Should not exceed 1 hour.

Role Summary: A key engineering role within the Technology Organization, this position supports the Information Solutions function, which is responsible for delivering and operating data architecture, pipelines, platforms, and analytic solutions in alignment with the overall data strategy. The BI Developer will collaborate with stakeholders across the enterprise to implement modern data engineering practices, data integration, and insightful business intelligence solutions.

Key Responsibilities: Design and develop ETL routines to integrate data from diverse source systems. Define source-to-target mappings, transformation logic, and business rules (see the sketch after this posting). Develop dashboards and paginated reports using tools like Power BI and SSRS. Ensure data quality through process adherence and appropriate tooling. Contribute to and maintain data catalogs and dictionaries. Build and maintain data marts and data lakes to support enterprise initiatives. Perform business analysis to identify data requirements and drive insights. Translate business problems into data models and actionable solutions. Lead the design, development, and maintenance of complex BI reports and dashboards. Collaborate with business users to validate data outputs and train teams in BI tool usage. Initiate and lead analytics projects focused on business optimization. Provide technical support for the Microsoft BI stack. Conduct comprehensive data analysis and support warehouse/data mart solutions. Lead end-to-end design sessions for data integration projects. Deliver scalable and reusable BI solutions aligned with evolving business needs.

Required Skills & Experience: SQL Server Integration Services (SSIS): 3+ years. SQL Server Reporting Services (SSRS): 3+ years. SQL Server Analysis Services (SSAS): 3+ years. Microsoft Business Intelligence (MSBI): 2+ years. SQL Server databases: 4+ years. Experience in data warehousing and data integration processes. Expertise in developing data models and visualizations with Power BI. Strong knowledge of paginated reports and SSRS design. Proficiency in data analysis, ETL optimization, and automation. Experience in designing dashboards, visual analytics, and reports. Solid command of Microsoft Excel. Excellent communication, documentation, and stakeholder engagement skills. Ability to work independently in a fast-paced environment.

Nice To Have: Experience with advanced analytics tools such as R, Python, SAS, or Scala. Exposure to DevOps practices and CI/CD pipelines. Familiarity with cloud platforms like Azure, AWS, or Snowflake. Experience with Power BI Report Server setup and administration. Background in data lake architecture and reference data management. Exposure to other BI tools such as Tableau, QlikView/QlikSense, or MicroStrategy.

Skills: data integration, data analysis, SSIS, SSRS, SSAS, MSBI, ETL optimization, business intelligence, dashboard design, Microsoft Excel, data warehousing, visualizations, SQL Server databases, Power BI, data models.
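The posting's stack is the Microsoft BI suite, but the source-to-target mapping idea it describes can be sketched compactly in Python/pandas; all column names and transformation rules here are hypothetical.

```python
import pandas as pd

# Hypothetical source extract.
source = pd.DataFrame({
    "cust_nm": ["Acme Ltd ", "Globex"],
    "ord_amt": ["1,200.50", "980.00"],
    "ord_dt":  ["2025-07-01", "2025-07-02"],
})

# Source-to-target mapping: target column -> (source column, transformation rule).
mapping = {
    "customer_name": ("cust_nm", str.strip),
    "order_amount":  ("ord_amt", lambda s: float(s.replace(",", ""))),
    "order_date":    ("ord_dt",  pd.Timestamp),
}

# Apply each rule element-wise to build the conformed target table.
target = pd.DataFrame({
    tgt: source[src].map(rule) for tgt, (src, rule) in mapping.items()
})
print(target.dtypes)  # typed target columns ready for the warehouse load
```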
Posted 5 days ago
10.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. EY GDS – Data and Analytics (D&A) – Cloud Architect - Manager As part of our EY-GDS D&A (Data and Analytics) team, we help our clients solve complex business challenges with the help of data and technology. We dive deep into data to extract the greatest value and discover opportunities in key business and functions like Banking, Insurance, Manufacturing, Healthcare, Retail, Manufacturing and Auto, Supply Chain, and Finance. The opportunity We’re looking for SeniorManagers (GTM +Cloud/ Big Data Architects) with strong technology and data understanding having proven delivery capability in delivery and pre sales. This is a fantastic opportunity to be part of a leading firm as well as a part of a growing Data and Analytics team. Your Key Responsibilities Have proven experience in driving Analytics GTM/Pre-Sales by collaborating with senior stakeholder/s in the client and partner organization in BCM, WAM, Insurance. Activities will include pipeline building, RFP responses, creating new solutions and offerings, conducting workshops as well as managing in flight projects focused on cloud and big data. Need to work with client in converting business problems/challenges to technical solutions considering security, performance, scalability etc. [ 10- 15 years] Need to understand current & Future state enterprise architecture. Need to contribute in various technical streams during implementation of the project. Provide product and design level technical best practices Interact with senior client technology leaders, understand their business goals, create, architect, propose, develop and deliver technology solutions Define and develop client specific best practices around data management within a Hadoop environment or cloud environment Recommend design alternatives for data ingestion, processing and provisioning layers Design and develop data ingestion programs to process large data sets in Batch mode using HIVE, Pig and Sqoop, Spark Develop data ingestion programs to ingest real-time data from LIVE sources using Apache Kafka, Spark Streaming and related technologies Skills And Attributes For Success Architect in designing highly scalable solutions Azure, AWS and GCP. Strong understanding & familiarity with all Azure/AWS/GCP /Bigdata Ecosystem components Strong understanding of underlying Azure/AWS/GCP Architectural concepts and distributed computing paradigms Hands-on programming experience in Apache Spark using Python/Scala and Spark Streaming Hands on experience with major components like cloud ETLs,Spark, Databricks Experience working with NoSQL in at least one of the data stores - HBase, Cassandra, MongoDB Knowledge of Spark and Kafka integration with multiple Spark jobs to consume messages from multiple Kafka partitions Solid understanding of ETL methodologies in a multi-tiered stack, integrating with Big Data systems like Cloudera and Databricks. Strong understanding of underlying Hadoop Architectural concepts and distributed computing paradigms Experience working with NoSQL in at least one of the data stores - HBase, Cassandra, MongoDB Good knowledge in apache Kafka & Apache Flume Experience in Enterprise grade solution implementations. 
Experience in performance benchmarking enterprise applications. Experience in data security [on the move, at rest]. Strong UNIX operating system concepts and shell scripting knowledge.

To qualify for the role, you must have

A flexible and proactive/self-motivated working style with strong personal ownership of problem resolution. Excellent communication skills (written and verbal, formal and informal). Ability to multi-task under pressure and work independently with minimal supervision. Must be a team player who enjoys working in a cooperative and collaborative team environment. Adaptability to new technologies and standards. Participation in all aspects of the big data solution delivery life cycle, including analysis, design, development, testing, production deployment and support. Responsibility for evaluating technical risks and mapping out mitigation strategies. Working knowledge of at least one cloud platform: AWS, Azure or GCP. Excellent business communication, consulting and quality process skills. Excellence in leading solution architecture, design, build and execution for leading clients in the Banking, Wealth & Asset Management or Insurance domains. Minimum 7 years of hands-on experience in one or more of the above areas. Minimum 10 years of industry experience.

Ideally, you’ll also have

Strong project management skills. Client management skills. Solutioning skills.

What We Look For

People with technical experience and enthusiasm to learn new things in this fast-moving environment.

What Working At EY Offers

At EY, we’re dedicated to helping our clients, from start-ups to Fortune 500 companies, and the work we do with them is as varied as they are. You get to work on inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees, and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer: support, coaching and feedback from some of the most engaging colleagues around; opportunities to develop new skills and progress your career; and the freedom and flexibility to handle your role in a way that’s right for you.

EY | Building a better working world

EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
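As referenced in the responsibilities above, a common real-time ingestion pattern pairs Kafka with Spark Structured Streaming. The following is a minimal sketch only, not part of the role description: the broker address, topic, schema, and output paths are hypothetical, and the job assumes the spark-sql-kafka connector package is supplied at submit time.

```python
# Minimal sketch: consume Kafka messages with Spark Structured Streaming
# and land them as Parquet files. All names and paths are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import from_json, col
from pyspark.sql.types import StructType, StructField, StringType, DoubleType

spark = SparkSession.builder.appName("kafka-ingest-sketch").getOrCreate()

# Hypothetical schema for incoming JSON events.
schema = StructType([
    StructField("symbol", StringType()),
    StructField("price", DoubleType()),
    StructField("ts", StringType()),
])

raw = (spark.readStream
       .format("kafka")
       .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
       .option("subscribe", "trades")                     # placeholder topic
       .load())

# Kafka values arrive as bytes; cast to string, then parse the JSON payload.
events = (raw.selectExpr("CAST(value AS STRING) AS json")
          .select(from_json(col("json"), schema).alias("e"))
          .select("e.*"))

# Checkpointing makes the stream restartable and its file output fault-tolerant.
query = (events.writeStream
         .format("parquet")
         .option("path", "/data/bronze/trades")           # placeholder path
         .option("checkpointLocation", "/chk/trades")
         .start())
query.awaitTermination()
```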
Posted 5 days ago
10.0 years
0 Lacs
Mumbai Metropolitan Region
On-site
Position Overview

Apollo is seeking an ESG Analytics Platform Developer to join the Global Technology team. This individual will report to the Head of ESG Tech & Data Strategy and serve as a key developer resource to address ESG data and tech needs from the investment business by building out ESG data pipelines, microservices, interactive frontend UIs, dashboards and applications. Additionally, the individual will work with other members of the Global Technology team, investment teams and the dedicated ESG team to produce proprietary and unique ESG insights. The successful candidate will be an integral part of both the Global Technology and ESG/Sustainability platforms, working collaboratively across the Equity, Hybrid and Yield business lines.

Process Engineering: Work across the Equity, Hybrid and Yield business lines, along with relevant technology teams, to design and implement solutions that further the integration of ESG considerations into the investment process, product development and reporting. Data Quality: Identify and address ESG data matters, such as approaches to missing data, data quality issues and leveraging structured/unstructured data (a brief illustration follows this posting). Reporting & Analytics: Collaborate with the ESG team to further develop ESG reporting and analytics, along with broader communication with LPs and clients. Member of the Investment Technology team, responsible for the full lifecycle of development tasks: analyze, design and code business-related solutions, as well as core architectural changes, using an Agile programming approach, resulting in software delivered to tight deadlines. Participate and contribute in design discussions and code reviews. Understand changing priorities and be forward-thinking in context switching. Build positive relationships with other team members, collaborate, and communicate effectively to reach successful outcomes. Utilize problem-solving skills to help your peers in the research and selection of tools, products and frameworks (which is vital to support business initiatives).

Qualifications & Experience

Bachelor’s and/or Master’s degree in Computer Science or another STEM field. 10+ years of proven hands-on full-stack development expertise. 1-3 years in an ESG-technology-focused role; knowledge of financial instruments (e.g., fixed income, alternatives, equities and derivatives). Proficiency in Python; an object-oriented language such as Java or C/C++ is desirable. Proficiency in JavaScript, Node.js, HTML, CSS and/or other front-end development languages. Experience implementing data visualizations in Tableau, Dash or Power BI. Working knowledge of Scala, Java and/or kdb+/Q. SQL experience and the handling of large datasets, preferably in a Snowflake environment. Working knowledge of Jupyter notebooks, ipywidgets, Voila and associated Python packages is a plus. Comfortable working in an agile software delivery environment, with Git and with exposure to CI/CD tools (e.g., GitHub, Docker, Jenkins). Experience with messaging systems across the application stack (e.g., Kafka). Strong software development experience and a proven track record of delivering full-stack cloud-based applications. Demonstrated ability to build and deploy microservices in scalable, high-throughput workflows. Passion for clean, maintainable code and a drive to keep improving your engineering skills in fast-paced, ambiguous environments. Superior interpersonal skills; builds and maintains strong relationships/credibility with external counterparties.
Strong communication and influencing skills with partners and peers at all levels of the organization. Apollo provides equal employment opportunities regardless of age, disability, gender reassignment, marital or civil partner status, pregnancy or maternity, race, colour, nationality, ethnic or national origin, religion or belief, veteran status, gender/sex or sexual orientation, or any other criterion or circumstance protected by applicable law, ordinance, or regulation. The above criteria are intended to be used as a guide only; candidates who do not meet all the above criteria may still be considered if they are deemed to have relevant experience/equivalent levels of skill or knowledge to fulfil the requirements of the role. Any job offer will be conditional upon and subject to satisfactory reference and background screening checks, all necessary corporate and regulatory approvals or certifications as required from time to time, and entering into definitive contractual documentation satisfactory to Apollo.
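To make the missing-data point above concrete, here is a small illustrative sketch in pandas. The issuer names, sector grouping, and median-imputation rule are assumptions for illustration only, not Apollo’s methodology.

```python
# Sketch: flag missing ESG scores explicitly, then impute with a sector-level
# median so gaps do not silently bias portfolio-level aggregates.
import pandas as pd

scores = pd.DataFrame({
    "issuer": ["A", "B", "C", "D"],          # hypothetical issuers
    "sector": ["Energy", "Energy", "Tech", "Tech"],
    "esg_score": [62.0, None, 81.0, None],
})

scores["esg_imputed"] = scores["esg_score"].isna()   # keep an audit trail
scores["esg_score"] = (scores.groupby("sector")["esg_score"]
                       .transform(lambda s: s.fillna(s.median())))
print(scores)
```

Keeping the `esg_imputed` flag alongside the filled value preserves lineage, so downstream reporting can distinguish observed scores from estimated ones.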
Posted 5 days ago
6.0 - 10.0 years
16 - 25 Lacs
Pune, Chennai
Work from Office
JOB DETAILS:

Data Pipeline Development: Design, develop, test, and deploy robust, high-performance, and scalable ETL/ELT data pipelines using Scala and Apache Spark to ingest, process, and transform large volumes of structured and unstructured data from diverse sources (a brief sketch follows this posting). Big Data Expertise: Leverage expertise in the Hadoop ecosystem (HDFS, Hive, etc.) and distributed computing principles to build efficient and fault-tolerant data solutions. Advanced SQL: Write complex, optimized SQL queries and stored procedures. Performance Optimization: Continuously monitor, analyze, and optimize the performance of data pipelines and data stores. Troubleshoot complex data-related issues, identify bottlenecks, and implement solutions for improved efficiency and reliability. Data Quality & Governance: Implement data quality checks, validation rules, and reconciliation processes to ensure the accuracy, completeness, and consistency of data. Contribute to data governance and security best practices. Automation & CI/CD: Implement automation for data pipeline deployment, monitoring, and alerting using tools like Apache Airflow, Jenkins, or similar CI/CD platforms. Documentation: Create and maintain comprehensive technical documentation for data architectures, pipelines, and processes.

Required Skills & Qualifications: Bachelor's or Master's degree in Computer Science, Engineering, or a related quantitative field. Minimum 5 years of professional experience in Data Engineering, with a strong focus on big data technologies. Proficiency in Scala for developing big data applications and transformations, especially with Apache Spark. Expert-level proficiency in SQL; ability to write complex queries, optimize performance, and understand database internals. Extensive hands-on experience with Apache Spark (Spark SQL, DataFrames, RDDs) for large-scale data processing and analytics. Solid understanding of distributed computing concepts and experience with the Hadoop ecosystem (HDFS, Hive). Experience with building and optimizing ETL/ELT processes and data warehousing concepts. Strong understanding of data modeling techniques (e.g., Star Schema, Snowflake Schema). Familiarity with version control systems (e.g., Git). Excellent problem-solving, analytical, and communication skills. Ability to work independently and collaboratively in an Agile team environment.
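The pipeline pattern described above can be sketched briefly. The posting emphasizes Scala; for consistency with the other examples in this document, the sketch below uses PySpark, and all paths, column names, and the 95% quality threshold are hypothetical assumptions.

```python
# Sketch of a batch ETL job: ingest raw CSV, validate and deduplicate,
# gate on a simple data-quality check, then load partitioned Parquet.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("etl-sketch").getOrCreate()

orders = spark.read.option("header", True).csv("/raw/orders/")  # placeholder

cleaned = (orders
           .withColumn("amount", F.col("amount").cast("double"))
           .filter(F.col("order_id").isNotNull())   # simple validation rule
           .dropDuplicates(["order_id"]))           # reconciliation step

# Fail fast if too many rows were rejected (assumed 95% threshold).
total, kept = orders.count(), cleaned.count()
if total and kept / total < 0.95:
    raise ValueError(f"Data-quality check failed: kept {kept}/{total} rows")

(cleaned.write
 .mode("overwrite")
 .partitionBy("order_date")                         # assumed partition key
 .parquet("/warehouse/orders/"))
```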
Posted 5 days ago
5.0 - 10.0 years
25 - 35 Lacs
Bengaluru
Hybrid
Job Title: SDE 3 Senior Data Engineer
Location: Bengaluru (Hybrid, 3 days/week in office)
Experience: 8-11 Years
Type: Full-time
Apply: Share your resume with the details listed below to vijay.s@xebia.com
Availability: Immediate joiners or max 2 weeks' notice period only

About the Role

Xebia is looking for an experienced and hands-on SDE 3 Senior Data Engineer to lead the development of scalable data solutions. As a senior IC (individual contributor), you’ll influence architecture decisions, coach teams, and deliver high-performance data engineering systems for large-scale, enterprise environments. You’ll work across the full data lifecycle, from ingestion and storage to transformation and analytics, leveraging technologies like Spark, Scala, SQL, cloud-native tools, and Hadoop, in a fast-paced, agile environment.

Key Responsibilities

Lead the design and implementation of data pipelines using Apache Spark and Scala. Architect cloud-native, scalable, and fault-tolerant data platforms (Azure preferred). Drive development of streaming pipelines using Kafka/Event Hub/Spark Streaming. Guide system design with a focus on scalability, low latency, and performance. Work on structured and unstructured data, data lakes, and Medallion Architecture (a minimal bronze-to-silver sketch follows this posting). Collaborate with stakeholders, mentor junior engineers, and lead Agile squads. Implement best practices for CI/CD, containerization (Docker/Kubernetes), and orchestration (Airflow/Oozie).

Must-Have Skills

Apache Spark, Scala (or Java, with a strong preference for Scala). SQL, data structures, and query optimization. Hadoop and distributed systems. Cloud-native architecture (Azure preferred). System design and big data design patterns. CI/CD: Git, Jenkins, Docker, Kubernetes. Kafka/Structured Streaming. Experience with NoSQL, messaging queues, and orchestration tools.

Good-to-Have Skills

Apache Iceberg, Parquet, Ceph, Kafka Connect. Experience with data governance tools: Alation, Collibra. Data lakes and Medallion Architecture. Metadata management, master data management. Data quality and lineage frameworks.

Why Xebia?

At Xebia, you’ll work with passionate technologists solving large-scale problems using modern data stacks. We foster innovation, cross-functional learning, and continuous growth. Be part of a dynamic team that delivers real impact in data-driven enterprises.

To Apply

Please share your updated resume and include the following details in your email to vijay.s@xebia.com: Full Name, Total Experience, Current CTC, Expected CTC, Current Location, Preferred Xebia Location (Bengaluru), Notice Period / Last Working Day (if serving), Primary Skills, LinkedIn Profile URL.

Note: Only candidates who can join immediately or within 2 weeks will be considered. Join Xebia and shape the future of data engineering in enterprise systems.
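The Medallion Architecture mentioned above layers raw (bronze), cleaned (silver), and curated (gold) tables. Below is a minimal bronze-to-silver sketch, assuming a Databricks or Delta Lake runtime; the source path, table locations, and event_id key are hypothetical.

```python
# Sketch: medallion bronze -> silver hop on Delta Lake.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("medallion-sketch").getOrCreate()

# Bronze: append raw events as-is, stamped with an ingestion timestamp.
bronze = (spark.read.json("/landing/events/")        # placeholder source
          .withColumn("_ingested_at", F.current_timestamp()))
bronze.write.format("delta").mode("append").save("/lake/bronze/events")

# Silver: cleaned, deduplicated records ready for downstream analytics.
silver = (spark.read.format("delta").load("/lake/bronze/events")
          .filter(F.col("event_id").isNotNull())
          .dropDuplicates(["event_id"]))
silver.write.format("delta").mode("overwrite").save("/lake/silver/events")
```

Keeping bronze immutable and silver re-derivable is what makes the pattern fault-tolerant: a silver-layer bug can be fixed by replaying from bronze.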
Posted 5 days ago
5.0 - 10.0 years
25 - 35 Lacs
Gurugram
Hybrid
Job Title: Data Engineer (Apache Spark, Scala, GCP & Azure)
Location: Gurugram (Hybrid, 3 days/week in office)
Experience: 5-10 Years
Type: Full-time
Apply: Share your resume with the details listed below to vijay.s@xebia.com
Availability: Immediate joiners or max 2 weeks' notice period only

About the Role

Xebia is looking for a skilled Data Engineer to join our fast-paced team in Gurugram. You will work on building and optimizing scalable data pipelines, processing large datasets using Apache Spark and Scala, and deploying on cloud platforms like GCP and Azure. If you're passionate about clean architecture, high-quality data flow, and performance tuning, this is the opportunity for you.

Key Responsibilities

Design and develop robust ETL pipelines using Apache Spark. Write clean and efficient data processing code in Scala. Handle large-scale data movement, transformation, and storage. Build solutions on Google Cloud Platform (GCP) and Microsoft Azure. Collaborate with teams to define data strategies and ensure data quality. Optimize jobs for performance and cost on distributed systems. Document technical designs and ETL flows clearly for the team.

Must-Have Skills

Apache Spark. Scala. ETL design and development. Cloud platforms: GCP and Azure. Strong understanding of data engineering best practices. Solid communication and collaboration skills.

Good-to-Have Skills

Apache tools (Kafka, Beam, Airflow, etc.; an orchestration sketch follows this posting). Knowledge of data lake and data warehouse concepts. CI/CD for data pipelines. Exposure to modern data monitoring and observability tools.

Why Xebia?

At Xebia, you’ll be part of a forward-thinking, tech-savvy team working on high-impact, global data projects. We prioritize clean code, scalable solutions, and continuous learning. Join us to build real-time, cloud-native data platforms that power business intelligence across industries.

To Apply

Please share your updated resume and include the following details in your email to vijay.s@xebia.com: Full Name, Total Experience, Current CTC, Expected CTC, Current Location, Preferred Xebia Location (Gurugram), Notice Period / Last Working Day (if serving), Primary Skills, LinkedIn Profile URL.

Note: Only candidates who can join immediately or within 2 weeks will be considered. Build intelligent, scalable data solutions with Xebia; let's shape the future of data together.
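As a sketch of the orchestration tooling listed among the good-to-have skills, here is a minimal Apache Airflow DAG that schedules a nightly Spark job. The DAG id, schedule, and spark-submit command are illustrative assumptions.

```python
# Sketch: a one-task Airflow DAG that runs a Spark ETL job nightly.
from datetime import datetime
from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_spark_etl",                 # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="0 2 * * *",            # run nightly at 02:00
    catchup=False,
) as dag:
    run_etl = BashOperator(
        task_id="run_spark_etl",
        # Placeholder command; a provider-specific Spark operator would also work.
        bash_command="spark-submit --master yarn /jobs/etl_job.py",
    )
```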
Posted 5 days ago
10.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
About the Job

The Director, Data Engineering will lead the development and implementation of a comprehensive data strategy that aligns with the organization’s business goals and enables data-driven decision making.

Roles and Responsibilities

Build and manage a team of talented data managers and engineers with the ability to not only keep up with, but also pioneer in, this space. Collaborate with and influence leadership to directly impact company strategy and direction. Develop new techniques and data pipelines that will enable various insights for internal and external customers. Develop deep partnerships with client implementation teams, engineering, and product teams to deliver on major cross-functional measurements and testing. Communicate effectively to all levels of the organization, including executives. Partner successfully with teams of dramatically varying backgrounds, from the highly technical to the highly creative. Design a data engineering roadmap and execute the vision behind it. Hire, lead, and mentor a world-class data team. Partner with other business areas to co-author and co-drive strategies on our shared roadmap. Oversee the movement of large amounts of data into our data lake. Establish a customer-centric approach and synthesize customer needs. Own end-to-end pipelines and destinations for the transfer and storage of all data. Manage 3rd-party resources and critical data integration vendors. Promote a culture that drives autonomy, responsibility, perfection, and mastery. Maintain and optimize software and cloud expenses to meet the financial goals of the company. Provide technical leadership to the team in the design and architecture of data products, and drive change across process, practices, and technology within the organization. Work with engineering managers and functional leads to set direction and ambitious goals for the Engineering department. Ensure data quality, security, and accessibility across the organization.

Skills You Will Need

10+ years of experience in data engineering. 5+ years of experience leading data teams of 30+ resources, including selection of talent and planning/allocating resources across multiple geographies and functions. 5+ years of experience with GCP tools and technologies, specifically Google BigQuery, Cloud Composer, Dataflow, Dataform, etc. (a short BigQuery illustration follows this posting). Experience creating large-scale data engineering pipelines, data-based decision-making and quantitative analysis tools and software. Hands-on experience with code version control systems (Git). Experience with CI/CD, data architectures, pipelines, quality, and code management. Experience with complex, high-volume, multi-dimensional data, based on unstructured, structured, and streaming datasets. Experience with SQL and NoSQL databases. Experience creating, testing, and supporting production software and systems. Proven track record of identifying and resolving performance bottlenecks in production systems. Experience designing and developing data lake, data warehouse, ETL, and task orchestration systems. Strong leadership, communication, time management, and interpersonal skills. Proven architectural skills in data engineering. Experience leading teams developing production-grade data pipelines on large datasets. Experience designing a large data lake and lakehouse, managing data flows that integrate information from various sources into a common pool, and implementing data pipelines based on the ETL model. Experience with common data languages (e.g.,
Python, Scala) and data warehouses (e.g., Redshift, BigQuery, Snowflake, Databricks). Extensive experience with cloud tools and technologies, GCP preferred. Experience managing real-time data pipelines. Successful track record and demonstrated thought leadership, cross-functional influence, and partnership within an agile/waterfall development environment. Experience in regulated industries or with compliance frameworks (e.g., SOC 2, ISO 27001). Nice to have: HR services industry experience. Experience in data science, including predictive modeling. Experience leading teams across multiple geographies.
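As a short illustration of the BigQuery work named in the skills list, here is a hedged sketch using the google-cloud-bigquery client library; the project, dataset, and table names are placeholders, not from the posting.

```python
# Sketch: run an aggregate query against a hypothetical BigQuery table.
from google.cloud import bigquery

client = bigquery.Client()  # uses application-default credentials

query = """
    SELECT event_date, COUNT(*) AS events
    FROM `my-project.analytics.events`   -- placeholder table
    GROUP BY event_date
    ORDER BY event_date DESC
    LIMIT 7
"""
for row in client.query(query).result():
    print(row.event_date, row.events)
```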
Posted 5 days ago
2.0 - 15.0 years
0 Lacs
Mysore, Karnataka, India
On-site
Job Responsibilities

Conduct classroom training / virtual training. Develop teaching materials, including exercises and assignments. Design assessments for various proficiency levels in each competency. Enhance course material and course delivery based on feedback to improve training effectiveness. Gather feedback from stakeholders, identify actions based on feedback, and implement changes. Program management and governance.

Location: Mysore, Bangalore

Description of the Profile

We are looking for trainers with 2 to 15 years of teaching or IT experience and technology know-how in one or more of the following areas:

Java – Java programming, Spring, Spring Boot, Angular / React, Bootstrap
Open source – Python, PHP, Unix / Linux, MySQL, Apache, HTML5, CSS3, JavaScript
Data Science – Python for data science, Machine learning, Exploratory data analysis, Statistics & Probability
Big Data – Python programming, Hadoop, Spark, Scala, MongoDB, NoSQL
Microsoft – C# programming, SQL Server, ADO.NET, ASP.NET, MVC design pattern, Azure, SharePoint, etc.
MEAN / MERN stacks
SAP – SAP ABAP programming / SAP MM / SAP SD / SAP BI / SAP S4 HANA
Oracle – Oracle E-Business Suite (EBS) / PeopleSoft / Siebel CRM / Oracle Cloud / OBIEE / Fusion Middleware
Cloud & Infrastructure Management – Network administration / Database administration / Windows administration / Linux administration / Middleware administration / End User Computing / ServiceNow, cloud platforms like AWS / GCP / Azure / Oracle Cloud, Virtualization
DBMS – Oracle / SQL Server / MySQL / DB2 / NoSQL
Testing – Selenium, Micro Focus UFT, Micro Focus ALM tools, SOA testing, SoapUI, REST Assured, Appium
API and integration – API, Microservices, TIBCO, Apigee, Mule
Digital Commerce – Salesforce, Adobe Experience Manager
Digital Process Automation – PEGA, Appian, Camunda, Unqork, UiPath

Training-related experience

Must have: Teaching experience conducting training sessions in the classroom and dynamically responding to the different capabilities of learners; experience in analyzing feedback from sessions and identifying action areas for self-improvement. Developing teaching material: experience in gathering training needs, identifying learning objectives, and designing training curricula; experience in developing teaching material, including exercises and assignments. Good presentation skills and excellent oral/written communication skills.

Nice to have: Teaching experience delivering sessions over virtual classrooms. Program-managing training: practical experience in addressing organizational training needs by leading a team of educators; setting goals, monitoring progress, evaluating performance, and communicating to stakeholders. Instructional design: developing engaging content. Designing assessments: experience in designing assessments to evaluate the effectiveness of training and gauge the proficiency of the learner. Participation in activities of the software development lifecycle such as development, testing, configuration management, and roll-out.

Educational Qualification & Experience

Must have: Bachelor’s / Master’s degree in Engineering, or Master’s degree in Science / Computer Applications, with a consistently good academic record. 2 to 15 years of relevant experience in training.

Nice to have: Technology certification from any major certifying authority such as Microsoft, Oracle, Google, Amazon, Scrum, etc. Certification in teaching or eLearning content development.
Posted 5 days ago
6.0 - 10.0 years
20 - 27 Lacs
Pune, Chennai
Work from Office
Mandatory: Experience and knowledge in designing, implementing, and managing non-relational data stores (e.g., MongoDB, Cassandra, DynamoDB), focusing on flexible schema design, scalability, and performance optimization for handling large volumes of unstructured or semi-structured data. The client primarily needs NoSQL database experience, either MongoDB or HBase (a short sketch follows this posting).

Data Pipeline Development: Design, develop, test, and deploy robust, high-performance, and scalable ETL/ELT data pipelines using Scala and Apache Spark to ingest, process, and transform large volumes of structured and unstructured data from diverse sources. Big Data Expertise: Leverage expertise in the Hadoop ecosystem (HDFS, Hive, etc.) and distributed computing principles to build efficient and fault-tolerant data solutions. Advanced SQL: Write complex, optimized SQL queries and stored procedures. Performance Optimization: Continuously monitor, analyze, and optimize the performance of data pipelines and data stores. Troubleshoot complex data-related issues, identify bottlenecks, and implement solutions for improved efficiency and reliability. Data Quality & Governance: Implement data quality checks, validation rules, and reconciliation processes to ensure the accuracy, completeness, and consistency of data. Contribute to data governance and security best practices. Automation & CI/CD: Implement automation for data pipeline deployment, monitoring, and alerting using tools like Apache Airflow, Jenkins, or similar CI/CD platforms. Documentation: Create and maintain comprehensive technical documentation for data architectures, pipelines, and processes.

Required Skills & Qualifications: Bachelor's or Master's degree in Computer Science, Engineering, or a related quantitative field. Minimum 5 years of professional experience in Data Engineering, with a strong focus on big data technologies. Proficiency in Scala for developing big data applications and transformations, especially with Apache Spark. Expert-level proficiency in SQL; ability to write complex queries, optimize performance, and understand database internals. Extensive hands-on experience with Apache Spark (Spark SQL, DataFrames, RDDs) for large-scale data processing and analytics. Mandatory: experience and knowledge in designing, implementing, and managing non-relational data stores, as described above. Solid understanding of distributed computing concepts and experience with the Hadoop ecosystem (HDFS, Hive). Experience with building and optimizing ETL/ELT processes and data warehousing concepts. Strong understanding of data modeling techniques (e.g., Star Schema, Snowflake Schema). Familiarity with version control systems (e.g., Git). Excellent problem-solving, analytical, and communication skills. Ability to work independently and collaboratively in an Agile team environment.
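The flexible-schema modelling the mandatory requirement describes can be sketched with MongoDB's Python driver. Everything below (connection URI, database, collection, fields, index) is a hypothetical illustration.

```python
# Sketch: documents in one MongoDB collection may carry different optional
# fields; a compound index keeps high-volume lookups fast as data grows.
from pymongo import MongoClient, ASCENDING

client = MongoClient("mongodb://localhost:27017")   # placeholder URI
events = client["analytics"]["events"]

events.insert_many([
    {"user_id": 1, "type": "click", "page": "/home"},
    {"user_id": 2, "type": "purchase", "amount": 49.9, "currency": "INR"},
])

events.create_index([("user_id", ASCENDING), ("type", ASCENDING)])

for doc in events.find({"type": "purchase"}):
    print(doc["user_id"], doc.get("amount"))
```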
Posted 5 days ago
4.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Role Overview

We are looking for a highly skilled Generative AI Engineer with 4 to 5 years of experience to design and deploy enterprise-grade GenAI systems. This role blends platform architecture, LLM integration, and operationalization, and is ideal for engineers with strong hands-on experience in large language models, RAG pipelines, and AI orchestration.

Responsibilities

Platform Leadership: Architect GenAI platforms powering copilots, document AI, multi-agent systems, and RAG pipelines. LLM Expertise: Build and fine-tune GPT, Claude, Gemini, LLaMA 2/3, and Mistral; deep expertise in RLHF, transformer internals, and multi-modal integration. RAG Systems: Develop scalable pipelines with embeddings, hybrid retrieval, prompt orchestration, and vector DBs (Pinecone, FAISS, pgvector); a minimal retrieval sketch follows this posting. Orchestration & Hosting: Lead LLM hosting, LangChain/LangGraph/AutoGen orchestration, and AWS SageMaker/Bedrock integration. Responsible AI: Implement guardrails for PII redaction, moderation, lineage, and access, aligned with enterprise security standards. LLMOps/MLOps: Deploy CI/CD pipelines, automate tuning and rollout, and handle drift, rollback, and incidents with KPI dashboards. Cost Optimization: Reduce TCO via dynamic routing, GPU autoscaling, context compression, and chargeback tooling. Agentic AI: Build autonomous, critic-supervised agents using MCP, A2A, LGPL patterns. Evaluation: Use LangSmith, BLEU, ROUGE, BERTScore, and HIL to track hallucination, toxicity, latency, and sustainability.

Skills Required

4-5 years in AI/ML (2+ in GenAI). Strong Python, PySpark, Scala; APIs via FastAPI, GraphQL, gRPC. Proficiency with MLflow, Kubeflow, Airflow, Prompt flow. Experience with LLMs, vector DBs, prompt engineering, MLOps. Solid foundation in applied mathematics and statistics.

Nice to Have

Open-source contributions, AI publications. Hands-on experience with cloud-native GenAI deployment. Deep interest in ethical AI and AI safety.

2 Days WFO Mandatory

Don't meet every job requirement? That's okay! Our company is dedicated to building a diverse, inclusive, and authentic workplace. If you're excited about this role but your experience doesn't perfectly fit every qualification, we encourage you to apply anyway. You may be just the right person for this role or others.
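The retrieval half of a RAG pipeline, as referenced above, can be sketched in a few lines with FAISS (one of the vector stores the posting names) and an open-source embedding model. The model name, corpus, and prompt template are illustrative assumptions, and the generation call to an LLM is deliberately left as a stub.

```python
# Sketch: embed a small corpus, index it in FAISS, retrieve top-k chunks
# for a query, and assemble a grounded prompt for an LLM.
import numpy as np
import faiss
from sentence_transformers import SentenceTransformer

corpus = [
    "Databricks supports Delta Lake for lakehouse storage.",
    "Kafka provides durable, partitioned event streams.",
    "Airflow schedules and monitors data pipelines.",
]

model = SentenceTransformer("all-MiniLM-L6-v2")       # assumed model choice
emb = np.asarray(model.encode(corpus), dtype="float32")

index = faiss.IndexFlatL2(emb.shape[1])               # exact L2 search
index.add(emb)

query = "How do I schedule pipelines?"
q = np.asarray(model.encode([query]), dtype="float32")
_, ids = index.search(q, 2)                           # top-2 chunks

context = "\n".join(corpus[i] for i in ids[0])
prompt = f"Answer using only this context:\n{context}\n\nQ: {query}"
# The prompt would then go to GPT/Claude/etc.; the call is omitted here.
print(prompt)
```

Production systems typically swap the exact index for an approximate one (or a managed store such as Pinecone) and add hybrid keyword retrieval, but the shape of the pipeline stays the same.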
Posted 5 days ago
6.0 - 11.0 years
12 - 17 Lacs
Pune
Work from Office
Roles and Responsibility

The Senior Tech Lead - Databricks leads the design, development, and implementation of advanced data solutions. The role requires extensive experience in Databricks, cloud platforms, and data engineering, with a proven ability to lead teams and deliver complex projects.

Responsibilities: Lead the design and implementation of Databricks-based data solutions. Architect and optimize data pipelines for batch and streaming data. Provide technical leadership and mentorship to a team of data engineers. Collaborate with stakeholders to define project requirements and deliverables. Ensure best practices in data security, governance, and compliance. Troubleshoot and resolve complex technical issues in Databricks environments. Stay updated on the latest Databricks features and industry trends.

Key Technical Skills & Responsibilities: Experience in data engineering using Databricks or Apache Spark-based platforms. Proven track record of building and optimizing ETL/ELT pipelines for batch and streaming data ingestion. Hands-on experience with Azure services such as Azure Data Factory, Azure Data Lake Storage, Azure Databricks, Azure Synapse Analytics, or Azure SQL Data Warehouse. Proficiency in programming languages such as Python, Scala, and SQL for data processing and transformation. Expertise in Spark (PySpark, Spark SQL, or Scala) and Databricks notebooks for large-scale data processing. Familiarity with Delta Lake, Delta Live Tables, and medallion architecture for data lakehouse implementations (an upsert sketch follows this posting). Experience with orchestration tools like Azure Data Factory or Databricks Jobs for scheduling and automation. Design and implement Azure Key Vault integration and scoped credentials. Knowledge of Git for source control and CI/CD integration for Databricks workflows, plus cost optimization and performance tuning. Familiarity with Unity Catalog, RBAC, or enterprise-level Databricks setups. Ability to create reusable components, templates, and documentation to standardize data engineering workflows is a plus. Ability to define best practices, support multiple projects, and mentor junior engineers is a plus. Must have experience working with streaming data sources; Kafka preferred.

Eligibility Criteria: Bachelor's degree in Computer Science, Data Engineering, or a related field. Extensive experience with Databricks, Delta Lake, PySpark, and SQL. Databricks certification (e.g., Certified Data Engineer Professional). Experience with machine learning and AI integration in Databricks. Strong understanding of cloud platforms (AWS, Azure, or GCP). Proven leadership experience in managing technical teams. Excellent problem-solving and communication skills.

Our Offering: Global, cutting-edge IT projects that shape the future of digital and have a positive impact on the environment. Wellbeing programs and work-life balance, with integration and passion-sharing events. Attractive salary and company initiative benefits. Courses and conferences. Hybrid work culture.
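A common building block in the Databricks pipelines described above is an incremental upsert into a Delta table. The sketch below assumes a Databricks/Delta runtime; the table path, staging source, and customer_id key are hypothetical.

```python
# Sketch: Delta Lake MERGE (upsert) from a staged batch into a silver table.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("delta-merge-sketch").getOrCreate()

updates = spark.read.parquet("/staging/customers/")      # placeholder batch
target = DeltaTable.forPath(spark, "/lake/silver/customers")

(target.alias("t")
 .merge(updates.alias("u"), "t.customer_id = u.customer_id")
 .whenMatchedUpdateAll()       # update rows that changed
 .whenNotMatchedInsertAll()    # insert rows seen for the first time
 .execute())
```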
Posted 5 days ago
3.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: Databricks Unified Data Analytics Platform, Microsoft Azure Databricks, Microsoft Azure Data Services
Good-to-have skills: NA
A minimum of 3 years of experience is required.
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will engage in the design, construction, and configuration of applications tailored to fulfill specific business processes and application requirements. Your typical day will involve collaborating with cross-functional teams to gather requirements, developing application features, and ensuring that the applications are optimized for performance and usability. You will also participate in testing and debugging processes to deliver high-quality solutions that meet the needs of the organization and its stakeholders.

Roles & Responsibilities: Expected to perform independently and become an SME. Active participation and contribution in team discussions is required. Contribute to providing solutions for work-related problems. Assist in the documentation of application processes and workflows. Engage in continuous learning to stay updated with the latest technologies and best practices.

Professional & Technical Skills: Must-have skills: proficiency in Databricks Unified Data Analytics Platform, Microsoft Azure Databricks, and Microsoft Azure Data Services. Strong understanding of data integration techniques and ETL processes. Experience with cloud-based data storage solutions and data management. Familiarity with programming languages such as Python or Scala. Ability to work with data visualization tools to present insights effectively.

Additional Information: The candidate should have a minimum of 3 years of experience with the Databricks Unified Data Analytics Platform. This position is based at our Hyderabad office. 15 years of full-time education is required.
Posted 5 days ago