3.0 - 6.0 years
5 - 10 Lacs
Jaipur
Work from Office
Role & responsibilities:
- Design, develop, and maintain robust ETL/ELT pipelines to ingest and process data from multiple sources (a minimal pipeline sketch follows this listing).
- Build and maintain scalable and reliable data warehouses, data lakes, and data marts.
- Collaborate with data scientists, analysts, and business stakeholders to understand data needs and deliver solutions.
- Ensure data quality, integrity, and security across all data systems.
- Optimize data pipeline performance and troubleshoot issues in a timely manner.
- Implement data governance and best practices in data management.
- Automate data validation, monitoring, and reporting processes.

Preferred candidate profile:
- Bachelor's or Master's degree in Computer Science, Engineering, Information Systems, or a related field.
- Proven experience (X+ years) as a Data Engineer or in a similar role.
- Strong programming skills in Python, Java, or Scala.
- Proficiency with SQL and working knowledge of relational databases (e.g., PostgreSQL, MySQL).
- Hands-on experience with big data technologies (e.g., Spark, Hadoop).
- Familiarity with cloud platforms such as AWS, GCP, or Azure (e.g., S3, Redshift, BigQuery, Data Factory).
- Experience with orchestration tools like Airflow or Prefect.
- Knowledge of data modeling, warehousing, and architecture design principles.
- Strong problem-solving skills and attention to detail.

Perks and benefits: Free Meals, PF and Gratuity, Medical and Term Insurance.
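For illustration only, since the listing names no specific systems, here is a minimal Python sketch of an ETL step with the kind of data-validation gate described above; the orders.csv source, column names, and SQLite target are all hypothetical:

```python
import sqlite3
import pandas as pd

def extract(path: str) -> pd.DataFrame:
    # Hypothetical CSV source; a real pipeline would pull from APIs or databases.
    return pd.read_csv(path, parse_dates=["order_date"])

def transform(df: pd.DataFrame) -> pd.DataFrame:
    df = df.dropna(subset=["order_id", "amount"])   # basic cleansing
    df["amount"] = df["amount"].astype(float)
    return df

def validate(df: pd.DataFrame) -> None:
    # Fail fast rather than load bad data downstream.
    assert df["order_id"].is_unique, "duplicate order_id"
    assert (df["amount"] >= 0).all(), "negative amount"

def load(df: pd.DataFrame, conn: sqlite3.Connection) -> None:
    df.to_sql("orders_clean", conn, if_exists="replace", index=False)

if __name__ == "__main__":
    frame = transform(extract("orders.csv"))
    validate(frame)
    load(frame, sqlite3.connect("warehouse.db"))
```

In practice each step would run as a task in an orchestrator such as Airflow or Prefect so that failures are retried and monitored.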
Posted 1 week ago
3.0 - 5.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
About The Advanced Analytics Team
The central Advanced Analytics (AA) team at the Abbott Established Pharma Division's (EPD) headquarters in Basel helps define and lead the transformation towards becoming a global, data-driven company with the help of data and advanced technologies (e.g., Machine Learning, Deep Learning, Generative AI, Computer Vision). To us, Advanced Analytics is an important lever to reach our business targets, now and in the future. It helps differentiate ourselves from our competition and ensure sustainable revenue growth at optimal margins. Hence the central AA team is an integral part of the Strategy Management Office at EPD, with a very close link and regular interactions with the EPD Senior Leadership Team.

Primary Job Function
With the above requirements in mind, EPD is looking to fill the role of a Cloud Engineer reporting to the Head of AA Product Development. The Cloud Engineer will be responsible for developing applications leveraging AWS services. This role involves leading cloud initiatives, ensuring robust cloud infrastructure, and driving innovation in cloud technologies to support the business's advanced analytics needs.

Core Job Responsibilities
- Support the development and maintenance of company-wide frameworks and libraries that enable faster, better, and more informed decision-making within the business, creating significant business value from data & analytics.
- Ensure data availability and accessibility for the prioritized Advanced Analytics scope, and maintain stable, scalable, and modular data science pipelines from data exploration to deployment.
- Acquire, ingest, and process data from multiple sources and systems into our cloud platform (AWS), ensuring data integrity and security (see the sketch below).
- Collaborate with data scientists to map data fields to hypotheses, and curate, wrangle, and prepare data for advanced analytical models.
- Implement and manage robust security measures to ensure compliant handling and management of data, including access strategies aligned with Information Security, Cyber Security, and Data Privacy principles.
- Develop and deploy smart automation tools based on cloud technologies, aligned with business priorities and needs.
- Oversee the timely delivery of Advanced Analytics solutions in coordination with the rest of the team, per requirements and timelines, ensuring alignment with business goals.
- Collaborate closely with the Data Science team and AI Engineers to understand platform needs and lead the development of solutions that support their work.
- Troubleshoot and resolve issues related to the AWS platform, ensuring minimal downtime and optimal performance.
- Define and document best practices and strategies regarding application deployment and infrastructure maintenance.
- Drive continuous improvement of the AWS Cloud platform by contributing and implementing new ideas and processes.

Supervisory/Management Responsibilities
Direct Reports: None. Indirect Reports: None.

Position Accountability/Scope
The Cloud Engineer is accountable for delivering targeted business impact per initiative in collaboration with key stakeholders. This role involves significant responsibility for the architecture and management of Abbott's strategic cloud platforms and AI/AA programs, enabling faster, better, and more informed decision-making within the business.
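As a purely illustrative companion to the ingestion responsibility above, a minimal AWS Lambda handler in Python that promotes an uploaded S3 object after a basic integrity check; the bucket names and JSON layout are hypothetical, not Abbott's actual setup:

```python
import json
import boto3

s3 = boto3.client("s3")  # created once, reused across warm invocations

def handler(event, context):
    # Standard S3 put-event payload: locate the uploaded object.
    record = event["Records"][0]["s3"]
    bucket = record["bucket"]["name"]
    key = record["object"]["key"]

    body = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
    rows = json.loads(body)

    # Minimal integrity gate before promoting the file.
    if not isinstance(rows, list):
        raise ValueError(f"{key}: expected a JSON array of records")

    s3.put_object(
        Bucket="curated-zone-example",        # hypothetical target bucket
        Key=key,
        Body=json.dumps(rows).encode("utf-8"),
        ServerSideEncryption="AES256",        # encrypt at rest
    )
    return {"ingested": len(rows), "source": f"s3://{bucket}/{key}"}
```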
Minimum Education
Master's degree in a relevant field (e.g., computer science, electrical engineering).

Minimum Experience/Training Required
- At least 3-5 years of relevant experience, with a strong track record in building solutions/applications using AWS services.
- Proven ability to work across structured, semi-structured, and unstructured data, extracting information and identifying linkages across disparate data sets.
- Proficiency in multiple programming languages: JavaScript, Python, Scala, PySpark, or Java.
- Extensive knowledge and experience with various database technologies, including distributed processing frameworks, relational databases, MPP databases, and NoSQL data stores.
- Deep understanding of Information Security principles to ensure compliant handling and management of data.
- Significant experience with cloud platforms, preferably AWS and its ecosystem.
- Advanced knowledge of development in CI/CD (Continuous Integration and Continuous Delivery) environments.
- Strong background in data warehousing / ETL tools.
- Proficiency in DevOps practices and tools such as Jenkins, Terraform, etc.
- Proficiency in serverless architecture and services like AWS Lambda.
- Understanding of security best practices and their implementation in cloud environments.
- Ability to understand business objectives and create cloud-based solutions to meet those objectives.
- Result-driven, analytical, and creative thinker.
- Proven ability to work with cross-functional teams and bridge the gap between business and data science.
- Fluency in English is a must; additional languages are a plus.

Additional Technical Skills
- Experience with front-end frameworks, preferably React JS.
- Knowledge of back-end frameworks like Django, Flask, or Node.js.
- Familiarity with database technologies such as Redshift, MySQL, or DynamoDB.
- Understanding of RESTful API design and development.
- Experience with version control systems like CodeCommit.
Posted 1 week ago
8.0 - 15.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Position Summary...
What you'll do...

Role: Staff, Data Scientist
Experience: 8-15 years
Location: Chennai

About EBS team: Enterprise Business Services is invested in building a compact, robust organization that includes service operations and technology solutions for Finance, People, and Associate Digital Experience. Our team is responsible for the design and development of solutions that know our consumers' needs better than ever by predicting what they want based on unconstrained demand, and efficiently unlock strategic growth, economic profit, and wallet share by orchestrating intelligent, connected planning and decisioning across all functions. We interact with multiple teams across the company to provide scalable, robust technical solutions. This role will play a crucial part in overseeing the planning, execution, and delivery of complex projects within the team.

Walmart's Enterprise Business Services (EBS) is a powerhouse of several exceptional teams delivering world-class technology solutions and services, making a profound impact at every level of Walmart. As a key part of Walmart Global Tech, our teams set the bar for operational excellence and leverage emerging technology to support millions of customers, associates, and stakeholders worldwide. Each time an associate turns on their laptop, a customer makes a purchase, a new supplier is onboarded, the company closes the books, physical and legal risk is avoided, and when we pay our associates consistently and accurately, that is EBS. Joining EBS means embarking on a journey of limitless growth, relentless innovation, and the chance to set new industry standards that shape the future of Walmart.

About Team
The data science team at the Enterprise Business Services pillar at Walmart Global Tech focuses on using the latest research in machine learning, statistics, and optimization to solve business problems. We mine data, distill insights, extract information, build analytical models, deploy machine learning algorithms, and use the latest algorithms and technology to empower business decision-making. In addition, we work with engineers to build reference architectures and machine learning pipelines in a big data ecosystem to productize our solutions. Advanced analytical algorithms driven by our team help Walmart optimize business operations and business practices, and change the way our customers shop. The data science community at Walmart Global Tech is active in most of the hack events, utilizing the petabytes of data at our disposal to build some of the coolest ideas. All the work we do at Walmart Labs will eventually benefit our operations and our associates, helping Customers Save Money to Live Better.

What You Will Do
As a Staff Data Scientist for Walmart Global Tech, you'll have the opportunity to:
- Drive data-derived insights across a wide range of retail and Finance divisions by developing advanced statistical models, machine learning algorithms, and computational algorithms based on business initiatives.
- Direct the gathering of data, assess data validity, and synthesize data into large analytics datasets to support project goals.
- Utilize big data analytics and advanced data science techniques to identify trends, patterns, and discrepancies in data.
- Determine additional data needed to support insights.
- Build and train AI/ML models for replication in future projects.
- Deploy and maintain the data science solutions.
- Communicate recommendations to business partners and influence future plans based on insights.
- Consult with business stakeholders regarding algorithm-based recommendations and be a thought leader in developing these into business actions.
- Closely partner with the Senior Manager and Director of Data Science to drive data science adoption in the domain.
- Guide data scientists, senior data scientists, and staff data scientists across multiple sub-domains to ensure on-time delivery of ML products.
- Drive efficiency across the domain in terms of DS and ML best practices, MLOps practices, resource utilization, reusability, and multi-tenancy.
- Lead multiple complex ML products and guide senior tech leads in the domain in efficiently leading their products.
- Drive synergies across different products in terms of algorithmic innovation and sharing of best practices.
- Proactively identify complex business problems that can be solved using advanced ML, finding opportunities and gaps in the current business domain.
- Evaluate proposed business cases for projects and initiatives.

What You Will Bring
- Master's with >10 years OR Ph.D. with >8 years of relevant experience. Educational qualifications should be in Computer Science/Statistics/Mathematics or a related area.
- Minimum 6 years of experience as a data science technical lead.
- Ability to lead multiple data science projects end to end.
- Deep experience in building data science solutions in areas like fraud prevention, forecasting, shrink and waste reduction, inventory management, recommendation, assortment, and price optimization.
- Deep experience in simultaneously leading multiple data science initiatives end to end: from translating business needs into analytical asks, to leading the process of building solutions, to the eventual deployment and maintenance of them.
- Strong experience in machine learning: classification models, regression models, NLP, forecasting, unsupervised models, optimization, graph ML, causal inference, causal ML, statistical learning, experimentation, and Gen-AI.
- In Gen-AI, it is desirable to have experience in embedding generation from training materials, storage and retrieval from vector databases, set-up and provisioning of managed LLM gateways, development of retrieval-augmented generation (RAG) based LLM agents, model selection, iterative prompt engineering and fine-tuning based on accuracy and user feedback, and monitoring and governance (a minimal retrieval sketch follows this list).
- Ability to scale and deploy data science solutions.
- Strong experience with Python and/or R.
- Experience in GCP/Azure.
- Strong experience in Python, PySpark, Google Cloud Platform, Vertex AI, Kubeflow, and model deployment.
- Strong experience with big data platforms: Hadoop (Hive, MapReduce, HQL, Scala).
- Experience with GPU/CUDA for computational efficiency.
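To make the retrieval step of the RAG workflow above concrete, a minimal sketch in Python, assuming the faiss library and a hypothetical embed() model wrapper (the posting does not prescribe any particular stack):

```python
import numpy as np
import faiss  # vector similarity search library

# embed() is a hypothetical wrapper around any embedding model; it must
# return one fixed-size float32 vector per input text. Random vectors
# stand in for a real model here.
def embed(texts: list[str]) -> np.ndarray:
    rng = np.random.default_rng(0)
    return rng.random((len(texts), 384), dtype=np.float32)

docs = ["Return policy: 30 days.", "Shipping takes 3-5 business days."]
index = faiss.IndexFlatIP(384)          # inner-product index
vectors = embed(docs)
faiss.normalize_L2(vectors)             # cosine similarity via normalized IP
index.add(vectors)

def retrieve(query: str, k: int = 1) -> list[str]:
    q = embed([query])
    faiss.normalize_L2(q)
    _, ids = index.search(q, k)         # top-k nearest documents
    return [docs[i] for i in ids[0]]

# The retrieved context is then spliced into an LLM prompt:
context = "\n".join(retrieve("How long is shipping?"))
prompt = f"Answer using only this context:\n{context}\n\nQ: How long is shipping?"
```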
About Walmart Global Tech
Imagine working in an environment where one line of code can make life easier for hundreds of millions of people. That's what we do at Walmart Global Tech. We're a team of software engineers, data scientists, cybersecurity experts, and service professionals within the world's leading retailer who make an epic impact and are at the forefront of the next retail disruption. People are why we innovate, and people power our innovations. We are people-led and tech-empowered. We train our team in the skillsets of the future and bring in experts like you to help us grow. We have roles for those chasing their first opportunity as well as those looking for the opportunity that will define their career. Here, you can kickstart a great career in tech, gain new skills and experience for virtually every industry, or leverage your expertise to innovate at scale, impact millions, and reimagine the future of retail.

Flexible, hybrid work
We use a hybrid way of working, with primary in-office presence coupled with an optimal mix of virtual presence. We use our campuses to collaborate and be together in person, as business needs require and for development and networking opportunities. This approach helps us make quicker decisions, remove location barriers across our global team, and be more flexible in our personal lives.

Benefits
Beyond our great compensation package, you can receive incentive awards for your performance. Other great perks include a host of best-in-class benefits: maternity and parental leave, PTO, health benefits, and much more.

Belonging
We aim to create a culture where every associate feels valued for who they are, rooted in respect for the individual. Our goal is to foster a sense of belonging, to create opportunities for all our associates, customers, and suppliers, and to be a Walmart for everyone. At Walmart, our vision is "everyone included." By fostering a workplace culture where everyone is—and feels—included, everyone wins. Our associates and customers reflect the makeup of all 19 countries where we operate. By making Walmart a welcoming place where all people feel like they belong, we're able to engage associates, strengthen our business, improve our ability to serve customers, and support the communities where we operate.

Equal Opportunity Employer
Walmart, Inc., is an Equal Opportunity Employer – By Choice. We believe we are best equipped to help our associates, customers, and the communities we serve live better when we really know them. That means understanding, respecting, and valuing unique styles, experiences, identities, ideas, and opinions, while being inclusive of all people.

Minimum Qualifications...
Outlined below are the required minimum qualifications for this position. If none are listed, there are no minimum qualifications.
- Option 1: Bachelor's degree in Statistics, Economics, Analytics, Mathematics, Computer Science, Information Technology, or related field and 4 years' experience in an analytics-related field.
- Option 2: Master's degree in Statistics, Economics, Analytics, Mathematics, Computer Science, Information Technology, or related field and 2 years' experience in an analytics-related field.
- Option 3: 6 years' experience in an analytics or related field.

Preferred Qualifications...
Outlined below are the optional preferred qualifications for this position. If none are listed, there are no preferred qualifications.

Primary Location...
RMZ Millenia Business Park, No 143, Campus 1B (1st-6th Floor), Dr. MGR Road (North Veeranam Salai), Perungudi, India. R-2182242
Posted 1 week ago
12.0 - 15.0 years
0 Lacs
Greater Kolkata Area
On-site
About NCR VOYIX
NCR VOYIX Corporation (NYSE: VYX) is a leading global provider of digital commerce solutions for the retail, restaurant and banking industries. NCR VOYIX is headquartered in Atlanta, Georgia, with approximately 16,000 employees in 35 countries across the globe. For nearly 140 years, we have been the global leader in consumer transaction technologies, turning everyday consumer interactions into meaningful moments. Today, NCR VOYIX transforms the stores, restaurants and digital banking experiences with cloud-based, platform-led SaaS and services capabilities. Not only are we the leader in the market segments we serve and the technology we deliver, but we create exceptional consumer experiences in partnership with the world's leading retailers, restaurants and financial institutions. We leverage our expertise, R&D capabilities and unique platform to help navigate, simplify and run our customers' technology systems. Our customers are at the center of everything we do. Our mission is to enable stores, restaurants and financial institutions to exceed their goals – from customer satisfaction to revenue growth, to operational excellence, to reduced costs and profit growth. Our solutions empower our customers to succeed in today's competitive landscape. Our unique perspective brings innovative, industry-leading tech to all the moving parts of business across industries. NCR VOYIX has earned the trust of businesses large and small — from the best-known brands around the world to your local favorite around the corner.

Title: Senior Software Engineering Manager – Data Engineering & Full Stack
Experience: 12-15 years
Location: Hyderabad/Gurgaon/Virtual

YOU ARE…
Passionate about technology and see the world a little differently than your peers. Everywhere you look, there's possibility. Opportunity. Boundaries to push and challenges to solve. You believe software engineering changes how people live. At NCR Voyix, we believe that, too. We're one of the world's first tech companies, and still going strong. Like us, you know the online and mobile worlds better than any other, and see patterns that no one else sees. Our leadership team drives the delivery of products that provide optimal performance and stability with unsurpassed longevity, with over 25 years in the Restaurants, Retail, Payments & Services industry. We are looking for talented people to join our expanding NCR Voyix Data and Analytics platform team. Our product, a cloud-based SaaS solution, provides the foundation for the NCR Voyix cloud-based Data and Analytics platform. Our primary customers are merchants you see and visit every day in the Retail, Grocery, and Hospitality industries. We experience the impact our work is having, and we take pride in providing services with great availability and ease of use.

IN THIS ROLE, YOU CAN EXPECT TO…
The NCR Voyix Software Engineer will be responsible for front-end and back-end solution design, software development, code quality, data security, production readiness and performance tuning. The ideal candidate is an experienced software engineer who enjoys optimizing data systems and building them from the ground up. The Software Engineer will support database architects, data analysts and data scientists on data initiatives and will ensure optimal data delivery architecture is consistent throughout ongoing projects. They must be self-directed and comfortable supporting the data needs of multiple teams, systems and products.
The right candidate will be excited by the prospect of optimizing or even re-designing our company's data architecture to support our next generation of products and data initiatives. The NCR Voyix Software Engineer contributes in the following:

KEY AREAS OF RESPONSIBILITY:
- Lead a team of talented developers and leads working on full stack frameworks and data engineering.
- Work with stakeholders throughout the organization to identify opportunities for leveraging company data to drive business solutions.
- Mine and analyze data from different NCR data sources to drive optimization of operations and improve customer experience.
- Assess the effectiveness and accuracy of new data sources and data gathering techniques.
- Develop custom data models and algorithms to apply to data sets.
- Use predictive modeling to increase and optimize customer experiences, cost savings, actionable insights and other business outcomes.
- Develop the company A/B testing framework and test model quality (a minimal significance-test sketch follows this list).
- Collaborate with different functional teams to implement models and monitor outcomes.
- Develop processes and tools to monitor and analyze model performance and data accuracy.
- Be part of an Agile team, participate in all Agile ceremonies and activities, and be accountable for the sprint deliverable.
- Create and maintain optimal data delivery architecture.
- Assemble large, complex data sets that meet functional / non-functional business requirements.
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using Azure and GCP 'big data' technologies.
- Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics.
- Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data delivery needs.
- Keep our data separated and secure across national boundaries through multiple data centers and cloud regions.
- Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.
- Work with data and analytics experts to strive for greater functionality in our data systems.
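As a minimal illustration of the A/B-testing responsibility above (a sketch, not NCR's actual framework), a two-proportion z-test on hypothetical conversion counts in Python:

```python
from math import sqrt
from scipy.stats import norm

def two_proportion_ztest(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """Return the two-sided p-value for H0: rate_A == rate_B."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    return 2 * norm.sf(abs(z))      # two-sided tail probability

# Hypothetical experiment: does variant B lift checkout conversion?
p = two_proportion_ztest(conv_a=480, n_a=10_000, conv_b=560, n_b=10_000)
print(f"p-value = {p:.4f}")         # about 0.011: significant at the 5% level
```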
YOU HAVE…
- 15+ years of experience in software testing or software engineering.
- 10+ years in non-functional automation & performance testing.
- 10+ years in public-cloud-based engineering.
- React.js understanding: experience with React components, hooks, and state management.
- JavaScript/TypeScript knowledge.
- Node.js: expertise in server-side development using Node.js.
- RESTful APIs & GraphQL: ability to design and consume APIs.
- Agile methodologies: experience in Agile, Scrum, or Kanban environments.
- UI/UX principles: basic understanding for effective collaboration with designers.
- Experience building and optimizing 'big data' data pipelines, architectures and data sets.
- Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
- Strong analytic skills related to working with structured and unstructured datasets.
- Experience building processes supporting data transformation, data structures, metadata, dependency and workload management.
- A successful history of manipulating, processing and extracting value from large disconnected datasets.
- Working knowledge of message queuing, stream processing, and highly scalable 'big data' data stores.
- Strong project management and organizational skills.
- Experience supporting and working with cross-functional teams in a dynamic environment.
- Experience with ETL and big data integration services: Confluent Kafka, BigQuery, Databricks, Data Factory, etc.
- Experience with relational SQL and NoSQL databases, including Databricks, BigQuery, Azure Data Warehouse, etc.
- Experience with stream-processing systems: ksqlDB, Flink SQL, dbt, Databricks, Spark Streaming, etc.
- Experience with object-oriented, functional and scripting languages: Python, Java, C#, Scala, etc.
- Experience with DevOps tools (CI & DevOps): GitHub, GitHub Actions, Jenkins, JIRA, Chef, Sonar.
- Experience with unit testing, integration testing, performance testing and user acceptance testing.

BASIC QUALIFICATIONS:
- Strong inferential skills with an ability to succinctly communicate complex topics to business stakeholders.
- Experience with UI and full stack frameworks like ReactJS, NodeJS, TypeScript, Material UI, SASS, etc.
- Experience using cloud platforms like Azure or GCP.
- Experience working with complex on-premise and cloud data architectures.

GENERAL KNOWLEDGE, SKILLS AND ABILITIES:
- Exhibits leadership skills.
- Azure or GCP public cloud technologies.
- In-depth knowledge of end-to-end systems development life cycles (including agile, iterative, and other modern approaches to software development).
- Outstanding verbal and written communication skills to technical and non-technical audiences of various levels in the organization (e.g., executive, management, individual contributors).
- Ability to estimate work effort for project sub-plans or small projects and ensure projects are successfully completed.
- Quality assurance mindset.
- Positive outlook, strong work ethic, and responsiveness to internal and external customers and contacts.
- Willingly and successfully fulfils the role of teacher, mentor and coach.
- In-depth knowledge of networking, computing platforms, storage, databases, security, middleware, network and systems management, and related infrastructure technologies and practices.

Offers of employment are conditional upon passage of screening criteria applicable to the job.

EEO Statement
Integrated into our shared values is NCR Voyix's commitment to diversity and equal employment opportunity. All qualified applicants will receive consideration for employment without regard to sex, age, race, color, creed, religion, national origin, disability, sexual orientation, gender identity, veteran status, military service, genetic information, or any other characteristic or conduct protected by law. NCR Voyix is committed to being a globally inclusive company where all people are treated fairly, recognized for their individuality, promoted based on performance and encouraged to strive to reach their full potential. We believe in understanding and respecting differences among all people. Every individual at NCR Voyix has an ongoing responsibility to respect and support a globally diverse environment.

Statement to Third Party Agencies
To ALL recruitment agencies: NCR Voyix only accepts resumes from agencies on the preferred supplier list. Please do not forward resumes to our applicant tracking system, NCR Voyix employees, or any NCR Voyix facility.
NCR Voyix is not responsible for any fees or charges associated with unsolicited resumes.

"When applying for a job, please make sure to only open emails that you will receive during your application process that come from a @ncrvoyix.com email domain."
Posted 1 week ago
7.0 years
0 Lacs
Greater Kolkata Area
On-site
Overview
Working at Atlassian
Atlassians can choose where they work – whether in an office, from home, or a combination of the two. That way, Atlassians have more control over supporting their family, personal goals, and other priorities. We can hire people in any country where we have a legal entity. Interviews and onboarding are conducted virtually, a part of being a distributed-first company.

Responsibilities
Atlassian is looking for a Senior Data Engineer to join our Data Engineering team, which is responsible for building our data lake, maintaining our big data pipelines and services, and facilitating the movement of billions of messages each day. We work directly with business stakeholders and plenty of platform and engineering teams to enable growth and retention strategies at Atlassian. We are looking for an open-minded, structured thinker who is passionate about building services that scale. On a typical day you will help our stakeholder teams ingest data faster into our data lake, you'll find ways to make our data pipelines more efficient, or even come up with ideas to help instigate self-serve data engineering within the company. You'll get the opportunity to work on an AWS-based data lake backed by the full suite of open source projects such as Spark and Airflow. We are a team with little legacy in our tech stack, and as a result you'll spend less time paying off technical debt and more time identifying ways to make our platform better and improve our users' experience.

Qualifications
As a Senior Data Engineer in the DE team, you will have the opportunity to apply your strong technical experience building highly reliable services to managing and orchestrating a multi-petabyte-scale data lake. You enjoy working in a fast-paced environment and you are able to take vague requirements and transform them into solid solutions. You are motivated by solving challenging problems, where creativity is as crucial as your ability to write code and test cases.

On Your First Day, We'll Expect You To Have
- A BS in Computer Science or equivalent experience
- At least 7+ years of professional experience as a Sr. Software Engineer or Sr. Data Engineer
- Strong programming skills (Python, Java or Scala preferred)
- Experience writing SQL, structuring data, and data storage practices
- Experience with data modeling
- Knowledge of data warehousing concepts
- Experience building data pipelines and platforms
- Experience with Databricks, Spark, Hive, Airflow and other streaming technologies to process incredible volumes of streaming data (a minimal DAG sketch follows this section)
- Experience in modern software development practices (Agile, TDD, CI/CD)
- Strong focus on data quality and experience with internal/external tools and frameworks to automatically detect data issues and anomalies
- A willingness to accept failure, learn and try again
- An open mind to try solutions that may seem crazy at first
- Experience working on Amazon Web Services (in particular using EMR, Kinesis, RDS, S3, SQS and the like)

It's Preferred That You Have
- Experience building self-service tooling and platforms
- Built and designed Kappa architecture platforms
- Contributed to open source projects (e.g., operators in Airflow)
- Experience with dbt (Data Build Tool)
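For orientation only, a minimal Airflow DAG sketch in Python of the ingest-then-transform pattern described above, assuming Airflow 2.x; the DAG and task names are illustrative, not Atlassian's:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

# Task bodies are stubs; a real pipeline would kick off Spark jobs,
# quality checks, and so on.
def ingest():
    print("pull new events into the data lake")

def transform():
    print("run Spark job to build derived tables")

with DAG(
    dag_id="example_lake_ingestion",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    ingest_task = PythonOperator(task_id="ingest", python_callable=ingest)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    ingest_task >> transform_task   # transform runs only after ingest succeeds
```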
Our Perks & Benefits
Atlassian offers a wide range of perks and benefits designed to support you, your family and to help you engage with your local community. Our offerings include health and wellbeing resources, paid volunteer days, and so much more. To learn more, visit go.atlassian.com/perksandbenefits.

About Atlassian
At Atlassian, we're motivated by a common goal: to unleash the potential of every team. Our software products help teams all over the planet and our solutions are designed for all types of work. Team collaboration through our tools makes what may be impossible alone, possible together. We believe that the unique contributions of all Atlassians create our success. To ensure that our products and culture continue to incorporate everyone's perspectives and experience, we never discriminate based on race, religion, national origin, gender identity or expression, sexual orientation, age, or marital, veteran, or disability status. All your information will be kept confidential according to EEO guidelines. To provide you the best experience, we can support with accommodations or adjustments at any stage of the recruitment process. Simply inform our Recruitment team during your conversation with them. To learn more about our culture and hiring process, visit go.atlassian.com/crh.
Posted 1 week ago
3.0 - 15.0 years
0 Lacs
Bangalore Urban, Karnataka, India
On-site
Technology & Transformation: EAD: Azure Data Engineer - Consultant/Senior Consultant/Manager

Your potential, unleashed.
India's impact on the global economy has increased at an exponential rate, and Deloitte presents an opportunity to unleash and realize your potential amongst cutting-edge leaders and organizations shaping the future of the region, and indeed, the world beyond. At Deloitte, you can bring your whole self to work, every day. Combine that with our drive to propel with purpose and you have the perfect playground to collaborate, innovate, grow, and make an impact that matters.

The Team
Deloitte's Technology & Transformation practice can help you uncover and unlock the value buried deep inside vast amounts of data. Our global network provides strategic guidance and implementation services to help companies manage data from disparate sources and convert it into accurate, actionable information that can support fact-driven decision-making and generate an insight-driven advantage. Our practice addresses the continuum of opportunities in business intelligence & visualization, data management, performance management, and next-generation analytics and technologies, including big data, cloud, cognitive and machine learning.

Your work profile
As a Consultant/Senior Consultant/Manager in our T&T team, you'll build and nurture positive working relationships with teams and clients with the intention to exceed client expectations:
- Design, develop and deploy solutions using different tools, design principles and conventions.
- Configure robotics processes and objects using core workflow principles in an efficient way; ensure they are easily maintainable and easy to understand.
- Understand existing processes and facilitate change requirements as part of a structured change control process.
- Solve day-to-day issues arising while running robotics processes and provide timely resolutions.
- Maintain proper documentation for the solutions, test procedures and scenarios during the UAT and production phases.
- Coordinate with process owners and the business to understand the as-is process and design the automation process flow.

Desired Qualifications
- 3-15 years of hands-on experience implementing Azure cloud data warehouses, Azure and NoSQL databases, and hybrid data scenarios.
- Experience developing Azure Data Factory (covering Azure Functions, Logic Apps, Triggers, IR), Databricks (PySpark, Scala), Stream Analytics, Event Hub & HDInsight components.
- Experience working on data lake & DW solutions on Azure.
- Experience managing Azure DevOps pipelines (CI/CD).
- Experience managing source data access security, using Vault, configuring authentication and authorization, and enforcing data policies and standards.
- UG: B.Tech/B.E. in any specialization.

Location and way of working
Base location: Pan India. This profile involves occasional travelling to client locations. Hybrid is our default way of working; each domain has customized the hybrid approach to their unique needs.

Your role as a Consultant/Senior Consultant/Manager
We expect our people to embrace and live our purpose by challenging themselves to identify issues that are most important for our clients, our people, and for society. In addition to living our purpose, Consultants/Senior Consultants/Managers across our organization must strive to be:
- Inspiring - Leading with integrity to build inclusion and motivation.
- Committed to creating purpose - Creating a sense of vision and purpose.
- Agile - Achieving high-quality results through collaboration and team unity.
- Skilled at building diverse capability - Developing diverse capabilities for the future.
- Persuasive / Influencing - Persuading and influencing stakeholders.
- Collaborating - Partnering to build new solutions.
- Delivering value - Showing commercial acumen.
- Committed to expanding business - Leveraging new business opportunities.
- Analytical Acumen - Leveraging data to recommend impactful approaches and solutions through the power of analysis and visualization.
- Effective communication - Able to hold well-structured and well-articulated conversations to achieve win-win possibilities.
- Engagement Management / Delivery Excellence - Effectively managing engagement(s) to ensure timely and proactive execution as well as course correction for the success of engagement(s).
- Managing change - Responding to a changing environment with resilience.
- Managing Quality & Risk - Delivering high-quality results and mitigating risks with utmost integrity and precision.
- Strategic Thinking & Problem Solving - Applying a strategic mindset to solve business issues and complex problems.
- Tech Savvy - Leveraging ethical technology practices to deliver high impact for clients and for Deloitte.
- Empathetic leadership and inclusivity - Creating a safe and thriving environment where everyone is valued for who they are; using empathy to understand others and adapt our behaviours and attitudes to become more inclusive.

How you'll grow
Connect for impact: Our exceptional team of professionals across the globe are solving some of the world's most complex business problems, as well as directly supporting our communities, the planet, and each other. Know more in our Global Impact Report and our India Impact Report.
Empower to lead: You can be a leader irrespective of your career level. Our colleagues are characterised by their ability to inspire, support, and provide opportunities for people to deliver their best and grow both as professionals and human beings. Know more about Deloitte and our One Young World partnership.
Inclusion for all: At Deloitte, people are valued and respected for who they are and are trusted to add value to their clients, teams and communities in a way that reflects their own unique capabilities. Know more about everyday steps that you can take to be more inclusive. At Deloitte, we believe in the unique skills, attitude and potential each and every one of us brings to the table to make an impact that matters.
Drive your career: At Deloitte, you are encouraged to take ownership of your career. We recognise there is no one-size-fits-all career path, and global, cross-business mobility and up/re-skilling are all within the range of possibilities to shape a unique and fulfilling career. Know more about Life at Deloitte.

Everyone's welcome… entrust your happiness to us
Our workspaces and initiatives are geared towards your 360-degree happiness. This includes specific needs you may have in terms of accessibility, flexibility, safety and security, and caregiving. Here's a glimpse of things that are in store for you.

Interview tips
We want job seekers exploring opportunities at Deloitte to feel prepared, confident and comfortable. To help you with your interview, we suggest that you do your research, know some background about the organisation and the business area you're applying to. Check out recruiting tips from Deloitte professionals.
Posted 1 week ago
4.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
At eBay, we're more than a global ecommerce leader — we're changing the way the world shops and sells. Our platform empowers millions of buyers and sellers in more than 190 markets around the world. We're committed to pushing boundaries and leaving our mark as we reinvent the future of ecommerce for enthusiasts. Our customers are our compass, authenticity thrives, bold ideas are welcome, and everyone can bring their unique selves to work — every day. We're in this together, sustaining the future of our customers, our company, and our planet. Join a team of passionate thinkers, innovators, and dreamers — and help us connect people and build communities to create economic opportunity for all.

Do you love Big Data? Deploying machine learning models? Challenging optimization problems? Knowledgeable, collaborative co-workers? Come work at eBay and help us redefine global, online commerce!

Who Are We?
The Product Knowledge team is at the epicenter of eBay's tech-driven, customer-centric overhaul. Our team is entrusted with creating and using eBay's Product Knowledge: a vast Big Data system built up of listings, transactions, products, knowledge graphs, and more. Our team has a mix of highly proficient people from multiple fields such as Machine Learning, Data Science, Software Engineering, Operations, and Big Data Analytics. We have a strong culture of collaboration, and plenty of opportunity to learn, make an impact, and grow!

What Will You Do
We are looking for exceptional engineers who take pride in creating simple solutions to apparently-complex problems. Our engineering tasks typically involve at least one of the following:
- Building a pipeline that processes up to billions of items, frequently employing ML models on these datasets (a minimal scoring sketch follows the responsibilities below)
- Creating services that provide Search or other Information Retrieval capabilities at low latency on datasets of hundreds of millions of items
- Crafting sound API design and driving integration between our data layers and customer-facing applications and components
- Designing and running A/B tests in production experiences in order to vet and measure the impact of any new or improved functionality

If you love a good challenge, and are good at handling complexity - we'd love to hear from you!

eBay is an amazing company to work for. Being on the team, you can expect to benefit from:
- A competitive salary, including stock grants and a yearly bonus
- A healthy work culture that promotes business impact and at the same time highly values your personal well-being
- Being part of a force for good in this world - eBay truly cares about its employees, its customers, and the world's population, and takes every opportunity to make this clearly apparent

Job Responsibilities
- Design, deliver, and maintain significant features in data pipelines, ML processing, and/or service infrastructure
- Optimize software performance to achieve the required throughput and/or latency
- Work with your manager, peers, and Product Managers to scope projects and features
- Come up with a sound technical strategy, taking into consideration the project goals, timelines, and expected impact
- Take point on some cross-team efforts, taking ownership of a business problem and ensuring the different teams are in sync and working towards a coherent technical solution
- Take an active part in knowledge sharing across the organization - both teaching and learning from others
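As a rough illustration of batch ML scoring over a large item dataset (a sketch only; the table schema, paths, and the toy "model" are hypothetical, not eBay's), a PySpark pandas UDF applied to a listings table:

```python
import pandas as pd
from pyspark.sql import SparkSession
from pyspark.sql.functions import pandas_udf

spark = SparkSession.builder.appName("bulk-scoring-sketch").getOrCreate()

# Hypothetical listings table; partitioned Parquet scales to billions of rows.
items = spark.read.parquet("s3://example-bucket/listings/")

@pandas_udf("double")
def score(title_len: pd.Series) -> pd.Series:
    # Stand-in "model": a real pipeline would load a trained model once per
    # executor and run batched inference over each Arrow chunk here.
    return (title_len / 80.0).clip(0.0, 1.0)

scored = items.withColumn("quality_score", score(items["title_length"]))
scored.write.mode("overwrite").parquet("s3://example-bucket/listings_scored/")
```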
Minimum Qualifications
- Passion and commitment for technical excellence
- B.Sc. or M.Sc. in Computer Science, or equivalent professional experience
- 4+ years of software design and development experience, tackling non-trivial problems in backend services and/or data pipelines
- A solid foundation in data structures, algorithms, object-oriented programming, software design, and core statistics
- Experience in production-grade coding in Java and Python/Scala
- Experience in the close examination of data and computation of statistics
- Experience in designing and operating Big Data processing pipelines, such as Hadoop and Spark
- Excellent verbal and written communication and collaboration skills

Please see the Talent Privacy Notice for information regarding how eBay handles your personal data collected when you use the eBay Careers website or apply for a job with eBay.

eBay is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, national origin, sex, sexual orientation, gender identity, veteran status, and disability, or other legally protected status. If you have a need that requires accommodation, please contact us at talent@ebay.com. We will make every effort to respond to your request for accommodation as soon as possible. View our accessibility statement to learn more about eBay's commitment to ensuring digital accessibility for people with disabilities.
Posted 1 week ago
2.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
At eBay, we're more than a global ecommerce leader — we're changing the way the world shops and sells. Our platform empowers millions of buyers and sellers in more than 190 markets around the world. We're committed to pushing boundaries and leaving our mark as we reinvent the future of ecommerce for enthusiasts. Our customers are our compass, authenticity thrives, bold ideas are welcome, and everyone can bring their unique selves to work — every day. We're in this together, sustaining the future of our customers, our company, and our planet. Join a team of passionate thinkers, innovators, and dreamers — and help us connect people and build communities to create economic opportunity for all.

Machine Learning Engineer (T24), Product Knowledge

Do you love Big Data? Deploying machine learning models? Challenging optimization problems? Knowledgeable, collaborative co-workers? Come work at eBay and help us redefine global, online commerce!

Who Are We?
The Product Knowledge team is at the epicenter of eBay's tech-driven, customer-centric overhaul. Our team is entrusted with creating and using eBay's Product Knowledge: a vast Big Data system built up of listings, transactions, products, knowledge graphs, and more. Our team has a mix of highly proficient people from multiple fields such as Machine Learning, Data Science, Software Engineering, Operations, and Big Data Analytics. We have a strong culture of collaboration, and plenty of opportunity to learn, make an impact, and grow!

What Will You Do
We are looking for exceptional engineers who take pride in creating simple solutions to apparently-complex problems. Our engineering tasks typically involve at least one of the following:
- Building a pipeline that processes up to billions of items, frequently employing ML models on these datasets
- Creating services that provide Search or other Information Retrieval capabilities at low latency on datasets of hundreds of millions of items
- Crafting sound API design and driving integration between our data layers and customer-facing applications and components
- Designing and running A/B tests in production experiences in order to vet and measure the impact of any new or improved functionality

If you love a good challenge, and are good at handling complexity - we'd love to hear from you!

eBay is an amazing company to work for. Being on the team, you can expect to benefit from:
- A competitive salary, including stock grants and a yearly bonus
- A healthy work culture that promotes business impact and at the same time highly values your personal well-being
- Being part of a force for good in this world - eBay truly cares about its employees, its customers, and the world's population, and takes every opportunity to make this clearly apparent

Job Responsibilities
- Design, deliver, and maintain significant features in data pipelines, ML processing, and/or service infrastructure
- Optimize software performance to achieve the required throughput and/or latency
- Work with your manager, peers, and Product Managers to scope projects and features
- Come up with a sound technical strategy, taking into consideration the project goals, timelines, and expected impact
- Take point on some cross-team efforts, taking ownership of a business problem and ensuring the different teams are in sync and working towards a coherent technical solution
- Take an active part in knowledge sharing across the organization - both teaching and learning from others
Minimum Qualifications
- Passion and commitment for technical excellence
- B.Sc. or M.Sc. in Computer Science, or equivalent professional experience
- 2+ years of software design and development experience, tackling non-trivial problems in backend services and/or data pipelines
- A solid foundation in data structures, algorithms, object-oriented programming, software design, and core statistics
- Experience in production-grade coding in Java and Python/Scala
- Experience in the close examination of data and computation of statistics
- Experience in using and operating Big Data processing pipelines, such as Hadoop and Spark
- Good verbal and written communication and collaboration skills

Please see the Talent Privacy Notice for information regarding how eBay handles your personal data collected when you use the eBay Careers website or apply for a job with eBay.

eBay is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, national origin, sex, sexual orientation, gender identity, veteran status, and disability, or other legally protected status. If you have a need that requires accommodation, please contact us at talent@ebay.com. We will make every effort to respond to your request for accommodation as soon as possible. View our accessibility statement to learn more about eBay's commitment to ensuring digital accessibility for people with disabilities.
Posted 1 week ago
3.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Overview
Working at Atlassian
Atlassians can choose where they work – whether in an office, from home, or a combination of the two. That way, Atlassians have more control over supporting their family, personal goals, and other priorities. We can hire people in any country where we have a legal entity. Interviews and onboarding are conducted virtually, a part of being a distributed-first company.

Responsibilities
As a data engineer, you will have the opportunity to apply your strong technical experience building highly reliable data products. You enjoy working in an agile environment. You are able to translate raw requirements into solid solutions. You are motivated by solving challenging problems, where creativity is as crucial as your ability to write code and test cases. On a typical day you will help our partner teams ingest data faster into our data lake, you'll find ways to make our data products more efficient, or come up with ideas to help build self-serve data engineering within the company. Then you will move on to building micro-services and architecting, designing, and promoting self-serve capabilities at scale to help Atlassian grow.

Qualifications
On your first day, we'll expect you to have:
- At least 3+ years of professional experience as a software engineer or data engineer
- A BS in Computer Science or equivalent experience
- Strong programming skills (some combination of Python, Java, and Scala)
- Experience writing SQL, structuring data, and data storage practices
- Experience with data modeling
- Knowledge of data warehousing concepts
- Experience building data pipelines and microservices
- Experience with Spark, Airflow and other streaming technologies to process incredible volumes of streaming data
- A willingness to accept failure, learn and try again
- An open mind to try solutions that may seem impossible at first
- Experience working on Amazon Web Services (in particular using EMR, Kinesis, RDS, S3, SQS and the like) and Databricks

It's preferred, but not technically required, that you have:
- Experience building self-service tooling and platforms
- Built and designed Kappa architecture platforms
- A passion for building and running continuous integration pipelines
- Built pipelines using Databricks and are well versed with their APIs
- Contributed to open source projects (e.g., operators in Airflow)

Our Perks & Benefits
Atlassian offers a variety of perks and benefits to support you, your family and to help you engage with your local community. Our offerings include health coverage, paid volunteer days, wellness resources, and so much more. Visit go.atlassian.com/perksandbenefits to learn more.

About Atlassian
At Atlassian, we're motivated by a common goal: to unleash the potential of every team. Our software products help teams all over the planet and our solutions are designed for all types of work. Team collaboration through our tools makes what may be impossible alone, possible together. We believe that the unique contributions of all Atlassians create our success. To ensure that our products and culture continue to incorporate everyone's perspectives and experience, we never discriminate based on race, religion, national origin, gender identity or expression, sexual orientation, age, or marital, veteran, or disability status. All your information will be kept confidential according to EEO guidelines. To provide you the best experience, we can support with accommodations or adjustments at any stage of the recruitment process.
Simply inform our Recruitment team during your conversation with them. To learn more about our culture and hiring process, visit go.atlassian.com/crh.
Posted 1 week ago
7.0 - 10.0 years
17 - 27 Lacs
Gurugram
Hybrid
Primary Responsibilities:
- Design and develop applications and services running on Azure, with a strong emphasis on Azure Databricks, ensuring optimal performance, scalability, and security.
- Build and maintain data pipelines using Azure Databricks and other Azure data integration tools (a minimal pipeline sketch follows this listing).
- Write, read, and debug Spark, Scala, and Python code to process and analyze large datasets.
- Write extensive queries in SQL and Snowflake.
- Implement security and access control measures, and regularly audit the Azure platform and infrastructure to ensure compliance.
- Create, understand, and validate designs and estimated effort for a given module/task, and be able to justify them.
- Possess solid troubleshooting skills and perform troubleshooting of issues in different technologies and environments.
- Implement and adhere to best engineering practices like design, unit testing, functional testing automation, continuous integration, and delivery.
- Maintain code quality by writing clean, maintainable, and testable code.
- Monitor performance and optimize resources to ensure cost-effectiveness and high availability.
- Define and document best practices and strategies regarding application deployment and infrastructure maintenance.
- Provide technical support and consultation for infrastructure questions.
- Help develop, manage, and monitor continuous integration and delivery systems.
- Take accountability and ownership of features and teamwork.
- Comply with the terms and conditions of the employment contract, company policies and procedures, and any directives.

Required Qualifications:
- B.Tech/MCA (minimum 16 years of formal education).
- Overall 7+ years of experience.
- Minimum of 3 years of experience in Azure (ADF), Databricks, and DevOps.
- 5 years of experience in writing advanced-level SQL.
- 2-3 years of experience in writing, reading, and debugging Spark, Scala, and Python code.
- 3 or more years of experience in architecting, designing, developing, and implementing cloud solutions on Azure.
- Proficiency in programming languages and scripting tools.
- Understanding of cloud data storage and database technologies such as SQL and NoSQL.
- Proven ability to collaborate with multidisciplinary teams of business analysts, developers, data scientists, and subject-matter experts.
- Familiarity with DevOps practices and tools, such as continuous integration and continuous deployment (CI/CD) and Terraform.
- Proven proactive approach to spotting problems, areas for improvement, and performance bottlenecks.
- Proven excellent communication, writing, and presentation skills.
- Experience in interacting with international customers to gather requirements and convert them into solutions using relevant skills.

Preferred Qualifications:
- Knowledge of AI/ML or LLMs (GenAI).
- Knowledge of the US Healthcare domain and experience with healthcare data.
- Experience and skills with Snowflake.
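For illustration only, a minimal PySpark sketch of the Databricks-to-Snowflake pattern named above: aggregate a Delta table and write the result through the Spark Snowflake connector. The table path, column names, and all connector options are hypothetical, and credentials would come from a secrets store, never hard-coded:

```python
from pyspark.sql import SparkSession, functions as F

# On Databricks a SparkSession already exists as `spark`; this line just
# makes the sketch self-contained elsewhere.
spark = SparkSession.builder.appName("adb-pipeline-sketch").getOrCreate()

claims = spark.read.format("delta").load("/mnt/raw/claims")   # hypothetical path

daily = (
    claims
    .where(F.col("status") == "PAID")
    .groupBy(F.to_date("paid_ts").alias("paid_date"))
    .agg(F.sum("amount").alias("total_paid"), F.count("*").alias("n_claims"))
)

# Illustrative connector options; the "snowflake" short name is available on
# Databricks, elsewhere the full connector class name is needed.
sf_options = {
    "sfURL": "example.snowflakecomputing.com",
    "sfDatabase": "ANALYTICS",
    "sfSchema": "PUBLIC",
    "sfWarehouse": "WH_XS",
    "sfUser": "etl_user",
    "sfPassword": "***",  # placeholder: read from a secret scope in practice
}
(daily.write.format("snowflake")
      .options(**sf_options)
      .option("dbtable", "DAILY_PAID_CLAIMS")
      .mode("overwrite")
      .save())
```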
Posted 1 week ago
6.0 years
0 Lacs
Pune, Maharashtra, India
Remote
About the Team
Come help us build the world's most reliable on-demand logistics engine for delivery! We're bringing on experienced engineers to help us further our 24x7, global infrastructure system that powers DoorDash's three-sided marketplace of consumers, merchants, and Dashers.

About the Role
The Data Tools mission is to build robust data platforms and establish policies that guarantee the analytics data is of high quality, easily accessible/cataloged, and compliant with financial and privacy regulations, fostering trust and confidence in our data-driven decision-making process. We are building the Data Tools team in India, and you will have an opportunity to be part of a founding team, with a greater opportunity for impact where you can help grow the team and shape the roadmap for the data platform at DoorDash. You will report directly to the Data Tools Engineering Manager.

You're excited about this opportunity because you will…
- Work on building a data discovery platform, privacy frameworks, unified access control frameworks, and a data quality platform to enable data builders at DoorDash to deliver high-quality and trustable data sets and metrics (a minimal quality-check sketch follows this list)
- Help accelerate the adoption of the data discovery platform by building integrations across online and analytics platforms and promoting self-serve
- Come up with solutions for scaling data systems for various business needs
- Collaborate in a dynamic startup environment

We're excited about you because…
- B.E./B.Tech., M.E./M.Tech, or Ph.D. in Computer Science or equivalent
- 6+ years of experience with CS fundamental concepts and experience with at least one of the programming languages Scala, Java, or Python
- Prior technical experience in Big Data infrastructure & governance - you've built meaningful pieces of data infrastructure. Bonus if those were open-sourced technologies like DataHub, Spark, Airflow, Kafka, Flink
- Experience improving the efficiency, scalability, and stability of data platforms
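As a toy illustration of what a data-quality platform automates (a sketch only; the rule names, table, and columns are hypothetical, not DoorDash's), a tiny check runner in Python:

```python
from dataclasses import dataclass
from typing import Callable
import pandas as pd

# Each check evaluates one named rule against a table and returns pass/fail.
@dataclass
class Check:
    name: str
    fn: Callable[[pd.DataFrame], bool]

def run_checks(df: pd.DataFrame, checks: list[Check]) -> dict[str, bool]:
    return {c.name: bool(c.fn(df)) for c in checks}

deliveries = pd.DataFrame(
    {"delivery_id": [1, 2, 3], "eta_minutes": [32, 41, None]}
)
results = run_checks(
    deliveries,
    [
        Check("primary_key_unique", lambda d: d["delivery_id"].is_unique),
        Check("eta_not_null", lambda d: d["eta_minutes"].notna().all()),
        Check("eta_positive", lambda d: (d["eta_minutes"].dropna() > 0).all()),
    ],
)
print(results)  # {'primary_key_unique': True, 'eta_not_null': False, 'eta_positive': True}
```

A production platform would run such rules on a schedule, persist results, and alert owners when a dataset regresses.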
That's why we hire and cultivate diverse teams of people from all backgrounds, experiences, and perspectives. We believe that true innovation happens when everyone has room at the table and the tools, resources, and opportunity to excel. If you need any accommodations, please inform your recruiting contact upon initial connection.

We use Covey as part of our hiring and/or promotional process for jobs in certain locations. The Covey tool has been reviewed by an independent auditor. Results of the audit may be viewed here: https://getcovey.com/nyc-local-law-144
To request a reasonable accommodation under applicable law or alternate selection process, please inform your recruiting contact upon initial connection.
Posted 1 week ago
0 years
0 Lacs
Kolkata, West Bengal, India
On-site
Roles and Responsibilities:
Data Pipeline Development: Design, develop, and maintain scalable data pipelines to support ETL (Extract, Transform, Load) processes using tools like Apache Airflow, AWS Glue, or similar.
Database Management: Design, optimize, and manage relational and NoSQL databases (such as MySQL, PostgreSQL, MongoDB, or Cassandra) to ensure high performance and scalability.
SQL Development: Write advanced SQL queries, stored procedures, and functions to extract, transform, and analyze large datasets efficiently.
Cloud Integration: Implement and manage data solutions on cloud platforms such as AWS, Azure, or Google Cloud, utilizing services like Redshift, BigQuery, or Snowflake.
Data Warehousing: Contribute to the design and maintenance of data warehouses and data lakes to support analytics and BI requirements.
Programming and Automation: Develop scripts and applications in Python or other programming languages to automate data processing tasks.
Data Governance: Implement data quality checks, monitoring, and governance policies to ensure data accuracy, consistency, and security.
Collaboration: Work closely with data scientists, analysts, and business stakeholders to understand data needs and translate them into technical solutions.
Performance Optimization: Identify and resolve performance bottlenecks in data systems and optimize data storage and retrieval.
Documentation: Maintain comprehensive documentation for data processes, pipelines, and infrastructure.
Stay Current: Keep up to date with the latest trends and advancements in data engineering, big data technologies, and cloud services.

Required Skills and Qualifications:
Education: Bachelor's or Master's degree in Computer Science, Information Technology, Data Engineering, or a related field.
Technical Skills:
Proficiency in SQL and relational databases (PostgreSQL, MySQL, etc.).
Experience with NoSQL databases (MongoDB, Cassandra, etc.).
Strong programming skills in Python; familiarity with Java or Scala is a plus.
Experience with data pipeline tools (Apache Airflow, Luigi, or similar).
Expertise in cloud platforms (AWS, Azure, or Google Cloud) and data services (Redshift, BigQuery, Snowflake).
Knowledge of big data tools like Apache Spark, Hadoop, or Kafka is a plus.
Data Modeling: Experience in designing and maintaining data models for relational and non-relational databases.
Analytical Skills: Strong analytical and problem-solving abilities with a focus on performance optimization and scalability.
Soft Skills: Excellent verbal and written communication skills to convey technical concepts to non-technical stakeholders; ability to work collaboratively in cross-functional teams.
Certifications (Preferred): AWS Certified Data Analytics, Google Professional Data Engineer, or similar.
Mindset: Eagerness to learn new technologies and adapt quickly in a fast-paced environment.
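For candidates preparing for pipeline-heavy roles like the one above, here is a minimal sketch of the kind of ETL job such listings describe, written in Scala with Apache Spark. The input file, column names, and output path are hypothetical, chosen only for illustration:

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object OrdersEtl {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("orders-etl")
      .master("local[*]") // local mode for illustration; point at a cluster in production
      .getOrCreate()

    // Extract: read a raw CSV export (hypothetical path and columns)
    val raw = spark.read
      .option("header", "true")
      .option("inferSchema", "true")
      .csv("data/raw/orders.csv")

    // Transform: drop bad rows, derive a date column, aggregate daily revenue
    val daily = raw
      .filter(col("amount").isNotNull && col("amount") > 0)
      .withColumn("order_date", to_date(col("created_at")))
      .groupBy("order_date")
      .agg(sum("amount").as("revenue"), count("*").as("orders"))

    // Load: write partitioned Parquet for downstream BI and analytics
    daily.write.mode("overwrite").partitionBy("order_date").parquet("data/curated/daily_revenue")

    spark.stop()
  }
}
```

In a scheduled setting, an orchestrator such as Airflow would typically trigger a job like this and handle retries and alerting around it.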
Posted 1 week ago
6.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
About the job
A little about us... LTIMindtree is a global technology consulting and digital solutions company that enables enterprises across industries to reimagine business models, accelerate innovation, and maximize growth by harnessing digital technologies. As a digital transformation partner to more than 750 clients, LTIMindtree brings extensive domain and technology expertise to help drive superior competitive differentiation, customer experiences, and business outcomes in a converging world. Powered by nearly 90,000 talented and entrepreneurial professionals across 35 countries, LTIMindtree, a Larsen & Toubro Group company, combines the industry-acclaimed strengths of erstwhile Larsen and Toubro Infotech and Mindtree in solving the most complex business challenges and delivering transformation at scale. For more info, please visit www.ltimindtree.com

Job Details
We are holding a weekend drive for the requirement of a Data Scientist at our Bangalore office.
Date - 14th June
Experience - 4 to 12 Yrs.
Location – LTIMindtree Office, Bangalore Whitefield
Notice Period - Immediate to 60 Days only
Mandatory Skills - Gen-AI, Data Science, Python, RAG and Cloud (AWS/Azure)
Secondary Skills (any) - Machine Learning, Deep Learning, ChatGPT, LangChain, prompt engineering, vector stores, RAG, LLaMA, computer vision, OCR, transformers, regression, forecasting, classification, hyperparameter tuning, MLOps, inference, model training, model deployment

Generic JD:
More than 6 years of experience in the Data Engineering, Data Science and AI/ML domain
Excellent understanding of machine learning techniques and algorithms, such as GPTs, CNN, RNN, k-NN, Naive Bayes, SVM, Decision Forests, etc.
Experience using business intelligence tools (e.g. Tableau, Power BI) and data frameworks (e.g. Hadoop)
Experience with cloud-native technologies
Knowledge of SQL and Python; familiarity with Scala, Java or C++ is an asset
Analytical mind, business acumen and strong math skills (e.g. statistics, algebra)
Experience with common data science toolkits, such as TensorFlow, Keras, PyTorch, pandas, Microsoft CNTK, NumPy etc. Deep expertise in at least one of these is highly desirable.
Experience with NLP, NLG and Large Language Models like BERT, LLaMA, LaMDA, GPT, BLOOM, PaLM, DALL-E, etc.
Great communication and presentation skills; should have experience working in a fast-paced team culture
Experience with AI/ML and Big Data technologies like AWS SageMaker, Azure Cognitive Services, Google Colab, Jupyter Notebook, Hadoop, PySpark, Hive, AWS EMR etc.
Experience with NoSQL databases, such as MongoDB, Cassandra, HBase, and vector databases
Good understanding of applied statistics, such as distributions, statistical testing, regression, etc.
Should be a data-oriented person with an analytical mind and business acumen.
Posted 1 week ago
5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
As a Software Developer you will work in a constantly evolving environment, due to technological advances and the strategic direction of the organization you work for. You will create, maintain, audit, and improve systems to meet particular needs, often as advised by a systems analyst or architect, testing both hardware and software systems to diagnose and resolve system faults. The role also covers writing diagnostic programs and designing and writing code for operating systems and software to ensure efficiency. When required, you will make recommendations for future developments.

Benefits of Joining Us
Challenging Projects: Work on cutting-edge projects and solve complex technical problems.
Career Growth: Advance your career quickly and take on leadership roles.
Mentorship: Learn from experienced mentors and industry experts.
Global Opportunities: Work with clients from around the world and gain international experience.
Competitive Compensation: Receive attractive compensation packages and benefits.
If you're passionate about technology and want to work on challenging projects with a talented team, becoming an Infosys Power Programmer could be a great career choice.

Mandatory Skills
AWS Glue, AWS Redshift/Spectrum, S3, API Gateway, Athena, Step and Lambda functions
Experience with Extract, Transform, Load (ETL) and Extract, Load & Transform (ELT) data integration patterns
Experience in designing and building data pipelines
Development experience in one or more object-oriented programming languages, preferably Python

Job Specs
5+ years of in-depth, hands-on experience developing, testing, deploying and debugging Spark jobs using Scala on the Hadoop platform
In-depth knowledge of Spark Core, working with RDDs, and Spark SQL
In-depth knowledge of Spark optimization techniques and best practices
Good knowledge of Scala functional programming: Try, Option, Future, Collections
Good knowledge of Scala OOP: classes, traits and objects (singleton and companion), case classes
Good understanding of Scala language features: type system, implicits (Scala 2) / givens (Scala 3)
Hands-on experience working in a Hadoop environment (HDFS/Hive), AWS S3, EMR
Python programming skills
Working experience with workflow orchestration tools like Airflow and Oozie
Working with API calls in Scala
Understanding of and exposure to file formats such as Apache Avro, Parquet, and JSON
Good to have: knowledge of Protocol Buffers and geospatial data analytics
Writing test cases using frameworks such as ScalaTest
In-depth knowledge of build tools such as Gradle and sbt
Experience using Git: resolving conflicts, working with branches
Good to have: experience with workflow systems such as Airflow
Strong programming skills using data structures and algorithms
Excellent analytical skills
Good communication skills

Qualification
7-10 years in the industry
BE/B.Tech in CS or equivalent
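As a quick illustration of the Scala language features this listing calls out (Option, Try, Future, case classes, and companion objects), here is a minimal, self-contained sketch; the Trade type and its fields are invented for the example:

```scala
import scala.concurrent.{Await, Future}
import scala.concurrent.ExecutionContext.Implicits.global
import scala.concurrent.duration._
import scala.util.{Failure, Success, Try}

// Case class with a companion object acting as a safe factory
final case class Trade(id: String, quantity: Int)

object Trade {
  // Try captures the parse failure instead of throwing
  def parse(raw: String): Try[Trade] = Try {
    val Array(id, qty) = raw.split(",")
    Trade(id.trim, qty.trim.toInt)
  }
}

object FpDemo extends App {
  // Option models presence/absence without nulls
  val maybeTrade: Option[Trade] = Trade.parse("T-1, 100").toOption
  println(maybeTrade.map(_.quantity).getOrElse(0)) // 100

  // Future models an async computation, composed with map
  val enriched: Future[Int] = Future(42).map(_ * 2)
  println(Await.result(enriched, 2.seconds)) // 84

  // Pattern matching over Try handles success and failure explicitly
  Trade.parse("bad-input") match {
    case Success(t) => println(s"parsed $t")
    case Failure(e) => println(s"rejected: ${e.getMessage}")
  }
}
```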
Posted 1 week ago
4.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Role Summary
Pfizer's purpose is to deliver breakthroughs that change patients' lives. Research and Development is at the heart of fulfilling Pfizer's purpose as we work to translate advanced science and technologies into the therapies and vaccines that matter most. Whether you are in the discovery sciences, ensuring drug safety and efficacy or supporting clinical trials, you will apply cutting-edge design and process development capabilities to accelerate and bring best-in-class medicines to patients around the world. Pfizer is seeking a highly skilled and motivated AI Engineer to join our advanced technology team. The successful candidate will be responsible for developing, implementing, and optimizing artificial intelligence models and algorithms to drive innovation and efficiency in our Data Analytics and Supply Chain solutions. This role demands a collaborative mindset, a passion for cutting-edge technology, and a commitment to improving patient outcomes.

Role Responsibilities
Lead data modeling and engineering efforts within advanced data platforms teams to achieve digital outcomes; provide guidance and lead or co-lead moderately complex projects.
Oversee the development and execution of test plans, creation of test scripts, and thorough data validation processes.
Lead the architecture, design, and implementation of Cloud Data Lake, Data Warehouse, Data Marts, and Data APIs.
Lead the development of complex data products that benefit PGS and ensure reusability across the enterprise.
Collaborate effectively with contractors to deliver technical enhancements.
Oversee the development of automated systems for building, testing, monitoring, and deploying ETL data pipelines within a continuous integration environment.
Collaborate with backend engineering teams to analyze data, enhancing its quality and consistency.
Conduct root cause analysis and address production data issues.
Lead the design, development, and implementation of AI models and algorithms to solve sophisticated data analytics and supply chain problems.
Stay abreast of the latest advancements in AI and machine learning technologies and apply them to Pfizer's projects.
Provide technical expertise and guidance to team members and stakeholders on AI-related initiatives.
Document and present findings, methodologies, and project outcomes to various stakeholders.
Integrate and collaborate with different technical teams across Digital to drive overall implementation and delivery.
Work with large and complex datasets, including data cleaning, preprocessing, and feature selection.

Basic Qualifications
A bachelor's or master's degree in Computer Science, Artificial Intelligence, Machine Learning, or a related discipline.
Over 4 years of experience as a Data Engineer, Data Architect, or in Data Warehousing, Data Modeling, and Data Transformations.
Over 2 years of experience in AI, machine learning, and large language model (LLM) development and deployment.
Proven track record of successfully implementing AI solutions in a healthcare or pharmaceutical setting is preferred.
Strong understanding of data structures, algorithms, and software design principles.
Programming Languages: Proficiency in Python, SQL, and familiarity with Java or Scala.
AI and Automation: Knowledge of AI-driven tools for data pipeline automation, such as Apache Airflow or Prefect.
Ability to use GenAI or agents to augment data engineering practices.

Preferred Qualifications
Data Warehousing: Experience with data warehousing solutions such as Amazon Redshift, Google BigQuery, or Snowflake.
ETL Tools: Knowledge of ETL tools like Apache NiFi, Talend, or Informatica.
Big Data Technologies: Familiarity with Hadoop, Spark, and Kafka for big data processing.
Cloud Platforms: Hands-on experience with cloud platforms such as AWS, Azure, or Google Cloud Platform (GCP).
Containerization: Understanding of Docker and Kubernetes for containerization and orchestration.
Data Integration: Skills in integrating data from various sources, including APIs, databases, and external files.
Data Modeling: Understanding of data modeling and database design principles, including graph technologies like Neo4j or Amazon Neptune.
Structured Data: Proficiency in handling structured data from relational databases, data warehouses, and spreadsheets.
Unstructured Data: Experience with unstructured data sources such as text, images, and log files, and tools like Apache Solr or Elasticsearch.
Data Excellence: Familiarity with data excellence concepts, including data governance, data quality management, and data stewardship.

Non-standard Work Schedule, Travel or Environment Requirements
Occasional travel required.
Work Location Assignment: Hybrid

The annual base salary for this position ranges from $96,300.00 to $160,500.00. In addition, this position is eligible for participation in Pfizer's Global Performance Plan with a bonus target of 12.5% of the base salary and eligibility to participate in our share-based long-term incentive program. We offer comprehensive and generous benefits and programs to help our colleagues lead healthy lives and to support each of life's moments. Benefits offered include a 401(k) plan with Pfizer Matching Contributions and an additional Pfizer Retirement Savings Contribution, paid vacation, holiday and personal days, paid caregiver/parental and medical leave, and health benefits to include medical, prescription drug, dental and vision coverage. Learn more at Pfizer Candidate Site – U.S. Benefits (uscandidates.mypfizerbenefits.com). Pfizer compensation structures and benefit packages are aligned based on the location of hire. The United States salary range provided does not apply to Tampa, FL or any location outside of the United States. Relocation assistance may be available based on business needs and/or eligibility.

Sunshine Act
Pfizer reports payments and other transfers of value to health care providers as required by federal and state transparency laws and implementing regulations. These laws and regulations require Pfizer to provide government agencies with information such as a health care provider's name, address and the type of payments or other value received, generally for public disclosure. Subject to further legal review and statutory or regulatory clarification, which Pfizer intends to pursue, reimbursement of recruiting expenses for licensed physicians may constitute a reportable transfer of value under the federal transparency law commonly known as the Sunshine Act. Therefore, if you are a licensed physician who incurs recruiting expenses as a result of interviewing with Pfizer that we pay or reimburse, your name, address and the amount of payments made currently will be reported to the government. If you have questions regarding this matter, please do not hesitate to contact your Talent Acquisition representative.

EEO & Employment Eligibility
Pfizer is committed to equal opportunity in the terms and conditions of employment for all employees and job applicants without regard to race, color, religion, sex, sexual orientation, age, gender identity or gender expression, national origin, disability or veteran status. Pfizer also complies with all applicable national, state and local laws governing nondiscrimination in employment as well as work authorization and employment eligibility verification requirements of the Immigration and Nationality Act and IRCA. Pfizer is an E-Verify employer. This position requires permanent work authorization in the United States.
Information & Business Tech
Posted 1 week ago
5.0 years
0 Lacs
Pune, Maharashtra, India
Remote
It was nice to come across your profile on the portal. One of our top MNC clients has a critical opening for a C++ Software Engineer in Pune. Please apply with relevant profiles; candidates should have trading platform experience.

Required Skill: C++ Software Engineer
Years of Experience: 5 to 10 years
CTC: Can be discussed
Notice Period: Immediate joiners, 15-20 days, or can be discussed
Work Location: Pune
Working Model: Hybrid
Interview: Online

Job Description
We are currently seeking a driven and talented C++ Software Engineer to join our team. Our Software Engineers thrive on pushing the limits of technology to produce state-of-the-art applications for TT's platform, which is the front-end screen of choice for professional derivatives traders around the world. As a Software Engineer you will work on our award-winning trading platform, which incorporates robust, high-performance tools for spread trading, strategy creation, algorithmic and automated trading, black-box execution, high-frequency proximity-based trading and more.

What Will You Be Involved With?
Code day to day in C++ and other programming languages
Design and implement software requirements and new product features
Enhance and maintain existing functionality
Participate in design discussions and review sessions
Create high-level and detailed design documents
Assist with product documentation, unit testing and ensuring overall product quality
Support, maintain, and enhance existing and new product functionality for trading software in a real-time, multi-threaded, multi-tier server architecture environment, creating high- and low-level designs for concurrent, high-throughput, low-latency software architecture
Provide software development plans that meet the future needs of clients and markets
Evolve the new software platform and architecture by introducing new components and integrating them with existing ones
Perform memory and resource management analysis
Analyze stack traces, core dumps and production incident reports from traders and support teams
Propose fixes and enhancements to existing trading systems
Adhere to release and sprint planning with the Quality Assurance Group and Project Management
Attend and participate in daily scrum meetings
Design, develop, and program server-side software components

What Will You Bring to the Table?
A minimum of 5 years of solid modern C++ development experience and the ability to understand, write, review and debug multithreaded code is required
Proven experience in multi-threaded applications with a focus on performance is required
Experience in the trading industry (specifically market data and algorithmic trading) is strongly preferred
Experience with Linux operating systems is a plus
Knowledge of Python is a plus
Knowledge of Scala is a plus
Experience with financial trading systems is a plus but not required
Strong object-oriented design and programming skills
Ability to understand business requirements and translate them into technical requirements and working application code
Familiarity with agile/iterative development methodologies
Solid debugging and performance tuning skills

What We Bring to the Table?
Competitive benefits, including medical, dental, vision, FSA, HSA, 401(k) and pre-tax transit/parking
Flexible work schedules with some remote work
22 PTO (paid time off) days per year with the ability to roll over days into the following year; one day per year available for volunteering; 2 training days per year to allow uninterrupted professional development; 1 additional PTO day added during a milestone year; robust paid holiday schedule with early dismissal; generous parental leave (for all genders and staff, including adoptive parents); and backup child and pet care as well as tutoring services
Company-provided top-of-the-line tech resources and a tech accessories budget for monitors, headphones, keyboards and office equipment
Milestone anniversary bonuses
Stipend and subsidy contributions toward personally-owned cell phones and laptops, gym memberships, and health/wellness initiatives (including discounted healthcare premiums and healthy meal delivery programs)

If you are interested, kindly apply for this position with the above details so we can proceed further with our client. If you have any queries, we are available over email and calls. Thank you for your time.
With regards,
Arun
Integration Minds
+91-9036020999
This job is provided by Shine.com
Posted 1 week ago
7.0 - 12.0 years
25 - 30 Lacs
Hyderabad, Bengaluru
Hybrid
Cloud Data Engineer
The Cloud Data Engineer will be responsible for developing the data lake platform and all applications on the Azure cloud. Proficiency in data engineering, data modeling, SQL, and Python programming is essential. The Data Engineer will provide design and development solutions for applications in the cloud.

Essential Job Functions:
Understand requirements and collaborate with the team to design and deliver projects.
Design and implement data lakehouse projects within Azure.
Develop the application lifecycle utilizing Microsoft Azure technologies.
Participate in design, planning, and necessary documentation.
Engage in Agile ceremonies including daily standups, scrum, retrospectives, demos, and code reviews.
Apply hands-on experience with Python/SQL development and Azure data pipelines.
Collaborate with the team to develop and deliver cross-functional products.

Key Skills:
a. Data Engineering and SQL
b. Python
c. PySpark
d. Azure Data Lake and ADF
e. Databricks
f. CI/CD
g. Strong communication

Other Responsibilities:
Document and maintain project artifacts.
Maintain comprehensive knowledge of industry standards, methodologies, processes, and best practices.
Complete training as required for Privacy, Code of Conduct, etc.
Promptly report any known or suspected loss, theft, or unauthorized disclosure or use of PI to the General Counsel/Chief Compliance Officer or Chief Information Officer.
Adhere to the company's compliance program.
Safeguard the company's intellectual property, information, and assets.
Other duties as assigned.

Minimum Qualifications and Job Requirements:
Bachelor's degree in Computer Science.
7 years of hands-on experience in designing and developing distributed data pipelines.
5 years of hands-on experience with Azure data service technologies.
5 years of hands-on experience in Python, SQL, object-oriented programming, ETL, and unit testing.
Experience with data integration via APIs, web services, and queues.
Experience with Azure DevOps and CI/CD, as well as agile tools and processes including Jira and Confluence.
*Required: Azure Data Engineer Associate and Databricks data engineering certifications
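A pattern worth knowing for the unit-testing requirement above is keeping each pipeline transformation a pure DataFrame-to-DataFrame function, so it can be exercised without any storage or orchestration in place. A minimal sketch follows; it is written in Scala/Spark rather than PySpark only to keep the examples on this page in one language, and the column names are hypothetical:

```scala
import org.apache.spark.sql.{DataFrame, SparkSession}
import org.apache.spark.sql.expressions.Window
import org.apache.spark.sql.functions._

// Pure transformation: easy to unit test against small in-memory inputs
object Transformations {
  // Keep only the latest record per id, by updated_at
  def dedupeLatest(df: DataFrame): DataFrame = {
    val w = Window.partitionBy("id").orderBy(col("updated_at").desc)
    df.withColumn("rn", row_number().over(w))
      .filter(col("rn") === 1)
      .drop("rn")
  }
}

object QuickCheck extends App {
  val spark = SparkSession.builder().appName("check").master("local[*]").getOrCreate()
  import spark.implicits._

  val input = Seq(("a", 1L), ("a", 2L), ("b", 1L)).toDF("id", "updated_at")
  Transformations.dedupeLatest(input).show()
  // Expected: one row per id, keeping the highest updated_at
  spark.stop()
}
```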
Posted 1 week ago
3.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Description & Requirements
Electronic Arts creates next-level entertainment experiences that inspire players and fans around the world. Here, everyone is part of the story. Part of a community that connects across the globe. A place where creativity thrives, new perspectives are invited, and ideas matter. A team where everyone makes play happen.

Software Engineer II - AI/ML Engineer
The EA Digital Platform (EADP) group is the core powering the global EA ecosystem. We provide the foundation for all of EA's incredible games and player experiences with high-level platforms like Cloud, Commerce, Data and AI, Gameplay Services, Identity and Social. By providing reusable capabilities that game teams can easily integrate into their work, we let them focus on making some of the best games in the world and creating meaningful relationships with our players. We're behind the curtain, making it all work together. Come power the future of play with us.

The Challenge Ahead
We are looking for developers who want to work on a large-scale distributed data system that empowers EA Games to personalize player experience and engagement.

Responsibilities
You will help design, implement and optimize the infrastructure for the AI model training and deployment platform
You will help integrate AI capabilities into existing software systems and applications
You will develop tools and systems to monitor the performance of the platform in real time, analyze key metrics, and proactively identify and address any issues or opportunities for improvement
You will participate in code reviews to maintain code quality and ensure best practices
You will help with feature and operational enhancements for the platform under senior guidance
You will help improve the stability and observability of the platform

Qualifications
Bachelor's degree or foreign degree equivalent in Computer Science, Electrical Engineering, or a related field
3+ years of experience with software development and model development
Experience with a programming language such as Go, Java or Scala
Experience with scripting languages such as bash, awk, and Python
Experience with scikit-learn, pandas, and Matplotlib
3+ years of experience with deep learning frameworks like PyTorch, TensorFlow, and CUDA
Hands-on experience with an ML platform (SageMaker, Azure ML, GCP Vertex AI)
Experience with cloud services and modern data technologies
Experience with data streaming and processing systems

About Electronic Arts
We're proud to have an extensive portfolio of games and experiences, locations around the world, and opportunities across EA. We value adaptability, resilience, creativity, and curiosity. From leadership that brings out your potential, to creating space for learning and experimenting, we empower you to do great work and pursue opportunities for growth. We adopt a holistic approach to our benefits programs, emphasizing physical, emotional, financial, career, and community wellness to support a balanced life. Our packages are tailored to meet local needs and may include healthcare coverage, mental well-being support, retirement savings, paid time off, family leaves, complimentary games, and more. We nurture environments where our teams can always bring their best to what they do. Electronic Arts is an equal opportunity employer.
All employment decisions are made without regard to race, color, national origin, ancestry, sex, gender, gender identity or expression, sexual orientation, age, genetic information, religion, disability, medical condition, pregnancy, marital status, family status, veteran status, or any other characteristic protected by law. We will also consider for employment qualified applicants with criminal records in accordance with applicable law. EA also makes workplace accommodations for qualified individuals with disabilities as required by applicable law.
Posted 1 week ago
3.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
About Simpleenergy
Simpleenergy specializes in the manufacture of smart electric two-wheelers. We are a team of 300+ engineers coming together to make smart, supercharging, and affordable two-wheelers. The company was founded in 2019 and is based in Bangalore, India. Our mission is to build a future of mobility that is electric and connected. We are working to accelerate that shift by making electric vehicles more accessible, affordable, secure and comfortable, and we embrace the responsibility to lead the change that will make our world better, safer and more equitable for all.

Job description: Data Engineer
Location: Yelahanka, Bangalore

About The Gig
We're on the lookout for a Data Engineer who loves building scalable data pipelines and can dance with Kafka and Flink like they're on their playlist. If Spark is your old buddy, even better, but it's not a deal-breaker.

What You'll Do
Design, build, and maintain real-time and batch data pipelines using Apache Kafka and Apache Flink.
Ensure high-throughput, low-latency, and fault-tolerant data ingestion for telemetry, analytics, and system monitoring.
Work closely with backend and product teams to define event contracts and data models.
Maintain schema consistency and versioning across high-volume event streams.
Optimize Flink jobs for memory, throughput, and latency.
If you know a little Spark, help out with batch processing and offline analytics too (we won't complain).
Ensure data quality, lineage, and observability for everything that flows through your pipelines.

What You Bring
3+ years of experience as a data/backend engineer working with real-time or streaming systems.
Hands-on experience with Kafka (topics, partitions, consumers, etc.).
Experience writing production-grade Flink jobs (DataStream API preferred).
Good fundamentals in distributed systems, partitioning strategies, and stateful processing.
Comfort with any one programming language: Java, Scala, or Python.
Basic working knowledge of Spark is a plus (optional, but nice to have).
Comfort working in a cloud-native environment (GCP or AWS).

🎁 Bonus Points
Experience with Protobuf/Avro schemas and a schema registry.
Exposure to time-series data (we live and breathe CAN signals).
Interest in vehicle data, IoT, or edge computing.

Why Simpleenergy?
You'll build pipelines that move billions of records a day from electric vehicles across India.
You'll be part of a lean, fast-moving team where decisions happen fast and learning is constant.
Your code will directly impact how we track, monitor, and improve our vehicles on the road.
Zero fluff. Full impact.
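For a taste of the consumer side of the Kafka work described above, here is a minimal sketch in Scala using the official Kafka Java client. The broker address, consumer group, and "vehicle-telemetry" topic are hypothetical:

```scala
import java.time.Duration
import java.util.{Collections, Properties}
import org.apache.kafka.clients.consumer.{ConsumerConfig, KafkaConsumer}
import scala.jdk.CollectionConverters._

object TelemetryConsumer extends App {
  val props = new Properties()
  props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092") // hypothetical broker
  props.put(ConsumerConfig.GROUP_ID_CONFIG, "telemetry-readers")       // hypothetical group
  props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG,
    "org.apache.kafka.common.serialization.StringDeserializer")
  props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG,
    "org.apache.kafka.common.serialization.StringDeserializer")
  props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest")

  val consumer = new KafkaConsumer[String, String](props)
  consumer.subscribe(Collections.singletonList("vehicle-telemetry")) // hypothetical topic

  try {
    while (true) {
      // Poll a batch; a real job would deserialize, validate, and forward each record
      val records = consumer.poll(Duration.ofMillis(500)).asScala
      records.foreach(r => println(s"partition=${r.partition} offset=${r.offset} value=${r.value}"))
    }
  } finally consumer.close()
}
```

In production, a Flink or Spark job would usually sit on top of such a topic instead of a hand-rolled loop, gaining checkpointing and stateful processing for free.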
Posted 1 week ago
4.0 years
0 Lacs
Kochi, Kerala, India
On-site
Introduction
In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.

Your Role And Responsibilities
As a Data Engineer, you will develop, maintain, evaluate and test big data solutions. You will be involved in the development of data solutions using the Spark framework with Python or Scala on Hadoop and the AWS Cloud Data Platform.

Responsibilities
Build data pipelines to ingest, process, and transform data from files, streams and databases.
Process data with Spark, Python, PySpark, Scala, and Hive, HBase or other NoSQL databases on Cloud Data Platforms (AWS) or HDFS.
Develop efficient software code for multiple use cases, leveraging the Spark framework with Python or Scala and Big Data technologies built on the platform.
Develop streaming pipelines.
Work with Hadoop/AWS ecosystem components to implement scalable solutions that meet ever-increasing data volumes, using big data and cloud technologies such as Apache Spark and Kafka.

Preferred Education
Master's Degree

Required Technical And Professional Expertise
Minimum 4+ years of experience in Big Data technologies with extensive data engineering experience in Spark with Python or Scala.
Minimum 3 years of experience on Cloud Data Platforms on AWS.
Experience with AWS EMR / AWS Glue / Databricks, Amazon Redshift, and DynamoDB.
Good to excellent SQL skills.
Exposure to streaming solutions and message brokers like Kafka.

Preferred Technical And Professional Experience
Certification in AWS and Databricks, or Cloudera Certified Spark Developer.
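The streaming-pipeline experience this listing asks for often takes the shape of Spark Structured Streaming reading from Kafka. A minimal sketch in Scala follows; it assumes the spark-sql-kafka connector is on the classpath, and the broker, topic, and S3 paths are hypothetical:

```scala
import org.apache.spark.sql.SparkSession

object StreamIngest extends App {
  val spark = SparkSession.builder().appName("kafka-stream-ingest").getOrCreate()

  // Read a Kafka topic as an unbounded streaming DataFrame
  val stream = spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092") // hypothetical broker
    .option("subscribe", "events")                    // hypothetical topic
    .load()

  // Kafka values arrive as binary; cast to string before parsing downstream
  val values = stream.selectExpr("CAST(value AS STRING) AS json", "timestamp")

  // Write micro-batches to a data lake path; checkpointing makes the sink restartable
  val query = values.writeStream
    .format("parquet")
    .option("path", "s3a://my-bucket/events/")                    // hypothetical bucket
    .option("checkpointLocation", "s3a://my-bucket/checkpoints/events/")
    .start()

  query.awaitTermination()
}
```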
Posted 1 week ago
10.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Position Overview:
ShyftLabs is seeking an experienced Databricks Architect to lead the design, development, and optimization of big data solutions using the Databricks Unified Analytics Platform. This role requires deep expertise in Apache Spark, SQL, Python, and cloud platforms (AWS/Azure/GCP). The ideal candidate will collaborate with cross-functional teams to architect scalable, high-performance data platforms and drive data-driven innovation.
ShyftLabs is a growing data product company that was founded in early 2020 and works primarily with Fortune 500 companies. We deliver digital solutions built to accelerate business growth across various industries by focusing on creating value through innovation.

Job Responsibilities
Architect, design, and optimize big data and AI/ML solutions on the Databricks platform.
Develop and implement highly scalable ETL pipelines for processing large datasets.
Lead the adoption of Apache Spark for distributed data processing and real-time analytics.
Define and enforce data governance, security policies, and compliance standards.
Optimize data lakehouse architectures for performance, scalability, and cost-efficiency.
Collaborate with data scientists, analysts, and engineers to enable AI/ML-driven insights.
Oversee and troubleshoot Databricks clusters, jobs, and performance bottlenecks.
Automate data workflows using CI/CD pipelines and infrastructure-as-code practices.
Ensure data integrity, quality, and reliability across all data processes.

Basic Qualifications:
Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
10+ years of hands-on experience in data engineering, with at least 5 years as a Databricks architect working with Apache Spark.
Proficiency in SQL, Python, or Scala for data processing and analytics.
Extensive experience with cloud platforms (AWS, Azure, or GCP) for data engineering.
Strong knowledge of ETL frameworks, data lakes, and the Delta Lake architecture.
Hands-on experience with CI/CD tools and DevOps best practices.
Familiarity with data security, compliance, and governance best practices.
Strong problem-solving and analytical skills in a fast-paced environment.

Preferred Qualifications:
Databricks certifications (e.g., Databricks Certified Data Engineer, Spark Developer).
Hands-on experience with MLflow, Feature Store, or Databricks SQL.
Exposure to Kubernetes, Docker, and Terraform.
Experience with streaming data architectures (Kafka, Kinesis, etc.).
Strong understanding of business intelligence and reporting tools (Power BI, Tableau, Looker).
Prior experience working with retail, e-commerce, or ad-tech data platforms.

We are proud to offer a competitive salary alongside a strong insurance package. We pride ourselves on the growth of our employees, offering extensive learning and development resources.
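Much of the Delta Lake work behind a lakehouse role like this revolves around MERGE-based upserts. Here is a minimal sketch in Scala; the table path and columns are hypothetical, and it assumes the delta-spark library is available (Databricks runtimes ship with it preconfigured):

```scala
import io.delta.tables.DeltaTable
import org.apache.spark.sql.SparkSession

object DeltaUpsert extends App {
  val spark = SparkSession.builder()
    .appName("delta-upsert")
    .master("local[*]")
    // Delta Lake needs these settings outside Databricks runtimes
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog", "org.apache.spark.sql.delta.catalog.DeltaCatalog")
    .getOrCreate()
  import spark.implicits._

  val path = "/tmp/delta/customers" // hypothetical table location

  // Seed the table on first run
  Seq((1, "alice"), (2, "bob")).toDF("id", "name")
    .write.format("delta").mode("overwrite").save(path)

  // Upsert: update rows whose id matches, insert the rest
  val updates = Seq((2, "robert"), (3, "carol")).toDF("id", "name")
  DeltaTable.forPath(spark, path).as("t")
    .merge(updates.as("u"), "t.id = u.id")
    .whenMatched.updateAll()
    .whenNotMatched.insertAll()
    .execute()

  spark.read.format("delta").load(path).show()
  spark.stop()
}
```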
Posted 1 week ago
5.0 years
0 Lacs
Ahmedabad, Gujarat, India
Remote
About Qualitrol
Qualitrol is a leader in providing condition monitoring solutions for the electricity industry, ensuring reliability and efficiency in high-voltage electrical assets. We leverage cutting-edge technology, data analytics, and AI to transform how utilities manage their assets and make data-driven decisions.

Role Summary
We are looking for a highly skilled Senior Data Engineer to join our team and drive the development of our data engineering capabilities. This role involves designing, developing, and maintaining scalable data pipelines, optimizing data infrastructure, and ensuring high-quality data for analytics and AI-driven solutions. The ideal candidate will have deep expertise in data modeling, cloud-based data platforms, and best practices in data engineering.

Key Responsibilities
Design, develop, and optimize scalable ETL/ELT pipelines for large-scale industrial data.
Architect and maintain data warehouses, lakes, and streaming solutions to support analytics and AI-driven insights.
Implement data governance, security, and quality best practices to ensure data integrity and compliance.
Work closely with Data Scientists, AI Engineers, and Software Developers to build robust data solutions.
Optimize data infrastructure performance for real-time and batch processing.
Leverage cloud-based technologies (AWS, Azure, GCP) to develop and deploy scalable data solutions.
Develop and maintain APIs and data access layers for seamless integration across platforms.
Collaborate with cross-functional teams to define and implement data strategy and architecture.
Stay up to date with emerging data engineering technologies and best practices.

Required Qualifications & Experience
5+ years of experience in data engineering, software development, or related fields.
Proficiency in programming languages such as Python, Scala, or Java.
Expertise in SQL and database technologies (PostgreSQL, MySQL, NoSQL, etc.).
Hands-on experience with big data technologies (e.g., Spark, Kafka, Hadoop).
Strong understanding of data warehousing (e.g., Snowflake, Redshift, BigQuery) and data lake architectures.
Experience with cloud platforms (AWS, Azure, or GCP) and cloud-native data solutions.
Knowledge of CI/CD pipelines, DevOps, and infrastructure as code (Terraform, Kubernetes, Docker).
Familiarity with MLOps and AI-driven data workflows is a plus.
Strong problem-solving skills, ability to work independently, and excellent communication skills.

Preferred Qualifications
Experience in the electricity, utilities, or industrial sectors.
Knowledge of IoT data ingestion and edge computing.
Familiarity with GraphQL and RESTful API development.
Experience with data visualization and business intelligence tools (Power BI, Tableau, etc.).
Contributions to open-source data engineering projects.

What We Offer
Competitive salary and performance-based incentives.
Comprehensive benefits package, including health, dental, and retirement plans.
Opportunities for career growth and professional development.
A dynamic work environment focused on innovation and cutting-edge technology.
Hybrid/remote work flexibility (depending on location and project needs).

How To Apply
Interested candidates should submit their resume and a cover letter detailing their experience and qualifications.

Fortive Corporation Overview
Fortive's essential technology makes the world stronger, safer, and smarter. We accelerate transformation across a broad range of applications including environmental, health and safety compliance, industrial condition monitoring, next-generation product design, and healthcare safety solutions. We are a global industrial technology innovator with a startup spirit. Our forward-looking companies lead the way in software-powered workflow solutions, data-driven intelligence, AI-powered automation, and other disruptive technologies. We're a force for progress, working alongside our customers and partners to solve challenges on a global scale, from workplace safety in the most demanding conditions to groundbreaking sustainability solutions. We are a diverse team 17,000 strong, united by a dynamic, inclusive culture and energized by limitless learning and growth. We use the proven Fortive Business System (FBS) to accelerate our positive impact. At Fortive, we believe in you. We believe in your potential: your ability to learn, grow, and make a difference. At Fortive, we believe in us. We believe in the power of people working together to solve problems no one could solve alone. At Fortive, we believe in growth. We're honest about what's working and what isn't, and we never stop improving and innovating. Fortive: For you, for us, for growth.

About Qualitrol
QUALITROL manufactures monitoring and protection devices for high-value electrical assets and OEM manufacturing companies. Established in 1945, QUALITROL produces thousands of different types of products on demand, customized to meet our individual customers' needs. We are the largest and most trusted global leader for partial discharge monitoring, asset protection equipment and information products across power generation, transmission, and distribution. At Qualitrol, we are redefining condition-based monitoring.

We Are an Equal Opportunity Employer
Fortive Corporation and all Fortive Companies are proud to be equal opportunity employers. We value and encourage diversity and solicit applications from all qualified applicants without regard to race, color, national origin, religion, sex, age, marital status, disability, veteran status, sexual orientation, gender identity or expression, or other characteristics protected by law. Fortive and all Fortive Companies are also committed to providing reasonable accommodations for applicants with disabilities. Individuals who need a reasonable accommodation because of a disability for any part of the employment application process, please contact us at applyassistance@fortive.com.

Bonus or Equity
This position is also eligible for a bonus as part of the total compensation package.
Posted 1 week ago
Scala is a popular programming language that is widely used in India, especially in the tech industry. Job seekers looking for opportunities in Scala can find a variety of roles across different cities in the country. In this article, we will dive into the Scala job market in India and provide valuable insights for job seekers.
Cities such as Bangalore, Hyderabad, Pune, Chennai, and Mumbai, the locations that dominate the listings above, are known for their thriving tech ecosystems and have a high demand for Scala professionals.
The salary range for Scala professionals in India varies based on experience levels. Entry-level Scala developers can expect to earn around INR 6-8 lakhs per annum, while experienced professionals with 5+ years of experience can earn upwards of INR 15 lakhs per annum.
In the Scala job market, a typical career path may look like:
- Junior Developer
- Scala Developer
- Senior Developer
- Tech Lead
As professionals gain more experience and expertise in Scala, they can progress to higher roles with increased responsibilities.
In addition to Scala expertise, employers often look for candidates with the following skills:
- Java
- Spark
- Akka
- Play Framework
- Functional programming concepts
Having a good understanding of these related skills can enhance a candidate's profile and increase their chances of landing a Scala job.
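To make that last point concrete, here is a short, self-contained Scala sketch of the functional programming fundamentals interviewers commonly probe: higher-order functions, pattern matching over sealed types, and tail recursion. The types and values are illustrative only:

```scala
import scala.annotation.tailrec

object InterviewWarmup extends App {
  // Higher-order functions over immutable collections
  val nums = List(1, 2, 3, 4, 5)
  val evenSquares = nums.filter(_ % 2 == 0).map(n => n * n) // List(4, 16)

  // Pattern matching with a sealed trait and case classes
  sealed trait Shape
  final case class Circle(r: Double) extends Shape
  final case class Rect(w: Double, h: Double) extends Shape

  def area(s: Shape): Double = s match {
    case Circle(r)  => math.Pi * r * r
    case Rect(w, h) => w * h
  }

  // Tail recursion: the annotation makes the compiler verify constant stack usage
  @tailrec
  def factorial(n: Int, acc: BigInt = 1): BigInt =
    if (n <= 1) acc else factorial(n - 1, acc * n)

  println(evenSquares)
  println(area(Circle(1.0)))
  println(factorial(20))
}
```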
Here are 25 interview questions that you may encounter when applying for Scala roles:
As you explore Scala jobs in India, remember to showcase your expertise in Scala and related skills during interviews. Prepare well, stay confident, and you'll be on your way to a successful career in Scala. Good luck!