2.0 - 6.0 years
0 Lacs
karnataka
On-site
We have an exciting and rewarding opportunity for you to take your software engineering career to the next level. As a Software Engineer III - Big Data/Java/Scala at JPMorgan Chase within the Liquidity Risk (LRI) team, you will design and implement the next-generation build-out of a cloud-native liquidity risk management platform for JPMC. The Liquidity Risk technology organization aims to provide comprehensive solutions for managing the firm's liquidity risk and meeting our regulatory reporting obligations across 50+ markets. The program will include the strategic build-out of advanced liquidity calculation engines, incorporate AI and ML into our liquidity risk processes, and bring digital-first reporting capabilities. The target platform must process 40-60 million transactions and positions daily, calculate the risk presented by both the current actual state and model-based what-if states of the market, build a multidimensional picture of the corporate risk profile, and provide the ability to analyze it in real time.

Job Responsibilities:
- Executes standard software solutions, design, development, and technical troubleshooting.
- Applies knowledge of tools within the Software Development Life Cycle toolchain to improve the value realized by automation.
- Gathers, analyzes, and draws conclusions from large, diverse data sets to identify problems and contribute to decision-making in service of secure, stable application development.
- Learns and applies system processes, methodologies, and skills for the development of secure, stable code and systems.
- Adds to a team culture of diversity, equity, inclusion, and respect.
- Contributes to the team's drive for continual improvement of the development process and innovative solutions to meet business needs.
- Applies appropriate dedication to supporting business goals through technology solutions.

Required Qualifications, Capabilities, and Skills:
- Formal training or certification in software engineering concepts and 2+ years of applied experience.
- Hands-on development experience and in-depth knowledge of Java, Scala, Spark, and related Big Data technologies.
- Hands-on practical experience in system design, application development, testing, and operational stability.
- Experience in cloud technologies (AWS).
- Experience across the whole Software Development Life Cycle.
- Exposure to agile methodologies such as CI/CD, Application Resiliency, and Security.
- Emerging knowledge of software applications and technical processes within a technical discipline.
- Ability to work closely with stakeholders to define requirements.
- Experience interacting with partners across feature teams to collaborate on reusable services that meet solution requirements.

Preferred Qualifications, Capabilities, and Skills:
- Experience working on big data solutions, with evidence of the ability to analyze data to drive solutions.
- Exposure to complex computing using the JVM and Big Data.
- Ability to find issues and optimize an existing workflow.
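The calculation-engine description above (net positions from daily transactions, plus model-based what-if views) can be sketched at toy scale. This is a hypothetical illustration in plain Python — the role's actual stack is Java/Scala/Spark at far larger volume, and every field name here is invented:

```python
from collections import defaultdict

def net_positions(transactions):
    """Aggregate signed transaction amounts into net positions keyed by
    (entity, currency) — a toy stand-in for the daily aggregation a
    liquidity calculation engine performs over millions of records."""
    totals = defaultdict(float)
    for txn in transactions:
        totals[(txn["entity"], txn["currency"])] += txn["amount"]
    return dict(totals)

def what_if(positions, shock):
    """Apply a model-based 'what-if' multiplier per currency to the
    actual positions, returning the shocked view of the book."""
    return {key: amt * shock.get(key[1], 1.0) for key, amt in positions.items()}

txns = [
    {"entity": "LDN", "currency": "USD", "amount": 120.0},
    {"entity": "LDN", "currency": "USD", "amount": -20.0},
    {"entity": "NYC", "currency": "EUR", "amount": 50.0},
]
actual = net_positions(txns)
shocked = what_if(actual, {"USD": 0.9})  # stress USD positions down 10%
```

A real engine would run the same shape of computation as a distributed Spark aggregation rather than an in-memory dictionary.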
Posted 1 week ago
4.0 - 8.0 years
0 Lacs
pune, maharashtra
On-site
As a Data Platform developer at Barclays, you will play a crucial role in shaping the digital landscape and enhancing customer experiences. Leveraging cutting-edge technology, you will work alongside a team of engineers, business analysts, and stakeholders to deliver high-quality solutions that meet business requirements. Your responsibilities will include tackling complex technical challenges, building efficient data pipelines, and staying updated on the latest technologies to continuously enhance your skills. To excel in this role, you should have hands-on coding experience in Python, along with a strong understanding and practical experience in AWS development. Experience with tools such as Lambda, Glue, Step Functions, IAM roles, and various AWS services will be essential. Additionally, your expertise in building data pipelines using Apache Spark and AWS services will be highly valued. Strong analytical skills, troubleshooting abilities, and a proactive approach to learning new technologies are key attributes for success in this role. Furthermore, experience in designing and developing enterprise-level software solutions, knowledge of different file formats like JSON, Iceberg, Avro, and familiarity with streaming services such as Kafka, MSK, Kinesis, and Glue Streaming will be advantageous. Effective communication and collaboration skills are essential to interact with cross-functional teams and document best practices. Your role will involve developing and delivering high-quality software solutions, collaborating with various stakeholders to define requirements, promoting a culture of code quality, and staying updated on industry trends. Adherence to secure coding practices, implementation of effective unit testing, and continuous improvement are integral parts of your responsibilities. 
As a Data Platform developer, you will be expected to lead and supervise a team, guide professional development, and ensure the delivery of work to a consistently high standard. Your impact will extend to related teams within the organization, and you will be responsible for managing risks, strengthening controls, and contributing to the achievement of organizational objectives. Ultimately, you will be part of a team that upholds Barclays' values of Respect, Integrity, Service, Excellence, and Stewardship, while embodying the Barclays Mindset of Empower, Challenge, and Drive in your daily interactions and work ethic.
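The pipeline skills this posting lists (AWS Glue/Lambda, Spark, JSON handling) revolve around steps like the one below — a minimal, dependency-free sketch of validating newline-delimited JSON before loading. The schema fields are hypothetical, not Barclays'; a real implementation would run this inside a Glue job or Spark stage:

```python
import json

def parse_events(lines):
    """Parse newline-delimited JSON events, splitting them into
    well-formed records and rejects — the validation step that
    typically sits at the front of an ingestion pipeline."""
    good, bad = [], []
    for line in lines:
        try:
            record = json.loads(line)
        except json.JSONDecodeError:
            bad.append(line)
            continue
        # Require the fields downstream jobs depend on (hypothetical schema).
        if {"event_id", "ts"} <= record.keys():
            good.append(record)
        else:
            bad.append(line)
    return good, bad

raw = ['{"event_id": 1, "ts": "2024-01-01"}', "not json", '{"ts": "2024-01-02"}']
good, bad = parse_events(raw)
```

Rejected lines would normally be routed to a dead-letter location rather than silently dropped.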
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
sonipat, haryana
On-site
As a Data Engineer and Subject Matter Expert in Data Mining at Newton School of Technology, you will play a crucial role in revolutionizing technology education and empowering students to bridge the employability gap in the tech industry. You will have the opportunity to develop and deliver engaging lectures, mentor students, and contribute to the academic and research environment of the Computer Science Department. Your key responsibilities will include developing comprehensive lectures for Data Mining, Big Data, and Data Analytics courses, covering foundational concepts through advanced techniques. You will guide students on the complete data lifecycle, including preprocessing, cleaning, transformation, and feature engineering. Teaching a wide range of algorithms for Classification, Association Rule Mining, Clustering, and Anomaly Detection will be part of your role. Moreover, you will design practical lab sessions, grade assessments, mentor students on projects, and stay updated with the latest advancements in data engineering and machine learning to ensure the curriculum remains cutting-edge. To excel in this role, you are required to have a Ph.D. or a Master's degree with significant industry experience in Computer Science, Data Science, Artificial Intelligence, or related fields. Your expertise in data engineering and machine learning concepts, proficiency in Python and its data science ecosystem, experience in teaching complex topics at the undergraduate level, and excellent communication skills are essential qualifications. Preferred qualifications include a record of academic publications, industry experience as a Data Scientist or in a similar role, familiarity with big data technologies and deep learning frameworks, and experience in mentoring student teams for data science competitions or hackathons.
By joining Newton School of Technology, you will be offered competitive salary packages, access to advanced labs and facilities, and the opportunity to be part of a forward-thinking academic team shaping the future of tech education. If you are passionate about transforming technology education, empowering students, and staying at the forefront of data engineering and machine learning, we are excited about the possibility of you joining our team at Newton School of Technology. For more information about our university, please visit our website: Newton School of Technology.
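As a flavor of the Association Rule Mining topic in the syllabus above, here is a short lab-style fragment computing support and confidence for a rule in plain Python (the basket data is invented for illustration):

```python
def support(transactions, itemset):
    """Fraction of transactions containing every item in `itemset`."""
    itemset = set(itemset)
    hits = sum(1 for t in transactions if itemset <= set(t))
    return hits / len(transactions)

def confidence(transactions, antecedent, consequent):
    """Confidence of the rule antecedent -> consequent:
    support(antecedent ∪ consequent) / support(antecedent)."""
    joint = set(antecedent) | set(consequent)
    return support(transactions, joint) / support(transactions, antecedent)

baskets = [
    {"bread", "milk"},
    {"bread", "butter"},
    {"bread", "milk", "butter"},
    {"milk"},
]
s = support(baskets, {"bread", "milk"})       # 2 of 4 baskets
c = confidence(baskets, {"bread"}, {"milk"})  # 0.5 / 0.75
```

Algorithms such as Apriori build on exactly these two quantities by pruning itemsets below a minimum support threshold.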
Posted 1 week ago
10.0 - 15.0 years
20 - 30 Lacs
Hyderabad, Bengaluru, Delhi / NCR
Hybrid
Role: Lead Data Engineer
Exp.: 10+ years
Location: Pune, Bengaluru, Hyderabad, Chennai, Gurugram, Noida
Work Mode: Hybrid (3 days work from office)
Key Skills: Snowflake, SQL, Data Engineering, ETL, Any Cloud (GCP/AWS/Azure)

Must Have Skills:
- Proficient in Snowflake and SQL
- 4+ years of experience in Snowflake and 8+ years of experience in SQL
- At least 10 years of experience in data engineering development projects
- At least 6 years of experience in Data Engineering on cloud technology
- Strong expertise with the Snowflake data warehouse platform, including architecture, features, and best practices
- Hands-on experience with ETL and DE tools
- Design, develop, and maintain efficient ETL/ELT pipelines using Snowflake and related data engineering tools
- Optimize Snowflake data warehouses for performance, cost, and scalability
- Collaborate with data scientists, analysts, and business stakeholders to understand data requirements and deliver data solutions
- Implement data modeling and schema design best practices in Snowflake
- Good communication skills are a must

Good to Have Skills:
- Knowledge of the DNA/Fiserv core banking system
- Knowledge of data governance, security, and compliance standards
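The ETL/ELT pipeline bullets above commonly boil down to incremental upserts. As a hedged sketch, the helper below renders a Snowflake-style MERGE statement in Python; the table and column names are illustrative, and a real pipeline would execute the statement through the Snowflake connector rather than build it as a string:

```python
def build_merge(target, staging, key, columns):
    """Render a Snowflake-style MERGE for an incremental upsert from a
    staging table into a target table. Names here are illustrative."""
    set_clause = ", ".join(f"t.{c} = s.{c}" for c in columns)
    cols = ", ".join([key] + columns)
    vals = ", ".join(f"s.{c}" for c in [key] + columns)
    return (
        f"MERGE INTO {target} t USING {staging} s ON t.{key} = s.{key} "
        f"WHEN MATCHED THEN UPDATE SET {set_clause} "
        f"WHEN NOT MATCHED THEN INSERT ({cols}) VALUES ({vals})"
    )

sql = build_merge("analytics.customers", "stage.customers", "id", ["name", "tier"])
```

In production, parameters would come from pipeline configuration and identifiers would be validated, never interpolated from untrusted input.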
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
karnataka
On-site
As an Infoscion, your primary responsibility is to interact with clients to address quality assurance issues and ensure the utmost customer satisfaction. You will be involved in understanding requirements, creating and reviewing designs, validating architecture, and delivering high-quality service offerings in the technology domain. Participating in project estimation, providing inputs for solution delivery, conducting technical risk planning, performing code reviews, and reviewing unit test plans are crucial aspects of your role. Leading and guiding your teams toward developing optimized code deliverables, continuous knowledge management, and adherence to organizational guidelines and processes are also key responsibilities. You will play a significant role in building efficient programs and systems. If you believe you have the skills to assist clients in their digital transformation journey, this is the ideal place for you to thrive. In addition to the primary responsibilities, you are expected to have knowledge of multiple technologies, a basic understanding of architecture and design fundamentals, and familiarity with testing tools and agile methodologies. Understanding project life cycle activities, estimation methodologies, quality processes, and business domains is essential. Analytical abilities, strong technical skills, good communication skills, and a deep understanding of technology and domains are also required. Furthermore, you should be able to demonstrate a solid understanding of software quality assurance principles, SOLID design principles, and modeling methods. Keeping abreast of the latest technologies and trends, and possessing excellent problem-solving, analytical, and debugging skills are highly valued. Preferred Skills: - Technology: Functional Programming - Scala
Posted 1 week ago
6.0 - 11.0 years
15 - 25 Lacs
Hyderabad, Pune, Bengaluru
Work from Office
Warm welcome from SP Staffing Services! Reaching out to you regarding a permanent opportunity!

Job Description:
Exp: 6-12 yrs
Location: Hyderabad/Bangalore/Pune/Gurgaon
Skill: GCP Data Engineer

Interested candidates can share their resume to sangeetha.spstaffing@gmail.com with the below details inline:
- Full Name as per PAN:
- Mobile No:
- Alt No/WhatsApp No:
- Total Exp:
- Relevant Exp in GCP:
- Relevant Exp in BigQuery:
- Relevant Exp in Big Data:
- Current CTC:
- Expected CTC:
- Notice Period (Official):
- Notice Period (Negotiable)/Reason:
- Date of Birth:
- PAN Number:
- Reason for Job Change:
- Offer in Pipeline (Current Status):
- Availability for virtual interview on weekdays between 10 AM-4 PM (please mention a time):
- Current Residential Location:
- Preferred Job Location:
- Is your educational % in 10th std, 12th std, and UG all above 50%?
- Do you have any gaps in your education or career? If so, please mention the duration in months/years:
Posted 1 week ago
6.0 - 11.0 years
0 - 0 Lacs
Bengaluru
Work from Office
- Experience across Enterprise BI/Big Data/DW/ETL technologies such as Teradata, Hadoop, Tableau, SAS, Hyperion, or Business Objects.
- Experience with data modelling patterns.
- Experience working within a Data Delivery Life Cycle framework.
- Experience leading discussions and presentations, and driving decisions across groups of stakeholders.
- Extensive experience in large enterprise environments handling large volumes of data under high Service Level Agreements.
- 5+ years' experience gained in a financial institution or insurance provider is desirable.
- Experience in development projects using Ab Initio, Snowflake, and AWS/Cloud Services.
- Recognized industry certifications.
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
karnataka
On-site
Salesforce is currently seeking software developers who are passionate about creating impactful solutions for users, the company, and the industry. Join a team of talented engineers to design and develop innovative features that enhance our CRM platform's stability and scalability. As a software engineer at Salesforce, you will be involved in architecture, design, implementation, and testing to ensure the delivery of high-quality products to our customers. We take pride in writing maintainable code that strengthens product stability and simplifies our work processes. Our team values individual strengths and encourages personal growth. By empowering autonomous teams, we aim to foster a culture of innovation and excellence that benefits both our employees and customers. **Your Impact** As a Senior Backend Software Engineer at Salesforce, your responsibilities will include: - Building new components to enhance our technology offerings in a dynamic market - Developing high-quality code for our cloud platform used by millions of users - Designing, implementing, and optimizing APIs and API framework features for scalability - Contributing to all phases of software development life cycle in a Hybrid Engineering model - Creating efficient components for a multi-tenant SaaS cloud environment - Conducting code reviews, mentoring junior engineers, and providing technical guidance **Required Skills:** - Proficiency in multiple programming languages and platforms - 5+ years of experience in backend software development, including designing distributed systems - Deep knowledge of object-oriented programming and scripting languages such as Java, Python, Scala, C#, Go, Node.JS, and C++ - Strong skills in PostgreSQL/SQL and experience with relational and non-relational databases - Understanding of software development best practices and leadership abilities - Degree or equivalent experience with relevant competencies **Preferred Skills:** - Experience with developing SAAS products 
on public cloud platforms like AWS, Azure, or GCP - Knowledge of Big Data/ML, S3, Kafka, Elastic Search, Terraform, Kubernetes, and Docker - Previous experience in a fast-paced, multinational organization **Benefits & Perks** - Comprehensive benefits package including well-being reimbursement, parental leave, adoption assistance, and more - Access to training resources on Trailhead.com - Mentorship opportunities with leadership and executive thought leaders - Volunteer programs and community engagement initiatives as part of our giving back model For further details, please visit [Salesforce Benefits Page](https://www.salesforcebenefits.com/).
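A recurring phrase in this posting is "multi-tenant SaaS cloud environment". As a toy, assumption-laden illustration of the core isolation rule — every read and write scoped by tenant so one customer's data never leaks into another's — here is a minimal in-memory sketch (not Salesforce's actual architecture):

```python
class TenantStore:
    """Toy multi-tenant key-value store: every read and write is scoped
    by tenant_id, the fundamental isolation rule in multi-tenant SaaS."""

    def __init__(self):
        self._rows = {}  # (tenant_id, key) -> value

    def put(self, tenant_id, key, value):
        self._rows[(tenant_id, key)] = value

    def get(self, tenant_id, key):
        # A tenant can only ever see rows stored under its own id.
        return self._rows.get((tenant_id, key))

store = TenantStore()
store.put("acme", "plan", "enterprise")
store.put("globex", "plan", "starter")
```

Real platforms enforce the same invariant at the query layer (every statement filtered by tenant) plus resource governance so one tenant cannot starve the others.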
Posted 1 week ago
5.0 - 8.0 years
0 - 3 Lacs
Pune, Chennai
Hybrid
Hello Connections, exciting opportunity alert! We're on the hunt for passionate individuals to join our dynamic team as Data Engineers.

Job Profile: Data Engineer
Experience: Minimum 5 to maximum 8 years
Location: Chennai / Pune
Mandatory Skills: Big Data | Hadoop | PySpark | Spark | Spark SQL | Hive
Qualification: B.TECH / B.E / MCA / Computer Science background - any specialization

How to Apply? Send your CV to: sipriyar@sightspectrum.in
Contact Number: 6383476138

Don't miss out on this amazing opportunity to accelerate your professional career! #bigdata #dataengineer #hadoop #spark #python #hive #pyspark
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
pune, maharashtra
On-site
As a Data Platform Engineer Lead at Barclays, your role is crucial in building and maintaining systems that collect, store, process, and analyze data, including data pipelines, data warehouses, and data lakes. Your responsibilities include ensuring the accuracy, accessibility, and security of all data. To excel in this role, you should have hands-on coding experience in Java or Python and a strong understanding of AWS development, encompassing services such as Lambda, Glue, Step Functions, IAM roles, and more. Proficiency in building efficient data pipelines using Apache Spark and AWS services is essential. You are expected to possess strong technical acumen, troubleshoot complex systems, and apply sound engineering principles to problem-solving. Continuous learning and staying updated with new technologies are key attributes for success in this role. Design experience in diverse projects where you have led the technical development is advantageous, especially in the Big Data/Data Warehouse domain within financial services. Additional skills in enterprise-level software solutions development, knowledge of different file formats like JSON, Iceberg, and Avro, and familiarity with streaming services such as Kafka, MSK, and Kinesis are highly valued. Effective communication, collaboration with cross-functional teams, documentation skills, and experience in mentoring team members are also important aspects of this role. Your accountabilities will include the construction and maintenance of data architecture pipelines, designing and implementing data warehouses and data lakes, developing processing and analysis algorithms, and collaborating with data scientists to deploy machine learning models. You will also be expected to contribute to strategy, drive requirements for change, manage resources and policies, deliver continuous improvements, and demonstrate leadership behaviors if in a leadership role.
Ultimately, as a Data Platform Engineer Lead at Barclays in Pune, you will play a pivotal role in ensuring data accuracy, accessibility, and security while leveraging your technical expertise and collaborative skills to drive innovation and excellence in data management.
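The accountability for "accuracy, accessibility, and security of all data" is often enforced as a quality gate a batch must pass before landing in the warehouse. A minimal sketch, assuming an invented schema and an arbitrary 10% null-rate threshold:

```python
def quality_gate(rows, required, max_null_rate=0.1):
    """Pass a batch only if every required column's null rate stays
    under the threshold. Column names and threshold are illustrative."""
    if not rows:
        return False, ["empty batch"]
    problems = []
    for col in required:
        nulls = sum(1 for r in rows if r.get(col) is None)
        rate = nulls / len(rows)
        if rate > max_null_rate:
            problems.append(f"{col}: null rate {rate:.0%}")
    return (not problems), problems

batch = [
    {"id": 1, "amount": 10.0},
    {"id": 2, "amount": None},
    {"id": 3, "amount": 5.0},
]
ok, problems = quality_gate(batch, required=["id", "amount"])
```

A failing batch would typically be quarantined and alerted on rather than loaded, keeping bad data out of downstream models and reports.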
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
haryana
On-site
About Impetus: Impetus Technologies is a digital engineering company dedicated to offering expert services and products that support enterprises in accomplishing their transformation objectives. Specializing in solving analytics, AI, and cloud challenges, we empower businesses to foster unparalleled innovation and expansion. Established in 1991, we stand out as leaders in cloud and data engineering, delivering cutting-edge solutions to Fortune 100 corporations. Our headquarters are located in Los Gatos, California, while our development centers span NOIDA, Indore, Gurugram, Bengaluru, Pune, and Hyderabad, boasting a global team of over 3000 professionals. Additionally, we have operational offices in Canada and Australia and maintain collaborative relationships with renowned organizations such as American Express, Bank of America, Capital One, Toyota, United Airlines, and Verizon.

Skills Required:
- Big Data
- PySpark
- Hive
- Spark Optimization

Good to have:
- GCP
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
karnataka
On-site
Salesforce has immediate opportunities for software developers who want their lines of code to have a significant and measurable positive impact for users, the company's bottom line, and the industry. You will be working with a group of world-class engineers to build the breakthrough features our customers will love, adopt, and use while keeping our trusted CRM platform stable and scalable. The software engineer role at Salesforce encompasses architecture, design, implementation, and testing to ensure we build products right and release them with high quality. We pride ourselves on writing high quality, maintainable code that strengthens the stability of the product and makes our lives easier. We embrace the hybrid model and celebrate the individual strengths of each team member while cultivating everyone on the team to grow into the best version of themselves. We believe that autonomous teams with the freedom to make decisions will empower the individuals, the product, the company, and the customers they serve to thrive. As a Senior Backend Software Engineer, your job responsibilities will include: - Building new and exciting components in an ever-growing and evolving market technology to provide scale and efficiency. - Developing high-quality, production-ready code that millions of users of our cloud platform can use. - Designing, implementing, and tuning robust APIs and API framework-related features that perform and scale in a multi-tenant environment. - Working in a Hybrid Engineering model and contributing to all phases of SDLC including design, implementation, code reviews, automation, and testing of the features. - Building efficient components/algorithms on a microservice multi-tenant SaaS cloud environment. - Conducting code reviews, mentoring junior engineers, and providing technical guidance to the team (depending on the seniority level). Required Skills: - Mastery of multiple programming languages and platforms. 
- 5+ years of backend software development experience, including designing and developing distributed systems at scale.
- Deep knowledge of object-oriented programming and other scripting languages: Java, Python, Scala, C#, Go, Node.js, and C++.
- Strong PostgreSQL/SQL skills and experience with relational and non-relational databases, including writing queries.
- A deep understanding of software development best practices and demonstrated leadership skills.
- Degree or equivalent relevant experience required. Experience will be evaluated based on the core competencies for the role (e.g. extracurricular leadership roles, military experience, volunteer roles, work experience, etc.).

Preferred Skills:
- Experience with developing SaaS products over public cloud infrastructure - AWS/Azure/GCP.
- Experience with Big Data/ML and S3.
- Hands-on experience with streaming technologies like Kafka.
- Experience with Elastic Search.
- Experience with Terraform, Kubernetes, Docker.
- Experience working in a fast-paced and rapidly growing multinational organization.

Benefits & Perks:
- Comprehensive benefits package including well-being reimbursement, generous parental leave, adoption assistance, fertility benefits, and more.
- World-class enablement and on-demand training with Trailhead.com.
- Exposure to executive thought leaders and regular 1:1 coaching with leadership.
- Volunteer opportunities and participation in our 1:1:1 model for giving back to the community.

For more details, visit [Salesforce Benefits](https://www.salesforcebenefits.com/).
Posted 2 weeks ago
5.0 - 12.0 years
0 Lacs
haryana
On-site
As a GCP Data Developer specializing in Big Data and ETL, you will be an integral part of our technology services client's team in Bangalore and Gurugram. With 5-12 years of experience in the field, your role will involve creating and maintaining database standards and policies, managing database availability and performance, defining and implementing event triggers for performance or integrity issues, and carrying out database housekeeping tasks. Monitoring usage, transaction volumes, response times, and concurrency levels will also be among your responsibilities. If you find this opportunity compelling, please send your updated resume to hema.g@s3staff.com.
Posted 2 weeks ago
8.0 - 13.0 years
10 - 20 Lacs
Hyderabad, Chennai, Bengaluru
Work from Office
Roles and Responsibilities:
- Lead Agile project management for data product initiatives, ensuring timely delivery and alignment with strategic business objectives.
- Collaborate with stakeholders across the organization to identify business needs and translate them into clear data product requirements and user stories.
- Facilitate Agile ceremonies (daily stand-ups, sprint planning, retrospectives) to maintain team focus and momentum.
- Manage and prioritize the product backlog in coordination with product owners and data experts to maximize value delivery.
- Ensure data quality, governance, and compliance standards are met throughout the product lifecycle.
- Foster cross-functional collaboration among data engineers, data scientists, analysts, and business teams to resolve impediments and steer delivery.
- Develop and maintain product roadmaps that reflect evolving business priorities and data capabilities.
- Track project progress using Agile metrics and provide transparent communication to stakeholders.
- Support continuous improvement by coaching the team on Agile best practices and adapting processes as needed.
- Define and track KPIs that transparently reflect the status of key initiatives.
- Direct a team of 5 or more people, comprising leads, principals, etc., and indirectly coordinate with more people as part of cross-functional teams with varied roles and functions.

Required Skills and Experience:
- Bachelor's degree or above, preferably in Software Engineering.
- Strong understanding of Agile frameworks such as Scrum or Kanban, and experience facilitating Agile teams; should have experience leading all Agile ceremonies.
- Knowledge of data product management principles, including requirements definition, data quality, and governance.
- Excellent communication and stakeholder management skills to bridge technical and business perspectives.
- Strong business communication, presentation, and conflict management skills.
- Experience working with data professionals (data engineers, data scientists, data quality engineers) and understanding data pipelines.
- Proficiency with Agile project management tools like ADO, Jira, or equivalent.
- Ability to manage competing priorities and adapt plans based on feedback and changing requirements.
- Proficient in delivery and quality metrics, burn-down charts, and progress and status reporting.
- Knowledge of 2 or more effort and cost estimation methodologies/frameworks.
- Proficient in scope (requirements)/backlog management, quality management, defect prevention, and risks and issues management.

Nice to Have Qualities & Skills:
- Flexibility to learn and apply new methodologies.
- Mortgage industry experience/knowledge.
- Strong commercial acumen, i.e., understanding of pricing models, delivery P&L, budgeting, etc.
- Basic-level understanding of contracts.
- Relevant certifications such as Certified Scrum Master (CSM), PMP, Agile Project Management, etc.
- Knowledge of compliance frameworks like RESPA, TILA, CFPB, and data security standards.
- Knowledge of Azure Cloud.
Posted 2 weeks ago
3.0 - 7.0 years
3 - 8 Lacs
Chennai
Work from Office
Job Title: Senior Programmer - AI & Data Engineering
Location: Work from Office
Experience Required: 3+ Years
Job Type: Full-Time
Department: Technology / Engineering

Job Summary: We are seeking a highly skilled and motivated Senior Programmer with a strong background in AI development, Python programming, and data engineering. The ideal candidate will have hands-on experience with OpenAI models, Machine Learning, Prompt Engineering, and frameworks such as NLTK, Pandas, and NumPy. You will work on developing intelligent systems, integrating APIs, and deploying scalable solutions using modern data and cloud technologies.

Key Responsibilities:
- Design, develop, and optimize intelligent applications using OpenAI APIs and machine learning models.
- Create and refine prompts (Prompt Engineering) to extract desired outputs from LLMs (Large Language Models).
- Build and maintain scalable, reusable, and secure REST APIs for AI and data applications.
- Work with large datasets using Pandas, NumPy, and SQL, and integrate text analytics using NLTK.
- Collaborate with cross-functional teams to understand requirements and translate them into technical solutions.
- Use the Function Framework to encapsulate business logic and automate workflows.
- Apply basic knowledge of cloud platforms (AWS, Azure, or GCP) for deployment and scaling.
- Assist in data integration, processing, and transformation for Big Data systems.
- Write clean, maintainable, and efficient Python code.
- Conduct code reviews, mentor junior developers, and lead small projects as needed.

Required Skills & Qualifications:
- Minimum 3 years of experience in Python development with a strong focus on AI and ML.
- Proven expertise in OpenAI tools and APIs.
- Hands-on experience with Machine Learning models and Prompt Engineering techniques.
- Solid programming skills in Python, along with libraries like Pandas, NumPy, and NLTK.
- Experience developing and integrating REST APIs.
- Working knowledge of SQL and relational database systems.
- Familiarity with function frameworks and modular design patterns.
- Basic understanding of cloud platforms (AWS/GCP/Azure) and Big Data concepts.
- Strong problem-solving skills and the ability to work in a fast-paced environment.
- Excellent communication and collaboration skills.

Preferred Qualifications:
- Exposure to Docker, Kubernetes, or similar container orchestration tools.
- Understanding of MLOps, data pipelines, or cloud-based AI deployments.
- Experience with version control systems like Git and CI/CD pipelines.
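The text-analytics responsibility (NLTK, Pandas) typically begins with tokenization and term counting. Below is a dependency-free sketch of that first step — NLTK's tokenizers and a real stopword list would replace the toy stand-ins here:

```python
import re
from collections import Counter

# Toy stopword list; NLTK ships a much richer one per language.
STOPWORDS = {"the", "a", "an", "and", "of", "to", "is"}

def tokenize(text):
    """Lowercase, split on non-letters, drop stopwords — the classic
    first step of a text-analytics pipeline."""
    words = re.findall(r"[a-z]+", text.lower())
    return [w for w in words if w not in STOPWORDS]

def top_terms(docs, n=3):
    """Most common tokens across a corpus of documents."""
    counts = Counter(tok for doc in docs for tok in tokenize(doc))
    return counts.most_common(n)

docs = ["The model answers the prompt", "Prompt design shapes the model"]
terms = top_terms(docs, n=2)
```

At scale the same counting would be vectorized with Pandas or pushed into a Big Data engine, but the logic is unchanged.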
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
karnataka
On-site
Salesforce has immediate opportunities for software developers who want their lines of code to have significant and measurable positive impact for users, the company's bottom line, and the industry. You will be working with a group of world-class engineers to build the breakthrough features our customers will love, adopt, and use while keeping our trusted CRM platform stable and scalable. The software engineer role at Salesforce encompasses architecture, design, implementation, and testing to ensure we build products right and release them with high quality. We pride ourselves on writing high quality, maintainable code that strengthens the stability of the product and makes our lives easier. We embrace the hybrid model and celebrate the individual strengths of each team member while cultivating everyone on the team to grow into the best version of themselves. We believe that autonomous teams with the freedom to make decisions will empower the individuals, the product, the company, and the customers they serve to thrive. As a Senior Backend Software Engineer, your job responsibilities will include: - Build new and exciting components in an ever-growing and evolving market technology to provide scale and efficiency. - Develop high-quality, production-ready code that millions of users of our cloud platform can use. - Design, implement, and tune robust APIs and API framework-related features that perform and scale in a multi-tenant environment. - Work in a Hybrid Engineering model and contribute to all phases of SDLC including design, implementation, code reviews, automation, and testing of the features. - Build efficient components/algorithms on a microservice multi-tenant SaaS cloud environment. - Code review, mentoring junior engineers, and providing technical guidance to the team (depending on the seniority level). Required Skills: - Mastery of multiple programming languages and platforms. 
- 5+ years of backend software development experience, including designing and developing distributed systems at scale.
- Deep knowledge of object-oriented programming and other scripting languages: Java, Python, Scala, C#, Go, Node.js, and C++.
- Strong PostgreSQL/SQL skills and experience with relational and non-relational databases, including writing queries.
- A deep understanding of software development best practices and demonstrated leadership skills.
- Degree or equivalent relevant experience required. Experience will be evaluated based on the core competencies for the role (e.g. extracurricular leadership roles, military experience, volunteer roles, work experience, etc.).

Preferred Skills:
- Experience with developing SaaS products over public cloud infrastructure - AWS/Azure/GCP.
- Experience with Big Data/ML and S3.
- Hands-on experience with streaming technologies like Kafka.
- Experience with Elastic Search.
- Experience with Terraform, Kubernetes, Docker.
- Experience working in a fast-paced and rapidly growing multinational organization.

Benefits & Perks:
- Comprehensive benefits package including well-being reimbursement, generous parental leave, adoption assistance, fertility benefits, and more!
- World-class enablement and on-demand training with Trailhead.com.
- Exposure to executive thought leaders and regular 1:1 coaching with leadership.
- Volunteer opportunities and participation in our 1:1:1 model for giving back to the community.

For more details, visit https://www.salesforcebenefits.com/
Posted 2 weeks ago
7.0 - 11.0 years
0 Lacs
chennai, tamil nadu
On-site
As a PySpark Data Reconciliation Engineer, you should have at least 7 years of relevant experience in technology and development. Your technical skillset should include proficiency in Java or Python, along with hands-on experience in Big Data, Hadoop, Spark, and Kafka. Familiarity with APIs and microservices architecture is essential, and UI development and integration experience would be a strong advantage. Your domain expertise should lie in the capital markets domain, with a preference for experience in regulatory reporting or reconciliations in a technology context. A proven track record of successfully delivering large-scale projects with globally distributed teams is crucial. Application development skills, knowledge of design paradigms, and any previous experience in the data domain would be beneficial for this role. Stakeholder management and the ability to lead a global technology team are key requirements for this position, and you should consistently demonstrate clear and concise written and verbal communication skills. If you meet these qualifications and are ready to take on challenging projects in a dynamic environment, we encourage you to apply for this opportunity.
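At its core, a reconciliation job of the kind this role describes is a keyed comparison of two record sets: find records missing on either side and flag field-level breaks. A minimal pure-Python sketch of that logic follows; in a real PySpark job the same shape is typically expressed as a full outer join on the key, and all names here are illustrative, not a specific employer's implementation.

```python
def reconcile(source, target, key, fields):
    """Compare two record sets keyed by `key` and report breaks.

    source/target: lists of dicts; fields: value columns to compare.
    Returns (keys missing in target, keys missing in source,
    {key: {field: (source_value, target_value)}} for mismatches).
    """
    src = {r[key]: r for r in source}
    tgt = {r[key]: r for r in target}
    missing_in_target = sorted(src.keys() - tgt.keys())
    missing_in_source = sorted(tgt.keys() - src.keys())
    # For keys present on both sides, compare each field pairwise.
    mismatches = {
        k: {f: (src[k][f], tgt[k][f]) for f in fields if src[k][f] != tgt[k][f]}
        for k in src.keys() & tgt.keys()
        if any(src[k][f] != tgt[k][f] for f in fields)
    }
    return missing_in_target, missing_in_source, mismatches
```

In PySpark the equivalent is a full outer join of the two DataFrames on the key column followed by column-wise comparison, which distributes the same logic across the cluster for the large volumes such roles mention.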
Posted 2 weeks ago
6.0 - 8.0 years
7 - 11 Lacs
Bengaluru
Work from Office
Job Title: Big Data Developer
Location State: Karnataka
Location City: Bangalore
Experience Required: 6 to 8 Year(s)
CTC Range: 7 to 11 LPA
Shift: Day Shift
Work Mode: Onsite
Position Type: C2H
Openings: 2
Company Name: VARITE INDIA PRIVATE LIMITED
About The Client: The client is an Indian multinational technology company specializing in information technology services and consulting. Headquartered in Mumbai, it is part of the Tata Group and operates in 150 locations across 46 countries.
About The Job: Big Data and Hadoop ecosystems.
Essential Job Functions: Big Data and Hadoop ecosystems.
Qualifications: Big Data and Hadoop ecosystems.
How to Apply: Interested candidates are invited to submit their resume using the apply online button on this job post.
About VARITE: VARITE is a global staffing and IT consulting company providing technical consulting and team augmentation services to Fortune 500 companies in the USA, UK, Canada, and India. VARITE is currently a primary and direct vendor to leading corporations in the verticals of Networking, Cloud Infrastructure, Hardware and Software, Digital Marketing and Media Solutions, Clinical Diagnostics, Utilities, Gaming and Entertainment, and Financial Services.
Equal Opportunity Employer: VARITE is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees. We do not discriminate on the basis of race, color, religion, sex, sexual orientation, gender identity or expression, national origin, age, marital status, veteran status, or disability status.
Unlock Rewards: Refer Candidates and Earn. If you're not available or interested in this opportunity, please pass it along to anyone in your network who might be a good fit for our open positions. VARITE offers a Candidate Referral program, where you'll receive a one-time referral bonus on the following scale if the referred candidate completes a three-month assignment with VARITE:
- 0 - 2 Yrs: INR 5,000
- 2 - 6 Yrs: INR 7,500
- 6+ Yrs: INR 10,000
Posted 2 weeks ago
5.0 - 8.0 years
10 - 19 Lacs
Pune, Chennai, Bengaluru
Hybrid
Hello Connections, Exciting Opportunity Alert!! We're on the hunt for passionate individuals to join our dynamic team as Data Engineers.
Job Profile: Data Engineer
Experience: Minimum 5 to maximum 8 years
Location: Chennai / Hyderabad / Bangalore / Pune / Mumbai
Mandatory Skills: Big Data | Hadoop | Scala | Spark | Spark SQL | Hive
Qualification: B.TECH / B.E / MCA / Computer Science background, any specialization
How to Apply? Send your CV to: sipriyar@sightspectrum.in
Contact Number: 6383476138
Don't miss out on this amazing opportunity to accelerate your professional career!
#bigdata #dataengineer #hadoop #spark #python #hive #pyspark
Posted 3 weeks ago
6.0 - 10.0 years
0 Lacs
hyderabad, telangana
On-site
Role Description
Salesforce has immediate opportunities for software developers who want their lines of code to have a significant and measurable positive impact for users, the company's bottom line, and the industry. You will be working with a group of world-class engineers to build the breakthrough features our customers will love, adopt, and use while keeping our trusted CRM platform stable and scalable. The software engineer role at Salesforce encompasses architecture, design, implementation, and testing to ensure we build products right and release them with high quality. We pride ourselves on writing high-quality, maintainable code that strengthens the stability of the product and makes our lives easier. We embrace the hybrid model and celebrate the individual strengths of each team member while cultivating everyone on the team to grow into the best version of themselves. We believe that autonomous teams with the freedom to make decisions will empower the individuals, the product, the company, and the customers they serve to thrive.
Your Impact
As a Senior Backend Software Engineer, your job responsibilities will include:
- Build new and exciting components in an ever-growing and evolving market technology to provide scale and efficiency.
- Develop high-quality, production-ready code that millions of users of our cloud platform can use.
- Design, implement, and tune robust APIs and API framework-related features that perform and scale in a multi-tenant environment.
- Work in a Hybrid Engineering model and contribute to all phases of the SDLC, including design, implementation, code reviews, automation, and testing of features.
- Build efficient components/algorithms in a microservice, multi-tenant SaaS cloud environment.
- Review code, mentor junior engineers, and provide technical guidance to the team (depending on seniority level).
Required Skills:
- Mastery of multiple programming languages and platforms.
- 6+ years of backend software development experience, including designing and developing distributed systems at scale.
- Deep knowledge of object-oriented programming and other languages: Java, Python, Scala, C#, Go, Node.js, and C++.
- Strong PostgreSQL/SQL skills and experience with relational and non-relational databases, including writing queries.
- A deep understanding of software development best practices and demonstrated leadership skills.
- Degree or equivalent relevant experience required. Experience will be evaluated based on the core competencies for the role (e.g. extracurricular leadership roles, military experience, volunteer roles, work experience, etc.).
Preferred Skills:
- Experience developing SaaS products on public cloud infrastructure (AWS/Azure/GCP).
- Experience with Big Data/ML and S3.
- Hands-on experience with streaming technologies such as Kafka.
- Experience with Elasticsearch.
- Experience with Terraform, Kubernetes, and Docker.
- Experience working in a high-paced, rapidly growing multinational organization.
BENEFITS & PERKS
- Comprehensive benefits package including well-being reimbursement, generous parental leave, adoption assistance, fertility benefits, and more!
- World-class enablement and on-demand training with Trailhead.com.
- Exposure to executive thought leaders and regular 1:1 coaching with leadership.
- Volunteer opportunities and participation in our 1:1:1 model for giving back to the community.
For more details, visit https://www.salesforcebenefits.com/
Posted 3 weeks ago
12.0 - 16.0 years
0 Lacs
chennai, tamil nadu
On-site
The Applications Development Technology Lead Analyst is a senior position responsible for implementing new or revised application systems and programs in coordination with the Technology team. The primary objective is to lead applications systems analysis and programming activities. You will partner with Business and various technology teams within Data Fabric to ensure seamless integration of functions, and identify the system enhancements necessary for deploying new products, regulatory requirements, and process improvements. You will also be responsible for resolving high-impact problems/projects, bringing hands-on technical experience in Java, Big Data, and Oracle, strong SQL analytical skills, and ownership of standards for coding, testing, debugging, and implementation. As a Technology Lead Analyst, you will develop a comprehensive understanding of business areas and regulatory reporting to achieve firmwide goals, while also mentoring mid-level developers and analysts. You will need to assess risks appropriately in business decisions, ensuring compliance with laws and regulations, and safeguarding Citigroup, its clients, and assets.
Qualifications:
- 12-15 years of relevant experience in an applications development or systems analysis role
- Experience in regulatory projects and successful project management
- Subject Matter Expert (SME) in at least one area of applications development (Java or Oracle)
- Ability to adjust priorities quickly and strong leadership skills
- Clear and concise written and verbal communication
Mandatory Skills:
- Core Java, multithreading, collections framework/data structures, exception handling, OOP, design patterns
- Intermediate application design
- Performance measurement and tuning
- Spring Core (context, transactions, ORM, AOP), Spring Boot, cloud concepts
- Regulatory Reporting domain experience
Added advantage skills:
- Big Data (Hadoop, Hive, Scala, etc.)
- Oracle SQL
Education:
- Bachelor's degree/University degree or equivalent experience
- Master's degree preferred
This job description offers an overview of the work performed; other duties may be assigned as required. Citi is an equal opportunity and affirmative action employer, encouraging all qualified applicants to apply for career opportunities. If you require a reasonable accommodation due to a disability, review Accessibility at Citi.
Posted 3 weeks ago
10.0 - 20.0 years
20 - 35 Lacs
Pune
Hybrid
Hi, wishes from GSN!!! Pleasure connecting with you!!!
We have been in corporate search services for the last 20 years, identifying and placing stellar, talented professionals with our reputed IT and non-IT clients in India, and successfully delivering on our clients' varied needs. At present, this requirement is with one of our leading MNC clients. PFB the details for your better understanding:
WORK LOCATION: PUNE
Job Role: Big Data Solution Architect
EXPERIENCE: 10 Yrs - 20 Yrs
CTC Range: 25 LPA - 35 LPA
Work Type: Hybrid
Required Skills & Experience:
- 10+ years of progressive experience in software development, data engineering, and solution architecture roles, with a strong focus on large-scale distributed systems.
- Expertise in Big Data technologies:
  - Apache Spark: Deep expertise in Spark architecture, Spark SQL, Spark Streaming, performance tuning, and optimization techniques. Experience with data processing paradigms (batch and real-time).
  - Hadoop ecosystem: Strong understanding of HDFS, YARN, Hive, and other related Hadoop components.
- Real-time data streaming: Apache Kafka: Expert-level knowledge of Kafka architecture, topics, partitions, producers, consumers, Kafka Streams, KSQL, and best practices for high-throughput, low-latency data pipelines.
- NoSQL databases: Couchbase: In-depth experience with Couchbase (or similar document/key-value NoSQL databases like MongoDB or Cassandra), including data modeling, indexing, querying (N1QL), replication, scaling, and operational best practices.
- API design & development: Extensive experience designing and implementing robust, scalable, and secure APIs (RESTful, GraphQL) for data access and integration.
- Programming & code review: Hands-on coding proficiency in at least one relevant language (Python, Scala, Java), with a preference for Python and/or Scala for data engineering tasks. Proven experience leading and performing code reviews, ensuring code quality, performance, and adherence to architectural guidelines.
- Cloud platforms: Extensive experience designing and implementing solutions on at least one major cloud platform (AWS, Azure, GCP), leveraging their Big Data, streaming, and compute services.
- Database fundamentals: Solid understanding of relational database concepts, SQL, and data warehousing principles.
- System design & architecture patterns: Deep knowledge of various architectural patterns (e.g., Microservices, Event-Driven Architecture, Lambda/Kappa Architecture, Data Mesh) and their application in data solutions.
- DevOps & CI/CD: Familiarity with DevOps principles, CI/CD pipelines, infrastructure as code (IaC), and automated deployment strategies for data platforms.
If interested, kindly APPLY for an IMMEDIATE response.
Thanks & Rgds,
SHOBANA
GSN | Mob: 8939666294 (WhatsApp) | Email: Shobana@gsnhr.net | Web: www.gsnhr.net
Google Reviews: https://g.co/kgs/UAsF9W
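One Kafka fundamental the skill list above calls out, how keyed messages map to topic partitions, can be shown in a short sketch. This is an illustration only: Kafka's default partitioner actually applies a murmur2 hash to the serialized key bytes, and the md5-based hash below is just a deterministic stand-in.

```python
import hashlib


def assign_partition(key: str, num_partitions: int) -> int:
    """Map a message key to a partition index, as a producer's partitioner does.

    Kafka's default partitioner hashes the key bytes (murmur2) modulo the
    partition count; md5 here is only an illustrative, deterministic stand-in.
    """
    digest = hashlib.md5(key.encode("utf-8")).digest()
    return int.from_bytes(digest[:4], "big") % num_partitions


# Because the mapping is deterministic, all messages sharing a key land on the
# same partition, which is what preserves per-key ordering for consumers.
```

This key-to-partition determinism is the reason adding partitions to an existing topic reshuffles key placement, one of the operational trade-offs an architect in this role weighs when sizing topics.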
Posted 3 weeks ago
5.0 - 8.0 years
0 - 1 Lacs
Pune, Chennai
Hybrid
Hello Connections, Exciting Opportunity Alert!! We're on the hunt for passionate individuals to join our dynamic team as Data Engineers.
Job Profile: Data Engineer
Experience: Minimum 5 to maximum 8 years
Location: Chennai / Pune
Mandatory Skills: Big Data | Hadoop | PySpark | Spark | Spark SQL | Hive
Qualification: B.TECH / B.E / MCA / Computer Science background, any specialization
How to Apply? Send your CV to: sipriyar@sightspectrum.in
Contact Number: 6383476138
Don't miss out on this amazing opportunity to accelerate your professional career!
#bigdata #dataengineer #hadoop #spark #python #hive #pyspark
Posted 3 weeks ago
8.0 - 12.0 years
15 - 25 Lacs
Hyderabad, Pune, Bengaluru
Work from Office
Warm welcome from SP Staffing Services! Reaching out to you regarding a permanent opportunity!
Job Description:
Exp: 8-12 yrs
Location: Chennai/Hyderabad/Bangalore/Pune/Bhubaneshwar/Kochi
Skill: Senior Data Modellers
Interested candidates can share their resume with sangeetha.spstaffing@gmail.com, including the below details inline:
Full Name as per PAN:
Mobile No:
Alt No / WhatsApp No:
Total Exp:
Relevant Exp in Data Modelling:
Rel Exp in Data Warehousing:
Rel Exp in AWS:
Current CTC:
Expected CTC:
Notice Period (Official):
Notice Period (Negotiable)/Reason:
Date of Birth:
PAN number:
Reason for Job Change:
Offer in Pipeline (Current Status):
Availability for virtual interview on weekdays between 10 AM - 4 PM (please mention time):
Current Res Location:
Preferred Job Location:
Whether educational % in 10th std, 12th std, UG is all above 50%?
Do you have any gaps in between your education or career? If so, please mention the duration in months/years:
Posted 4 weeks ago
8.0 - 12.0 years
15 - 25 Lacs
Hyderabad, Pune, Bengaluru
Work from Office
Warm welcome from SP Staffing Services! Reaching out to you regarding a permanent opportunity!
Job Description:
Exp: 8-12 yrs
Location: Chennai/Hyderabad/Bangalore/Pune/Bhubaneshwar/Kochi
Skill: PySpark/AWS Glue
- Implementing data ingestion pipelines from different types of data sources, i.e., databases, S3, files, etc.
- Experience in building ETL / data warehouse transformation processes.
- Developing Big Data and non-Big Data cloud-based enterprise solutions in PySpark and SparkSQL and related frameworks/libraries.
- Developing scalable, re-usable, self-service frameworks for data ingestion and processing.
- Integrating end-to-end data pipelines that take data from source to target repositories, ensuring the quality and consistency of data.
- Processing performance analysis and optimization.
- Bringing best practices in the following areas: design & analysis, automation (pipelining, IaC), testing, monitoring, documentation.
- Experience working with structured and unstructured data.
Good to have (knowledge):
1. Experience in cloud-based solutions.
2. Knowledge of data management principles.
Interested candidates can share their resume with sangeetha.spstaffing@gmail.com, including the below details inline:
Full Name as per PAN:
Mobile No:
Alt No / WhatsApp No:
Total Exp:
Relevant Exp in PySpark:
Rel Exp in Python:
Rel Exp in AWS Glue:
Current CTC:
Expected CTC:
Notice Period (Official):
Notice Period (Negotiable)/Reason:
Date of Birth:
PAN number:
Reason for Job Change:
Offer in Pipeline (Current Status):
Availability for virtual interview on weekdays between 10 AM - 4 PM (please mention time):
Current Res Location:
Preferred Job Location:
Whether educational % in 10th std, 12th std, UG is all above 50%?
Do you have any gaps in between your education or career? If so, please mention the duration in months/years:
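The ingestion-and-transformation responsibilities described in this posting follow the familiar extract, transform, load shape. A minimal stdlib sketch of that shape is below; a real Glue/PySpark job would instead read from S3 or JDBC sources into DataFrames and write Parquet to a warehouse, and every function and field name here is illustrative.

```python
import csv
import io


def extract(csv_text):
    """Extract: parse raw CSV rows into dicts (stand-in for reading S3/JDBC)."""
    return list(csv.DictReader(io.StringIO(csv_text)))


def transform(rows):
    """Transform: cast types, drop malformed records, dedupe on the key."""
    seen, out = set(), []
    for r in rows:
        try:
            rec = {"id": int(r["id"]), "amount": float(r["amount"])}
        except (KeyError, ValueError):
            continue  # a production pipeline would quarantine these rows
        if rec["id"] not in seen:
            seen.add(rec["id"])
            out.append(rec)
    return out


def load(rows, target):
    """Load: append to the target store (stand-in for a warehouse write)."""
    target.extend(rows)
    return len(rows)
```

The same three stages map directly onto a PySpark job: `extract` becomes `spark.read`, `transform` becomes DataFrame operations (casts, filters, `dropDuplicates`), and `load` becomes `df.write`, with data-quality checks applied between transform and load as the posting's "quality and consistency" point suggests.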
Posted 4 weeks ago