3.0 - 7.0 years
0 Lacs
pune, maharashtra
On-site
As an ETL Testing & Big Data professional, you will be responsible for designing and implementing ETL test strategies based on business requirements. Your role involves reviewing and analyzing ETL source code and developing and executing test plans and test cases for ETL processes. Data validation and reconciliation using SQL queries is a key part of the job, as are monitoring ETL jobs, resolving issues that affect data accuracy, and performance-testing ETL processes with a focus on optimization. You will ensure data quality and integrity across data sources and coordinate with development teams to troubleshoot issues and suggest improvements.

You will also use automation tools to make testing more efficient, run regression tests after ETL releases or updates, document test results, issues, and proposed resolutions, and support business users with data-related queries. Staying current with trends in ETL testing and big data technologies, working closely with data architects on effective data modeling, and contributing to technical discussions and knowledge sharing round out the role.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 3+ years of experience in ETL testing and big data environments.
- Strong proficiency in SQL and data modeling techniques.
- Hands-on experience with the Hadoop ecosystem and related tools.
- Familiarity with ETL tools such as Informatica, Talend, or similar.
- Experience with data quality frameworks and methodologies.
- Knowledge of big data technologies such as Spark, Hive, or Pig.
- Excellent analytical and problem-solving skills.
- Strong communication skills for effective collaboration.
- Ability to manage multiple tasks and meet deadlines.
- Experience in Java or scripting languages is a plus.
- Strong attention to detail and a commitment to delivering quality work.
- Certifications in data management or testing are a plus.
- Ability to work independently and as part of a team.
- Willingness to adapt to evolving technologies and methodologies.

Skills required: scripting languages, data modeling, data quality frameworks, Hive, Talend, analytical skills, SQL, performance testing, automation tools, Pig, Hadoop ecosystem, ETL testing, Informatica, Hadoop, data quality, big data, Java, regression testing, Spark.
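A minimal sketch of the kind of SQL-based validation and reconciliation described above, assuming hypothetical source and target tables (staging.orders_src, dwh.orders_fact) and an order_id/amount schema:

    # Reconcile row counts and key-level amounts between source and target.
    # Table and column names are illustrative assumptions, not an actual schema.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("etl-reconciliation").getOrCreate()

    src_count = spark.table("staging.orders_src").count()
    tgt_count = spark.table("dwh.orders_fact").count()

    # Rows missing on either side, or with amount drift beyond a small tolerance.
    mismatches = spark.sql("""
        SELECT COALESCE(s.order_id, t.order_id) AS order_id,
               s.amount AS src_amount,
               t.amount AS tgt_amount
        FROM staging.orders_src s
        FULL OUTER JOIN dwh.orders_fact t
          ON s.order_id = t.order_id
        WHERE s.order_id IS NULL
           OR t.order_id IS NULL
           OR ABS(s.amount - t.amount) > 0.01
    """)

    print(f"source={src_count}, target={tgt_count}, mismatched rows={mismatches.count()}")

A non-zero mismatch count would typically be logged as a defect and traced back to the ETL job that produced it.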
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
karnataka
On-site
As a Data Engineering Senior Specialist (Databricks) at Nasdaq Bangalore, you will join the Bangalore technology center in India, where innovation and effectiveness are the driving forces. Nasdaq is at the forefront of revolutionizing markets and constantly adopts new technologies to create innovative solutions, aiming to shape the future.

In this role, your primary responsibility will be to analyze defined business requirements and provide analytical insights, modeling, dimensional modeling, and testing to design solutions that effectively meet customer needs. You will focus on understanding business data needs and translating them into adaptable, extensible, and sustainable data structures. As a Databricks Data Engineer, you will design, build, and maintain data pipelines on the Databricks Lakehouse Platform, enabling efficient data processing, analysis, and reporting for data-driven initiatives. You will implement ETL tasks using Apache Spark SQL and Python, develop ETL pipelines following the Medallion Architecture, add new sources to the Lakehouse platform, review technology platforms on the AWS cloud, supervise data extraction methods, resolve technical issues, and ensure project delivery within the assigned timeline and budget. You will also lead administrative tasks, ensuring completeness and accuracy in administration processes.

To excel in this role, you are expected to have 8-10 years of overall experience, with at least 5-6 years of specific Data Engineering experience on Databricks. Proficiency in SQL and Python for data manipulation, knowledge of modern data technologies and cloud computing platforms such as AWS, data modeling, architecture, and best practices, and familiarity with AI/ML Ops in Databricks are essential. A Bachelor's or Master's degree in a relevant field, or an equivalent qualification, is required. Knowledge of Terraform and certifications in relevant fields are advantageous.

Nasdaq offers a vibrant and entrepreneurial work environment where taking initiative, challenging the status quo, and embracing intelligent risks are encouraged. The company values diversity, inclusivity, and work-life balance in a hybrid-first environment. Benefits include an annual monetary bonus, the opportunity to become a Nasdaq shareholder, health insurance, flexible working schedules, internal mentorship programs, and a wide selection of online learning resources.

If you believe you possess the required skills and experience for this role, submit your application in English as soon as possible; the selection process is ongoing, and we aim to get back to you within 2-3 weeks. At Nasdaq, we are committed to providing reasonable accommodations to individuals with disabilities throughout the job application and interview process, ensuring equal access to employment opportunities. If you require an accommodation, please reach out to us to discuss your needs.
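A minimal sketch of the Medallion-style ETL step mentioned above, written in PySpark on Delta tables; the landing path, schemas (bronze, silver), and column names are assumptions for illustration, not Nasdaq's actual pipeline:

    # Bronze: land raw files as-is, adding ingestion metadata.
    # Silver: cleanse, type-cast, and de-duplicate for downstream consumers.
    # Assumes the bronze and silver schemas already exist.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.getOrCreate()

    raw = (spark.read.json("s3://example-bucket/landing/trades/")   # assumed landing path
           .withColumn("_ingested_at", F.current_timestamp()))
    raw.write.format("delta").mode("append").saveAsTable("bronze.trades_raw")

    silver = (spark.table("bronze.trades_raw")
              .filter(F.col("trade_id").isNotNull())
              .withColumn("trade_ts", F.to_timestamp("trade_ts"))
              .dropDuplicates(["trade_id"]))
    silver.write.format("delta").mode("overwrite").saveAsTable("silver.trades")

The same pattern extends to the gold layer, where silver tables are aggregated into reporting-ready datasets.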
Posted 1 week ago
4.0 - 8.0 years
0 Lacs
karnataka
On-site
As a Data Engineer, your primary responsibility will be to build and manage the systems that collect, store, clean, and deliver data to teams across the company, such as analysts, data scientists, and business users. Working with large datasets, you will focus on improving data quality, optimizing performance, and ensuring others can access the data they need reliably and securely.

Your key responsibilities include building and maintaining data pipelines that pull data from diverse sources, process it, and store it for easy access; assembling large, complex datasets that meet both business and technical requirements; automating and optimizing processes to improve efficiency; and creating tools that help data teams uncover valuable insights such as customer behavior and performance metrics. You will collaborate with product, design, and executive teams to address data-related challenges while ensuring the security and protection of sensitive data, support data scientists by developing tools and systems that streamline their work, and continuously look for better ways to use data across the business.

To excel in this role, you should have expertise in data architecture and modeling to design scalable systems that are easy to analyze, proficiency in building and managing data pipelines, and a focus on keeping data clean and usable. Strong collaboration skills, the ability to document processes clearly, and basic knowledge of machine learning concepts are also important. On the technical side, you should be proficient in Python for writing clean, reusable code, Spark (PySpark and Spark SQL), and SQL for querying and manipulating data, and familiar with Git, GitHub, and JIRA for version control and task tracking. Effective communication skills are necessary to convey data concepts to non-technical audiences.

Experience with Azure cloud tools such as Databricks or Synapse Analytics, familiarity with Azure data management tools such as Azure SQL Data Warehouse or Cosmos DB, and hands-on experience deploying machine learning models in real-world applications are considered advantageous for this role.
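As an illustration of the pipeline work described above, the hedged sketch below combines PySpark and Spark SQL to cleanse and de-duplicate a raw dataset; the paths and columns (user_id, event_time, event_type) are assumed for the example:

    # Read raw data, clean it with a Spark SQL statement, de-duplicate, and write curated output.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()

    events = spark.read.parquet("/data/raw/events/")     # assumed raw input path
    events.createOrReplaceTempView("raw_events")

    clean = spark.sql("""
        SELECT
            user_id,
            CAST(event_time AS TIMESTAMP)  AS event_time,
            LOWER(TRIM(event_type))        AS event_type
        FROM raw_events
        WHERE user_id IS NOT NULL
    """).dropDuplicates(["user_id", "event_time", "event_type"])

    clean.write.mode("overwrite").parquet("/data/curated/events/")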
Posted 1 week ago
0 years
0 Lacs
Gurgaon, Haryana, India
On-site
Role Overview: Capital Consultants are responsible for managing deal flow for transactions under ₹30 crore annual revenue, with selective deals ranging from ₹30 crore to ₹80 crore annual revenue. They act as intermediaries between companies and lenders, ensuring a smooth and efficient funding process by managing deal details, lender communications, and documentation. The CapC role emphasizes client relationship management, deal curation, deal structuring, process and product hygiene, and portfolio monitoring.

Key Responsibilities:

Deal Requirement Assessment:
- Understand customers' requirements, including funding quantum, structure, pricing, security, and use case.
- Collaborate with customers to gather relevant information and align the deal structure to lender preferences.

Engagement and Documentation:
- Sign engagement letters (ELs) as per the deal.
- Vet and validate one-pagers prepared by Financial Analysts on Product to ensure accuracy and completeness.

Data and Deal Management:
- Collect necessary deal data and follow up on missing information as required.
- Allocate deals into the appropriate funnels (Spark, Swift, Scale) based on size and complexity.

Lender Coordination and Communication on Deals:
- Pitch deals to activated lenders from the product and DCM teams and actively follow up to ensure deal closure with lenders.
- Liaise with lenders on all documentation requirements; escalate to the Pod Owner or DCM if needed.

Portfolio Management and Repayments:
- Focus on collection activities, particularly Early Warning Signals (EWS), to keep Days Past Due (DPD) low.
- Maintain strong relationships with portfolio companies to capitalize on re-deployment opportunities and enhance customer experience.

Platform and Process Management:
- Maintain accurate deal information in HubSpot to ensure deal hygiene.
- Resolve support-related queries in collaboration with the relevant teams.
- Select appropriate lenders (with support from the product team) and route deals effectively.
- Drive training and adoption of the platform.

Invoicing and Realisation:
- Oversee invoicing and realisation processes in partnership with Finance/Ops, acting as an escalation point.

Share your resume at bhumika.bisht@recur.club
Posted 1 week ago
6.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Description
Senior Data Engineer – Databricks (Azure/AWS)

Role Overview:
We are looking for a hands-on Senior Data Engineer experienced in migrating and building large-scale data pipelines on Databricks using Azure or AWS platforms. The role will focus on implementing batch and streaming pipelines, applying the bronze-silver-gold data lakehouse model, and ensuring scalable and reliable data solutions.

Required Skills and Experience:
- 6+ years of hands-on data engineering experience, with 2+ years specifically working on Databricks in Azure or AWS.
- Proficiency in building and optimizing Spark pipelines (batch and streaming).
- Strong experience implementing bronze/silver/gold data models.
- Working knowledge of cloud storage systems (ADLS, S3) and compute services.
- Experience migrating data from RDBMS (Oracle, SQL Server) or Hadoop ecosystems.
- Familiarity with Airflow, Azure Data Factory, or AWS Glue for orchestration.
- Good scripting skills (Python, Scala, SQL) and version control (Git).

Preferred Qualifications:
- Databricks Certified Data Engineer Associate or Professional.
- Experience with Delta Live Tables (DLT) and Databricks SQL.
- Understanding of cloud security best practices (IAM roles, encryption, ACLs).

Key Responsibilities:
- Design, develop, and operationalize scalable data pipelines on Databricks following medallion architecture principles.
- Migrate and transform large data volumes from traditional on-prem systems (Oracle, Hadoop, Exadata) into cloud data platforms.
- Develop efficient Spark (PySpark/Scala) jobs for ingestion, transformation, aggregation, and publishing of data.
- Implement data quality checks, error handling, retries, and data validation frameworks.
- Build automation scripts and CI/CD pipelines for Databricks workflows and deployment.
- Tune Spark jobs and optimize cost and performance in cloud environments.
- Collaborate with data architects, product owners, and analytics teams.

Attributes for Success:
- Strong analytical and problem-solving skills.
- Attention to scalability, resilience, and cost efficiency.
- Collaborative attitude and passion for clean, maintainable code.

Diversity and Inclusion:
An Oracle career can span industries, roles, countries, and cultures, allowing you to flourish in new roles and innovate while blending work and life. Oracle has thrived through 40+ years of change by innovating and operating with integrity while delivering for the top companies in almost every industry. To nurture the talent that makes this happen, we are committed to an inclusive culture that celebrates and values diverse insights and perspectives, and a workforce that inspires thought leadership and innovation.

Oracle offers a highly competitive suite of employee benefits designed on the principles of parity, consistency, and affordability. The overall package includes core elements such as medical and life insurance, access to retirement planning, and much more. We also encourage our employees to engage in the culture of giving back to the communities where we live and do business.

At Oracle, we believe that innovation starts with diversity and inclusion, and that to create the future we need talent from various backgrounds, perspectives, and abilities. We ensure that individuals with disabilities are provided reasonable accommodation to successfully participate in the job application and interview process, and to perform crucial job functions in their roles. That’s why we’re committed to creating a workforce where all individuals can do their best work. It’s when everyone’s voice is heard and valued that we’re inspired to go beyond what’s been done before.

About Us
As a world leader in cloud solutions, Oracle uses tomorrow’s technology to tackle today’s challenges. We’ve partnered with industry leaders in almost every sector—and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That’s why we’re committed to growing an inclusive workforce that promotes opportunities for all.

Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs.

We’re committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States.

Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans’ status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.
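One of the responsibilities listed above is implementing data quality checks and data validation before publishing; a minimal sketch of that pattern on Databricks/Delta Lake follows, with hypothetical bronze.customers_raw and silver.customers tables:

    # Validate a batch, then upsert it into a silver Delta table with MERGE.
    # Table and column names are illustrative assumptions.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.getOrCreate()

    batch = spark.table("bronze.customers_raw")

    # Simple data quality gate: reject the load if required fields are missing.
    bad_rows = batch.filter(F.col("customer_id").isNull() | F.col("email").isNull()).count()
    if bad_rows > 0:
        raise ValueError(f"data quality check failed: {bad_rows} rows missing key fields")

    batch.createOrReplaceTempView("customers_batch")
    spark.sql("""
        MERGE INTO silver.customers AS t
        USING customers_batch AS s
        ON t.customer_id = s.customer_id
        WHEN MATCHED THEN UPDATE SET *
        WHEN NOT MATCHED THEN INSERT *
    """)

In practice the quality gate would be more elaborate (thresholds, quarantine tables, retries), but the validate-then-merge shape stays the same.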
Posted 1 week ago
8.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Description
Data Architect – Databricks (Azure/AWS)

Role Overview:
We are seeking an experienced Data Architect specializing in Databricks to lead the architecture, design, and migration of enterprise data workloads from on-premises systems (e.g., Oracle, Exadata, Hadoop) to Databricks on Azure or AWS. The role involves designing scalable, secure, and high-performing data platforms based on the medallion architecture (bronze, silver, gold layers), supporting large-scale ingestion, transformation, and publishing of data.

Required Skills and Experience:
- 8+ years of experience in data architecture or engineering roles, with at least 3+ years specializing in cloud-based big data solutions.
- Hands-on expertise with Databricks on Azure or AWS.
- Deep understanding of Delta Lake, medallion architecture (bronze/silver/gold zones), and data governance tools (e.g., Unity Catalog, Purview).
- Strong experience migrating large datasets and batch/streaming pipelines from on-prem to Databricks.
- Expertise with Spark (PySpark/Scala) at scale and optimizing Spark jobs.
- Familiarity with ingestion from RDBMS (Oracle, SQL Server) and legacy Hadoop ecosystems.
- Proficiency in orchestration tools (Databricks Workflows, Airflow, Azure Data Factory, AWS Glue Workflows).
- Strong understanding of cloud-native services for storage, compute, security, and networking.

Preferred Qualifications:
- Databricks Certified Data Engineer or Architect.
- Azure/AWS cloud certifications.
- Experience with real-time/streaming ingestion (Kafka, Event Hubs, Kinesis).
- Familiarity with data quality frameworks (e.g., Deequ, Great Expectations).

Key Responsibilities:
- Define and design cloud-native data architecture on Databricks using Delta Lake, Unity Catalog, and related services.
- Develop migration strategies for moving on-premises data workloads (Oracle, Hadoop, Exadata, etc.) to Databricks on Azure/AWS.
- Architect and oversee data pipelines supporting ingestion, curation, transformation, and analytics in a multi-layered (bronze/silver/gold) model.
- Lead data modeling, schema design, performance optimization, and data governance best practices.
- Collaborate with data engineering, platform, and security teams to build production-ready solutions.
- Create standards for ingestion frameworks, job orchestration (e.g., workflows, Airflow), and data quality validation.
- Support cost optimization, scalability design, and operational monitoring frameworks.
- Guide and mentor engineering teams during the build and migration phases.

Attributes for Success:
- Ability to lead architecture discussions with technical and business stakeholders.
- Passion for modern cloud data architectures and continuous learning.
- Pragmatic and solution-driven approach to migrations.

Diversity and Inclusion:
An Oracle career can span industries, roles, countries, and cultures, allowing you to flourish in new roles and innovate while blending work and life. Oracle has thrived through 40+ years of change by innovating and operating with integrity while delivering for the top companies in almost every industry. To nurture the talent that makes this happen, we are committed to an inclusive culture that celebrates and values diverse insights and perspectives, and a workforce that inspires thought leadership and innovation.

Oracle offers a highly competitive suite of employee benefits designed on the principles of parity, consistency, and affordability. The overall package includes core elements such as medical and life insurance, access to retirement planning, and much more. We also encourage our employees to engage in the culture of giving back to the communities where we live and do business.

At Oracle, we believe that innovation starts with diversity and inclusion, and that to create the future we need talent from various backgrounds, perspectives, and abilities. We ensure that individuals with disabilities are provided reasonable accommodation to successfully participate in the job application and interview process, and to perform crucial job functions in their roles. That’s why we’re committed to creating a workforce where all individuals can do their best work. It’s when everyone’s voice is heard and valued that we’re inspired to go beyond what’s been done before.

About Us
As a world leader in cloud solutions, Oracle uses tomorrow’s technology to tackle today’s challenges. We’ve partnered with industry leaders in almost every sector—and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That’s why we’re committed to growing an inclusive workforce that promotes opportunities for all.

Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs.

We’re committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States.

Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans’ status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.
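To make the bronze/silver/gold layering concrete, here is a small sketch of how such a layout might be declared with Unity Catalog-style three-level names; the catalog name (main) and table schemas are placeholders, not a prescribed standard:

    # Create one schema per medallion layer and a couple of Delta tables in them.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()

    for schema in ("bronze", "silver", "gold"):
        spark.sql(f"CREATE SCHEMA IF NOT EXISTS main.{schema}")

    spark.sql("""
        CREATE TABLE IF NOT EXISTS main.bronze.orders_raw (
            order_id STRING,
            payload  STRING,
            _ingested_at TIMESTAMP
        ) USING DELTA
    """)

    spark.sql("""
        CREATE TABLE IF NOT EXISTS main.silver.orders (
            order_id STRING,
            customer_id STRING,
            amount DECIMAL(18,2),
            order_ts TIMESTAMP
        ) USING DELTA
    """)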
Posted 1 week ago
8.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Description
Data Architect – Databricks (Azure/AWS)

Role Overview:
We are seeking an experienced Data Architect specializing in Databricks to lead the architecture, design, and migration of enterprise data workloads from on-premises systems (e.g., Oracle, Exadata, Hadoop) to Databricks on Azure or AWS. The role involves designing scalable, secure, and high-performing data platforms based on the medallion architecture (bronze, silver, gold layers), supporting large-scale ingestion, transformation, and publishing of data.

Required Skills and Experience:
- 8+ years of experience in data architecture or engineering roles, with at least 3+ years specializing in cloud-based big data solutions.
- Hands-on expertise with Databricks on Azure or AWS.
- Deep understanding of Delta Lake, medallion architecture (bronze/silver/gold zones), and data governance tools (e.g., Unity Catalog, Purview).
- Strong experience migrating large datasets and batch/streaming pipelines from on-prem to Databricks.
- Expertise with Spark (PySpark/Scala) at scale and optimizing Spark jobs.
- Familiarity with ingestion from RDBMS (Oracle, SQL Server) and legacy Hadoop ecosystems.
- Proficiency in orchestration tools (Databricks Workflows, Airflow, Azure Data Factory, AWS Glue Workflows).
- Strong understanding of cloud-native services for storage, compute, security, and networking.

Preferred Qualifications:
- Databricks Certified Data Engineer or Architect.
- Azure/AWS cloud certifications.
- Experience with real-time/streaming ingestion (Kafka, Event Hubs, Kinesis).
- Familiarity with data quality frameworks (e.g., Deequ, Great Expectations).

Key Responsibilities:
- Define and design cloud-native data architecture on Databricks using Delta Lake, Unity Catalog, and related services.
- Develop migration strategies for moving on-premises data workloads (Oracle, Hadoop, Exadata, etc.) to Databricks on Azure/AWS.
- Architect and oversee data pipelines supporting ingestion, curation, transformation, and analytics in a multi-layered (bronze/silver/gold) model.
- Lead data modeling, schema design, performance optimization, and data governance best practices.
- Collaborate with data engineering, platform, and security teams to build production-ready solutions.
- Create standards for ingestion frameworks, job orchestration (e.g., workflows, Airflow), and data quality validation.
- Support cost optimization, scalability design, and operational monitoring frameworks.
- Guide and mentor engineering teams during the build and migration phases.

Attributes for Success:
- Ability to lead architecture discussions with technical and business stakeholders.
- Passion for modern cloud data architectures and continuous learning.
- Pragmatic and solution-driven approach to migrations.

Diversity and Inclusion:
An Oracle career can span industries, roles, countries, and cultures, allowing you to flourish in new roles and innovate while blending work and life. Oracle has thrived through 40+ years of change by innovating and operating with integrity while delivering for the top companies in almost every industry. To nurture the talent that makes this happen, we are committed to an inclusive culture that celebrates and values diverse insights and perspectives, and a workforce that inspires thought leadership and innovation.

Oracle offers a highly competitive suite of employee benefits designed on the principles of parity, consistency, and affordability. The overall package includes core elements such as medical and life insurance, access to retirement planning, and much more. We also encourage our employees to engage in the culture of giving back to the communities where we live and do business.

At Oracle, we believe that innovation starts with diversity and inclusion, and that to create the future we need talent from various backgrounds, perspectives, and abilities. We ensure that individuals with disabilities are provided reasonable accommodation to successfully participate in the job application and interview process, and to perform crucial job functions in their roles. That’s why we’re committed to creating a workforce where all individuals can do their best work. It’s when everyone’s voice is heard and valued that we’re inspired to go beyond what’s been done before.

About Us
As a world leader in cloud solutions, Oracle uses tomorrow’s technology to tackle today’s challenges. We’ve partnered with industry leaders in almost every sector—and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That’s why we’re committed to growing an inclusive workforce that promotes opportunities for all.

Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs.

We’re committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States.

Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans’ status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.
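The preferred qualifications above mention real-time/streaming ingestion; a hedged sketch of one common Databricks pattern, Auto Loader streaming into a bronze Delta table, is shown below with placeholder storage paths and table names:

    # Incrementally ingest newly arrived JSON files into a bronze Delta table.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.getOrCreate()

    stream = (spark.readStream
              .format("cloudFiles")
              .option("cloudFiles.format", "json")
              .load("abfss://landing@exampleaccount.dfs.core.windows.net/events/")  # assumed path
              .withColumn("_ingested_at", F.current_timestamp()))

    (stream.writeStream
       .option("checkpointLocation", "/checkpoints/bronze_events")
       .outputMode("append")
       .toTable("bronze.events_raw"))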
Posted 1 week ago
6.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Description
Senior Data Engineer – Databricks (Azure/AWS)

Role Overview:
We are looking for a hands-on Senior Data Engineer experienced in migrating and building large-scale data pipelines on Databricks using Azure or AWS platforms. The role will focus on implementing batch and streaming pipelines, applying the bronze-silver-gold data lakehouse model, and ensuring scalable and reliable data solutions.

Required Skills and Experience:
- 6+ years of hands-on data engineering experience, with 2+ years specifically working on Databricks in Azure or AWS.
- Proficiency in building and optimizing Spark pipelines (batch and streaming).
- Strong experience implementing bronze/silver/gold data models.
- Working knowledge of cloud storage systems (ADLS, S3) and compute services.
- Experience migrating data from RDBMS (Oracle, SQL Server) or Hadoop ecosystems.
- Familiarity with Airflow, Azure Data Factory, or AWS Glue for orchestration.
- Good scripting skills (Python, Scala, SQL) and version control (Git).

Preferred Qualifications:
- Databricks Certified Data Engineer Associate or Professional.
- Experience with Delta Live Tables (DLT) and Databricks SQL.
- Understanding of cloud security best practices (IAM roles, encryption, ACLs).

Key Responsibilities:
- Design, develop, and operationalize scalable data pipelines on Databricks following medallion architecture principles.
- Migrate and transform large data volumes from traditional on-prem systems (Oracle, Hadoop, Exadata) into cloud data platforms.
- Develop efficient Spark (PySpark/Scala) jobs for ingestion, transformation, aggregation, and publishing of data.
- Implement data quality checks, error handling, retries, and data validation frameworks.
- Build automation scripts and CI/CD pipelines for Databricks workflows and deployment.
- Tune Spark jobs and optimize cost and performance in cloud environments.
- Collaborate with data architects, product owners, and analytics teams.

Attributes for Success:
- Strong analytical and problem-solving skills.
- Attention to scalability, resilience, and cost efficiency.
- Collaborative attitude and passion for clean, maintainable code.

Diversity and Inclusion:
An Oracle career can span industries, roles, countries, and cultures, allowing you to flourish in new roles and innovate while blending work and life. Oracle has thrived through 40+ years of change by innovating and operating with integrity while delivering for the top companies in almost every industry. To nurture the talent that makes this happen, we are committed to an inclusive culture that celebrates and values diverse insights and perspectives, and a workforce that inspires thought leadership and innovation.

Oracle offers a highly competitive suite of employee benefits designed on the principles of parity, consistency, and affordability. The overall package includes core elements such as medical and life insurance, access to retirement planning, and much more. We also encourage our employees to engage in the culture of giving back to the communities where we live and do business.

At Oracle, we believe that innovation starts with diversity and inclusion, and that to create the future we need talent from various backgrounds, perspectives, and abilities. We ensure that individuals with disabilities are provided reasonable accommodation to successfully participate in the job application and interview process, and to perform crucial job functions in their roles. That’s why we’re committed to creating a workforce where all individuals can do their best work. It’s when everyone’s voice is heard and valued that we’re inspired to go beyond what’s been done before.

About Us
As a world leader in cloud solutions, Oracle uses tomorrow’s technology to tackle today’s challenges. We’ve partnered with industry leaders in almost every sector—and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That’s why we’re committed to growing an inclusive workforce that promotes opportunities for all.

Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs.

We’re committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States.

Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans’ status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.
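As an illustration of the Spark tuning responsibility above, the sketch below shows a few common levers (adaptive query execution, broadcast joins, caching); the configuration values and table/column names are starting points to profile against, not recommendations:

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.getOrCreate()

    # Enable adaptive query execution and skew-join handling (Spark 3.x).
    spark.conf.set("spark.sql.adaptive.enabled", "true")
    spark.conf.set("spark.sql.adaptive.skewJoin.enabled", "true")
    spark.conf.set("spark.sql.autoBroadcastJoinThreshold", str(64 * 1024 * 1024))

    orders = spark.table("silver.orders")        # assumed large fact table
    customers = spark.table("silver.customers")  # assumed small dimension

    # Broadcast the small dimension explicitly and cache a frame reused twice below.
    joined = orders.join(F.broadcast(customers), "customer_id").cache()

    daily = joined.groupBy("order_date").agg(F.sum("amount").alias("revenue"))
    by_cust = joined.groupBy("customer_id").agg(F.count("*").alias("orders"))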
Posted 1 week ago
6.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
Job Description
Senior Data Engineer – Databricks (Azure/AWS)

Role Overview:
We are looking for a hands-on Senior Data Engineer experienced in migrating and building large-scale data pipelines on Databricks using Azure or AWS platforms. The role will focus on implementing batch and streaming pipelines, applying the bronze-silver-gold data lakehouse model, and ensuring scalable and reliable data solutions.

Required Skills and Experience:
- 6+ years of hands-on data engineering experience, with 2+ years specifically working on Databricks in Azure or AWS.
- Proficiency in building and optimizing Spark pipelines (batch and streaming).
- Strong experience implementing bronze/silver/gold data models.
- Working knowledge of cloud storage systems (ADLS, S3) and compute services.
- Experience migrating data from RDBMS (Oracle, SQL Server) or Hadoop ecosystems.
- Familiarity with Airflow, Azure Data Factory, or AWS Glue for orchestration.
- Good scripting skills (Python, Scala, SQL) and version control (Git).

Preferred Qualifications:
- Databricks Certified Data Engineer Associate or Professional.
- Experience with Delta Live Tables (DLT) and Databricks SQL.
- Understanding of cloud security best practices (IAM roles, encryption, ACLs).

Key Responsibilities:
- Design, develop, and operationalize scalable data pipelines on Databricks following medallion architecture principles.
- Migrate and transform large data volumes from traditional on-prem systems (Oracle, Hadoop, Exadata) into cloud data platforms.
- Develop efficient Spark (PySpark/Scala) jobs for ingestion, transformation, aggregation, and publishing of data.
- Implement data quality checks, error handling, retries, and data validation frameworks.
- Build automation scripts and CI/CD pipelines for Databricks workflows and deployment.
- Tune Spark jobs and optimize cost and performance in cloud environments.
- Collaborate with data architects, product owners, and analytics teams.

Attributes for Success:
- Strong analytical and problem-solving skills.
- Attention to scalability, resilience, and cost efficiency.
- Collaborative attitude and passion for clean, maintainable code.

Diversity and Inclusion:
An Oracle career can span industries, roles, countries, and cultures, allowing you to flourish in new roles and innovate while blending work and life. Oracle has thrived through 40+ years of change by innovating and operating with integrity while delivering for the top companies in almost every industry. To nurture the talent that makes this happen, we are committed to an inclusive culture that celebrates and values diverse insights and perspectives, and a workforce that inspires thought leadership and innovation.

Oracle offers a highly competitive suite of employee benefits designed on the principles of parity, consistency, and affordability. The overall package includes core elements such as medical and life insurance, access to retirement planning, and much more. We also encourage our employees to engage in the culture of giving back to the communities where we live and do business.

At Oracle, we believe that innovation starts with diversity and inclusion, and that to create the future we need talent from various backgrounds, perspectives, and abilities. We ensure that individuals with disabilities are provided reasonable accommodation to successfully participate in the job application and interview process, and to perform crucial job functions in their roles. That’s why we’re committed to creating a workforce where all individuals can do their best work. It’s when everyone’s voice is heard and valued that we’re inspired to go beyond what’s been done before.

About Us
As a world leader in cloud solutions, Oracle uses tomorrow’s technology to tackle today’s challenges. We’ve partnered with industry leaders in almost every sector—and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That’s why we’re committed to growing an inclusive workforce that promotes opportunities for all.

Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs.

We’re committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States.

Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans’ status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.
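For the migration responsibility above, a minimal sketch of one pattern, a partitioned JDBC bulk read from an Oracle source landed into a bronze Delta table, is shown below; connection details, credentials handling, and table names are placeholders, and the Oracle JDBC driver is assumed to be available on the cluster:

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()

    # Parallel read of one source table; bounds should come from profiling the key range.
    src = (spark.read.format("jdbc")
           .option("url", "jdbc:oracle:thin:@//dbhost.example.com:1521/ORCLPDB")
           .option("dbtable", "SALES.ORDERS")
           .option("user", "etl_reader")
           .option("password", "<fetch-from-secret-scope>")
           .option("partitionColumn", "ORDER_ID")
           .option("lowerBound", "1")
           .option("upperBound", "100000000")
           .option("numPartitions", "32")
           .load())

    src.write.format("delta").mode("overwrite").saveAsTable("bronze.orders_raw")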
Posted 1 week ago
8.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
Job Description
Data Architect – Databricks (Azure/AWS)

Role Overview:
We are seeking an experienced Data Architect specializing in Databricks to lead the architecture, design, and migration of enterprise data workloads from on-premises systems (e.g., Oracle, Exadata, Hadoop) to Databricks on Azure or AWS. The role involves designing scalable, secure, and high-performing data platforms based on the medallion architecture (bronze, silver, gold layers), supporting large-scale ingestion, transformation, and publishing of data.

Required Skills and Experience:
- 8+ years of experience in data architecture or engineering roles, with at least 3+ years specializing in cloud-based big data solutions.
- Hands-on expertise with Databricks on Azure or AWS.
- Deep understanding of Delta Lake, medallion architecture (bronze/silver/gold zones), and data governance tools (e.g., Unity Catalog, Purview).
- Strong experience migrating large datasets and batch/streaming pipelines from on-prem to Databricks.
- Expertise with Spark (PySpark/Scala) at scale and optimizing Spark jobs.
- Familiarity with ingestion from RDBMS (Oracle, SQL Server) and legacy Hadoop ecosystems.
- Proficiency in orchestration tools (Databricks Workflows, Airflow, Azure Data Factory, AWS Glue Workflows).
- Strong understanding of cloud-native services for storage, compute, security, and networking.

Preferred Qualifications:
- Databricks Certified Data Engineer or Architect.
- Azure/AWS cloud certifications.
- Experience with real-time/streaming ingestion (Kafka, Event Hubs, Kinesis).
- Familiarity with data quality frameworks (e.g., Deequ, Great Expectations).

Key Responsibilities:
- Define and design cloud-native data architecture on Databricks using Delta Lake, Unity Catalog, and related services.
- Develop migration strategies for moving on-premises data workloads (Oracle, Hadoop, Exadata, etc.) to Databricks on Azure/AWS.
- Architect and oversee data pipelines supporting ingestion, curation, transformation, and analytics in a multi-layered (bronze/silver/gold) model.
- Lead data modeling, schema design, performance optimization, and data governance best practices.
- Collaborate with data engineering, platform, and security teams to build production-ready solutions.
- Create standards for ingestion frameworks, job orchestration (e.g., workflows, Airflow), and data quality validation.
- Support cost optimization, scalability design, and operational monitoring frameworks.
- Guide and mentor engineering teams during the build and migration phases.

Attributes for Success:
- Ability to lead architecture discussions with technical and business stakeholders.
- Passion for modern cloud data architectures and continuous learning.
- Pragmatic and solution-driven approach to migrations.

Diversity and Inclusion:
An Oracle career can span industries, roles, countries, and cultures, allowing you to flourish in new roles and innovate while blending work and life. Oracle has thrived through 40+ years of change by innovating and operating with integrity while delivering for the top companies in almost every industry. To nurture the talent that makes this happen, we are committed to an inclusive culture that celebrates and values diverse insights and perspectives, and a workforce that inspires thought leadership and innovation.

Oracle offers a highly competitive suite of employee benefits designed on the principles of parity, consistency, and affordability. The overall package includes core elements such as medical and life insurance, access to retirement planning, and much more. We also encourage our employees to engage in the culture of giving back to the communities where we live and do business.

At Oracle, we believe that innovation starts with diversity and inclusion, and that to create the future we need talent from various backgrounds, perspectives, and abilities. We ensure that individuals with disabilities are provided reasonable accommodation to successfully participate in the job application and interview process, and to perform crucial job functions in their roles. That’s why we’re committed to creating a workforce where all individuals can do their best work. It’s when everyone’s voice is heard and valued that we’re inspired to go beyond what’s been done before.

About Us
As a world leader in cloud solutions, Oracle uses tomorrow’s technology to tackle today’s challenges. We’ve partnered with industry leaders in almost every sector—and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That’s why we’re committed to growing an inclusive workforce that promotes opportunities for all.

Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs.

We’re committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States.

Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans’ status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.
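The data quality validation standards mentioned above can be as simple as a set of reusable rule functions; the framework-agnostic sketch below illustrates the idea with assumed table and column names:

    from pyspark.sql import SparkSession, DataFrame
    from pyspark.sql import functions as F

    spark = SparkSession.builder.getOrCreate()

    def check_not_null(df: DataFrame, column: str):
        # Returns the rule description and the number of violating rows.
        return (f"{column} IS NOT NULL", df.filter(F.col(column).isNull()).count())

    def check_unique(df: DataFrame, column: str):
        dup = df.groupBy(column).count().filter(F.col("count") > 1).count()
        return (f"{column} IS UNIQUE", dup)

    df = spark.table("silver.customers")   # assumed table under validation
    results = [check_not_null(df, "customer_id"), check_unique(df, "customer_id")]

    failures = [(rule, n) for rule, n in results if n > 0]
    if failures:
        raise ValueError(f"data quality validation failed: {failures}")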
Posted 1 week ago
2.0 - 3.0 years
0 Lacs
Gurgaon, Haryana, India
On-site
Company Description
Shipsy is a global SaaS company focused on the logistics tech space. Logistics is a multi-trillion-dollar industry, but it is still largely run in a manual manner. Shipsy offers cutting-edge solutions that help shippers and logistics companies with warehousing and transportation automation, thereby reducing logistics costs and enhancing customer experience. Our customers span the Middle East, India, and South East Asia and include enterprises like Reliance (their entire retail operations, exports, and imports run on Shipsy), Domino's, Gulf Marketing Group, UPS Gulf, DTDC Express, Burger King, Landmark Group - Homecentre, More Retail, and many more. We process over 2 million shipments a day, and about 10 percent of India's container trade is tracked on our platform.

We are backed by global investors such as Peak XV Partners, Infoedge, and A91 Partners and have raised ~$35mn till date. We are over a 280-member team now, with offices across Gurgaon (HO), Mumbai, Bangalore, and Dubai. Our team is composed of excellent individuals from top institutes across the country, like IITs, IIITs, and NITs, with experience in Big Data, Software Architecture, ML, AI, Robotics, and Blockchain. In combination, we have previously worked at Samsung Korea, MIT Media Labs, CMU Robotics, Deutsche Bank, Morgan Stanley, Samsung Research, GE Research, Qualcomm Research, etc., and have also been entrepreneurs. We have numerous research publications and patents. The core team has computer scientists and electrical engineers from IIT Delhi and Madras, and this core tech focus would contribute tremendously to your learning. We also have some world-class employee benefits, such as the scholarship program, that will further enhance your learning.

We serve clients across various industries and geographies, and we pride ourselves on having a young, energetic, diverse team. We aim to make Shipsy's work culture fun. We all work hard but enjoy what we do. We are fostering a supportive, empathetic environment that supports you, so you can deliver the results you aspire to and grow as a professional. To learn about our leadership team and gain greater insights into our growth story and solutions, visit https://shipsy.io/about-us/.

Job Title: Business Development Representative/Sr. Business Development Representative
Location: Gurgaon
Department: Demand Generation
Reports To: Head of Demand Generation Marketing

About Shipsy
At Shipsy, we're not just about logistics; we're about revolutionizing how businesses move and operate! Join us on our mission to streamline operations and make logistics smarter, faster, and more efficient. If you're looking for a place where innovation meets fun, you've found it!

Job Summary
Are you a go-getter with a passion for connecting with people? As our Outbound Business Development Representative (BDR), you'll be the spark that ignites new business opportunities! You'll dive into the world of outbound prospecting, crafting compelling outreach that makes potential clients say, "Tell me more!" Your energy and enthusiasm will be key in driving our demand generation efforts.

Key Responsibilities
- Outbound Prospecting: Get ready to roll up your sleeves and reach out to potential clients through cold calls, emails, and social media. Your mission? Make them excited about Shipsy!
- Lead Qualification: Engage with prospects to uncover their needs, ensuring they're a perfect fit for our solutions (and vice versa).
- Market Research: Channel your inner detective to identify and analyze target markets and industries. Who's out there waiting for Shipsy?
- Team Collaboration: Work hand-in-hand with our awesome sales and marketing teams to align strategies and share your insights. Teamwork makes the dream work!
- CRM Wizardry: Keep track of all your interactions in our CRM system. You'll be the master of follow-ups and reporting!
- Performance Metrics: Track your outbound activities like a champion and share your wins (and learnings) with the team.
- Continuous Improvement: Stay ahead of industry trends and best practices, always looking for ways to enhance your approach.

Qualifications
- Bachelor's degree in Business, Marketing, or a related field (bonus points for relevant experience).
- 2-3 years of experience in outbound sales or business development, preferably in a B2B environment.
- Excellent communication skills: you're a natural at engaging with prospects and making connections!
- A self-motivated, goal-oriented attitude with a knack for meeting (and smashing) targets.
- Proficiency in CRM tools and the MS Office Suite; familiarity with sales automation tools is a plus.
- An adventurous spirit ready to thrive in our dynamic environment!

Additional Information
What We Offer
- Competitive salary and performance-based incentives that reward your hard work.
- Opportunities for professional development and career growth, because we want to see you shine!
- A fun, collaborative, and inclusive workplace culture where your ideas matter.

How To Apply
If you're ready to embark on an exciting journey with us and be a key player in our growth, we want to hear from you! Send your resume and a cover letter that showcases your personality.

Locations - Gurugram, India
Posted 1 week ago
8.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Job Description
Data Architect – Databricks (Azure/AWS)

Role Overview:
We are seeking an experienced Data Architect specializing in Databricks to lead the architecture, design, and migration of enterprise data workloads from on-premises systems (e.g., Oracle, Exadata, Hadoop) to Databricks on Azure or AWS. The role involves designing scalable, secure, and high-performing data platforms based on the medallion architecture (bronze, silver, gold layers), supporting large-scale ingestion, transformation, and publishing of data.

Required Skills and Experience:
- 8+ years of experience in data architecture or engineering roles, with at least 3+ years specializing in cloud-based big data solutions.
- Hands-on expertise with Databricks on Azure or AWS.
- Deep understanding of Delta Lake, medallion architecture (bronze/silver/gold zones), and data governance tools (e.g., Unity Catalog, Purview).
- Strong experience migrating large datasets and batch/streaming pipelines from on-prem to Databricks.
- Expertise with Spark (PySpark/Scala) at scale and optimizing Spark jobs.
- Familiarity with ingestion from RDBMS (Oracle, SQL Server) and legacy Hadoop ecosystems.
- Proficiency in orchestration tools (Databricks Workflows, Airflow, Azure Data Factory, AWS Glue Workflows).
- Strong understanding of cloud-native services for storage, compute, security, and networking.

Preferred Qualifications:
- Databricks Certified Data Engineer or Architect.
- Azure/AWS cloud certifications.
- Experience with real-time/streaming ingestion (Kafka, Event Hubs, Kinesis).
- Familiarity with data quality frameworks (e.g., Deequ, Great Expectations).

Key Responsibilities:
- Define and design cloud-native data architecture on Databricks using Delta Lake, Unity Catalog, and related services.
- Develop migration strategies for moving on-premises data workloads (Oracle, Hadoop, Exadata, etc.) to Databricks on Azure/AWS.
- Architect and oversee data pipelines supporting ingestion, curation, transformation, and analytics in a multi-layered (bronze/silver/gold) model.
- Lead data modeling, schema design, performance optimization, and data governance best practices.
- Collaborate with data engineering, platform, and security teams to build production-ready solutions.
- Create standards for ingestion frameworks, job orchestration (e.g., workflows, Airflow), and data quality validation.
- Support cost optimization, scalability design, and operational monitoring frameworks.
- Guide and mentor engineering teams during the build and migration phases.

Attributes for Success:
- Ability to lead architecture discussions with technical and business stakeholders.
- Passion for modern cloud data architectures and continuous learning.
- Pragmatic and solution-driven approach to migrations.

Diversity and Inclusion:
An Oracle career can span industries, roles, countries, and cultures, allowing you to flourish in new roles and innovate while blending work and life. Oracle has thrived through 40+ years of change by innovating and operating with integrity while delivering for the top companies in almost every industry. To nurture the talent that makes this happen, we are committed to an inclusive culture that celebrates and values diverse insights and perspectives, and a workforce that inspires thought leadership and innovation.

Oracle offers a highly competitive suite of employee benefits designed on the principles of parity, consistency, and affordability. The overall package includes core elements such as medical and life insurance, access to retirement planning, and much more. We also encourage our employees to engage in the culture of giving back to the communities where we live and do business.

At Oracle, we believe that innovation starts with diversity and inclusion, and that to create the future we need talent from various backgrounds, perspectives, and abilities. We ensure that individuals with disabilities are provided reasonable accommodation to successfully participate in the job application and interview process, and to perform crucial job functions in their roles. That’s why we’re committed to creating a workforce where all individuals can do their best work. It’s when everyone’s voice is heard and valued that we’re inspired to go beyond what’s been done before.

About Us
As a world leader in cloud solutions, Oracle uses tomorrow’s technology to tackle today’s challenges. We’ve partnered with industry leaders in almost every sector—and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That’s why we’re committed to growing an inclusive workforce that promotes opportunities for all.

Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs.

We’re committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States.

Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans’ status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.
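Job orchestration, also mentioned above, can be expressed in any of the listed tools; as one hedged illustration, a two-task Airflow 2.x DAG that runs a bronze ingest followed by a silver transform might look like this (the callables and schedule are placeholders; on Databricks the same flow could be a Databricks Workflow instead):

    from datetime import datetime
    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def ingest_bronze(**context):
        print("run bronze ingestion here")      # placeholder for the real ingest step

    def build_silver(**context):
        print("run silver transformation here") # placeholder for the real transform step

    with DAG(
        dag_id="medallion_daily",
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        bronze = PythonOperator(task_id="ingest_bronze", python_callable=ingest_bronze)
        silver = PythonOperator(task_id="build_silver", python_callable=build_silver)
        bronze >> silver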
Posted 1 week ago
6.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Job Description Senior Data Engineer – Databricks (Azure/AWS) Role Overview: We are looking for a hands-on Senior Data Engineer experienced in migrating and building large-scale data pipelines on Databricks using Azure or AWS platforms. The role will focus on implementing batch and streaming pipelines, applying the bronze-silver-gold data lakehouse model, and ensuring scalable and reliable data solutions. Required Skills and Experience: 6+ years of hands-on data engineering experience, with 2+ years specifically working on Databricks in Azure or AWS. Proficiency in building and optimizing Spark pipelines (batch and streaming). Strong experience implementing bronze/silver/gold data models. Working knowledge of cloud storage systems (ADLS, S3) and compute services. Experience migrating data from RDBMS (Oracle, SQL Server) or Hadoop ecosystems. Familiarity with Airflow, Azure Data Factory, or AWS Glue for orchestration. Good scripting skills (Python, Scala, SQL) and version control (Git). Preferred Qualifications: Databricks Certified Data Engineer Associate or Professional. Experience with Delta Live Tables (DLT) and Databricks SQL. Understanding of cloud security best practices (IAM roles, encryption, ACLs). Responsibilities Key Responsibilities: Design, develop, and operationalize scalable data pipelines on Databricks following medallion architecture principles. Migrate and transform large data volumes from traditional on-prem systems (Oracle, Hadoop, Exadata) into cloud data platforms. Develop efficient Spark (PySpark/Scala) jobs for ingestion, transformation, aggregation, and publishing of data. Implement data quality checks, error handling, retries, and data validation frameworks. Build automation scripts and CI/CD pipelines for Databricks workflows and deployment. Tune Spark jobs and optimize cost and performance in cloud environments. Collaborate with data architects, product owners, and analytics teams. Attributes for Success: Strong analytical and problem-solving skills. Attention to scalability, resilience, and cost efficiency. Collaborative attitude and passion for clean, maintainable code. Diversity and Inclusion: An Oracle career can span industries, roles, Countries, and cultures, allowing you to flourish in new roles and innovate while blending work life in. Oracle has thrived through 40+ years of change by innovating and operating with integrity while delivering for the top companies in almost every industry. To nurture the talent that makes this happen, we are committed to an inclusive culture that celebrates and values diverse insights and perspectives, a workforce that inspires thought leadership and innovation. Oracle offers a highly competitive suite of Employee Benefits designed on the principles of parity, consistency, and affordability. The overall package includes certain core elements such as Medical, Life Insurance, access to Retirement Planning, and much more. We also encourage our employees to engage in the culture of giving back to the communities where we live and do business.At Oracle, we believe that innovation starts with diversity and inclusion and to create the future we need talent from various backgrounds, perspectives, and abilities. We ensure that individuals with disabilities are provided reasonable accommodation to successfully participate in the job application, and interview process, and in potential roles. To perform crucial job functions. That’s why we’re committed to creating a workforce where all individuals can do their best work. 
It’s when everyone’s voice is heard and valued that we’re inspired to go beyond what’s been done before. About Us As a world leader in cloud solutions, Oracle uses tomorrow’s technology to tackle today’s challenges. We’ve partnered with industry-leaders in almost every sector—and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That’s why we’re committed to growing an inclusive workforce that promotes opportunities for all. Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs. We’re committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States. Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans’ status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.
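For candidates new to the bronze-silver-gold model this posting describes, here is a minimal, hypothetical PySpark sketch of a batch medallion flow on Databricks. The paths, schema names, and columns (a raw orders feed) are assumptions made purely for illustration, not details of any actual environment.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("medallion-batch-sketch").getOrCreate()

# Bronze: land the raw files as-is, tagging each row with an ingestion timestamp.
bronze = (spark.read.format("json").load("/mnt/landing/orders/")
          .withColumn("_ingested_at", F.current_timestamp()))
bronze.write.format("delta").mode("append").saveAsTable("bronze.raw_orders")  # assumes the bronze/silver/gold schemas exist

# Silver: cleanse and conform; drop duplicates and obviously bad records.
silver = (spark.table("bronze.raw_orders")
          .dropDuplicates(["order_id"])
          .filter(F.col("order_amount") > 0)
          .withColumn("order_date", F.to_date("order_ts")))
silver.write.format("delta").mode("overwrite").saveAsTable("silver.orders")

# Gold: a business-level aggregate ready for reporting.
gold = (silver.groupBy("order_date", "region")
        .agg(F.sum("order_amount").alias("daily_revenue")))
gold.write.format("delta").mode("overwrite").saveAsTable("gold.daily_revenue")
```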
Posted 1 week ago
8.0 years
0 Lacs
Ahmedabad, Gujarat, India
On-site
Job Description Data Architect – Databricks (Azure/AWS)
Role Overview: We are seeking an experienced Data Architect specializing in Databricks to lead the architecture, design, and migration of enterprise data workloads from on-premises systems (e.g., Oracle, Exadata, Hadoop) to Databricks on Azure or AWS. The role involves designing scalable, secure, and high-performing data platforms based on the medallion architecture (bronze, silver, gold layers), supporting large-scale ingestion, transformation, and publishing of data.
Required Skills and Experience: 8+ years of experience in data architecture or engineering roles, with at least 3+ years specializing in cloud-based big data solutions. Hands-on expertise with Databricks on Azure or AWS. Deep understanding of Delta Lake, medallion architecture (bronze/silver/gold zones), and data governance tools (e.g., Unity Catalog, Purview). Strong experience migrating large datasets and batch/streaming pipelines from on-prem to Databricks. Expertise with Spark (PySpark/Scala) at scale and optimizing Spark jobs. Familiarity with ingestion from RDBMS (Oracle, SQL Server) and legacy Hadoop ecosystems. Proficiency in orchestration tools (Databricks Workflows, Airflow, Azure Data Factory, AWS Glue Workflows). Strong understanding of cloud-native services for storage, compute, security, and networking.
Preferred Qualifications: Databricks Certified Data Engineer or Architect. Azure/AWS cloud certifications. Experience with real-time/streaming ingestion (Kafka, Event Hubs, Kinesis). Familiarity with data quality frameworks (e.g., Deequ, Great Expectations).
Key Responsibilities: Define and design cloud-native data architecture on Databricks using Delta Lake, Unity Catalog, and related services. Develop migration strategies for moving on-premises data workloads (Oracle, Hadoop, Exadata, etc.) to Databricks on Azure/AWS. Architect and oversee data pipelines supporting ingestion, curation, transformation, and analytics in a multi-layered (bronze/silver/gold) model. Lead data modeling, schema design, performance optimization, and data governance best practices. Collaborate with data engineering, platform, and security teams to build production-ready solutions. Create standards for ingestion frameworks, job orchestration (e.g., workflows, Airflow), and data quality validation. Support cost optimization, scalability design, and operational monitoring frameworks. Guide and mentor engineering teams during the build and migration phases.
Attributes for Success: Ability to lead architecture discussions with technical and business stakeholders. Passion for modern cloud data architectures and continuous learning. Pragmatic and solution-driven approach to migrations.
Diversity and Inclusion: An Oracle career can span industries, roles, countries, and cultures, allowing you to flourish in new roles and innovate while blending work and life. Oracle has thrived through 40+ years of change by innovating and operating with integrity while delivering for the top companies in almost every industry. To nurture the talent that makes this happen, we are committed to an inclusive culture that celebrates and values diverse insights and perspectives, a workforce that inspires thought leadership and innovation. Oracle offers a highly competitive suite of Employee Benefits designed on the principles of parity, consistency, and affordability. 
The overall package includes certain core elements such as Medical, Life Insurance, access to Retirement Planning, and much more. We also encourage our employees to engage in the culture of giving back to the communities where we live and do business. At Oracle, we believe that innovation starts with diversity and inclusion, and to create the future we need talent from various backgrounds, perspectives, and abilities. We ensure that individuals with disabilities are provided reasonable accommodation to successfully participate in the job application and interview process, in potential roles, and to perform crucial job functions. That’s why we’re committed to creating a workforce where all individuals can do their best work. It’s when everyone’s voice is heard and valued that we’re inspired to go beyond what’s been done before.
About Us As a world leader in cloud solutions, Oracle uses tomorrow’s technology to tackle today’s challenges. We’ve partnered with industry leaders in almost every sector—and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That’s why we’re committed to growing an inclusive workforce that promotes opportunities for all. Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs. We’re committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States. Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans’ status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.
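As an illustration of the migration pattern this architect role describes, the sketch below shows one common way to land an incremental extract from an on-premises Oracle source into a Delta table with an idempotent MERGE. The JDBC endpoint, credentials, table names, and join key are placeholder assumptions, and the Oracle JDBC driver is assumed to be attached to the cluster; this is not a description of any actual environment.

```python
from pyspark.sql import SparkSession
from delta.tables import DeltaTable

spark = SparkSession.builder.appName("migration-upsert-sketch").getOrCreate()

# Pull the last day's changes from the on-prem source over JDBC
# (a CDC tool or bulk export would usually be used for very large tables).
changes = (spark.read.format("jdbc")
           .option("url", "jdbc:oracle:thin:@//onprem-db:1521/ORCLPDB")  # placeholder endpoint
           .option("query", "SELECT * FROM sales.orders WHERE updated_at > SYSDATE - 1")
           .option("user", "etl_user")
           .option("password", "<fetch from a secret scope>")
           .load())

# Idempotent upsert into the silver zone so re-runs do not duplicate rows.
target = DeltaTable.forName(spark, "silver.orders")
(target.alias("t")
 .merge(changes.alias("s"), "t.order_id = s.order_id")
 .whenMatchedUpdateAll()
 .whenNotMatchedInsertAll()
 .execute())
```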
Posted 1 week ago
6.0 years
0 Lacs
Ahmedabad, Gujarat, India
On-site
Job Description Senior Data Engineer – Databricks (Azure/AWS)
Role Overview: We are looking for a hands-on Senior Data Engineer experienced in migrating and building large-scale data pipelines on Databricks using Azure or AWS platforms. The role will focus on implementing batch and streaming pipelines, applying the bronze-silver-gold data lakehouse model, and ensuring scalable and reliable data solutions.
Required Skills and Experience: 6+ years of hands-on data engineering experience, with 2+ years specifically working on Databricks in Azure or AWS. Proficiency in building and optimizing Spark pipelines (batch and streaming). Strong experience implementing bronze/silver/gold data models. Working knowledge of cloud storage systems (ADLS, S3) and compute services. Experience migrating data from RDBMS (Oracle, SQL Server) or Hadoop ecosystems. Familiarity with Airflow, Azure Data Factory, or AWS Glue for orchestration. Good scripting skills (Python, Scala, SQL) and version control (Git).
Preferred Qualifications: Databricks Certified Data Engineer Associate or Professional. Experience with Delta Live Tables (DLT) and Databricks SQL. Understanding of cloud security best practices (IAM roles, encryption, ACLs).
Key Responsibilities: Design, develop, and operationalize scalable data pipelines on Databricks following medallion architecture principles. Migrate and transform large data volumes from traditional on-prem systems (Oracle, Hadoop, Exadata) into cloud data platforms. Develop efficient Spark (PySpark/Scala) jobs for ingestion, transformation, aggregation, and publishing of data. Implement data quality checks, error handling, retries, and data validation frameworks. Build automation scripts and CI/CD pipelines for Databricks workflows and deployment. Tune Spark jobs and optimize cost and performance in cloud environments. Collaborate with data architects, product owners, and analytics teams.
Attributes for Success: Strong analytical and problem-solving skills. Attention to scalability, resilience, and cost efficiency. Collaborative attitude and passion for clean, maintainable code.
Diversity and Inclusion: An Oracle career can span industries, roles, countries, and cultures, allowing you to flourish in new roles and innovate while blending work and life. Oracle has thrived through 40+ years of change by innovating and operating with integrity while delivering for the top companies in almost every industry. To nurture the talent that makes this happen, we are committed to an inclusive culture that celebrates and values diverse insights and perspectives, a workforce that inspires thought leadership and innovation. Oracle offers a highly competitive suite of Employee Benefits designed on the principles of parity, consistency, and affordability. The overall package includes certain core elements such as Medical, Life Insurance, access to Retirement Planning, and much more. We also encourage our employees to engage in the culture of giving back to the communities where we live and do business. At Oracle, we believe that innovation starts with diversity and inclusion, and to create the future we need talent from various backgrounds, perspectives, and abilities. We ensure that individuals with disabilities are provided reasonable accommodation to successfully participate in the job application and interview process, in potential roles, and to perform crucial job functions. That’s why we’re committed to creating a workforce where all individuals can do their best work.
It’s when everyone’s voice is heard and valued that we’re inspired to go beyond what’s been done before. About Us As a world leader in cloud solutions, Oracle uses tomorrow’s technology to tackle today’s challenges. We’ve partnered with industry-leaders in almost every sector—and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That’s why we’re committed to growing an inclusive workforce that promotes opportunities for all. Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs. We’re committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States. Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans’ status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.
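As a further illustration of the streaming ingestion and data-quality responsibilities listed in this posting, here is a hypothetical sketch of an Auto Loader stream into a bronze Delta table followed by a simple validation gate. The paths, checkpoint locations, table names, and validation rules are illustrative assumptions only, not details of an actual pipeline.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("streaming-bronze-sketch").getOrCreate()

# Bronze: incremental file ingestion with Databricks Auto Loader.
bronze_query = (spark.readStream.format("cloudFiles")
                .option("cloudFiles.format", "json")
                .option("cloudFiles.inferColumnTypes", "true")
                .option("cloudFiles.schemaLocation", "/mnt/chk/orders/schema")
                .load("/mnt/landing/orders/")
                .withColumn("_ingested_at", F.current_timestamp())
                .writeStream.format("delta")
                .option("checkpointLocation", "/mnt/chk/orders/bronze")
                .trigger(availableNow=True)          # process the backlog, then stop
                .toTable("bronze.orders"))
bronze_query.awaitTermination()

# Silver: a basic quality gate; failing rows go to a quarantine table rather than being dropped silently.
bronze_df = spark.table("bronze.orders")
valid = bronze_df.filter(F.col("order_id").isNotNull() & (F.col("order_amount") > 0))
rejected = bronze_df.exceptAll(valid)
valid.write.format("delta").mode("append").saveAsTable("silver.orders")
rejected.write.format("delta").mode("append").saveAsTable("quarantine.orders_rejected")
```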
Posted 1 week ago
0.0 - 1.0 years
0 - 0 Lacs
Borivali, Mumbai, Maharashtra
On-site
If you have the creative spark in you and are passionate about advertising, Mirror is the right place to let your skills flourish. We are looking for candidates who have the zest to prove their skills in design and motion graphics and are ready to go beyond the boundaries of imagination and creativity. The candidate will work on multiple client projects, including social media work, digital campaigns, and concept design. Kindly share your updated resume and work portfolio at mirroradvertising1@gmail.com. Job Type: Full-time Pay: ₹25,000.00 - ₹30,000.00 per month Ability to commute/relocate: Borivali, Mumbai Suburban - 400092, Maharashtra: Reliably commute or planning to relocate before starting work (Required) Experience: Motion graphics: 1 year (Required) Work Location: In person Application Deadline: 11/08/2025
Posted 1 week ago
5.0 - 10.0 years
0 Lacs
Chennai, Tamil Nadu, India
Remote
JOB DESCRIPTION - DATA SCIENTIST Role: Data Scientist Experience: 5 to 10 Years Work Mode: Remote. Immediate Joiners Preferred.
About the Role: We are building an AI-powered workforce intelligence platform that helps businesses optimize talent strategies, enhance decision-making, and drive operational efficiency. Our software leverages cutting-edge AI, NLP, and data science to extract meaningful insights from vast amounts of structured and unstructured workforce data. As part of our new AI team, you will have the opportunity to work on real-world AI applications, contribute to innovative NLP solutions, and gain experience in building AI-driven products from the ground up.
Required Skills & Qualification • Strong experience in Python programming • 5-10 years of experience in Data Science/NLP (freshers with strong NLP projects are welcome). • Proficiency in Python, PyTorch, Scikit-learn, and NLP libraries (NLTK, SpaCy, Hugging Face). • Basic knowledge of cloud platforms (AWS, GCP, or Azure). • Experience with retrieval, machine learning, artificial intelligence, generative AI, semantic search, reranking, and evaluating search performance. • Experience with SQL for data manipulation and analysis. • Assist in designing, training, and optimizing ML/NLP models using PyTorch, NLTK, Scikit-learn, and Transformer models (BERT, GPT, etc.). • Familiarity with MLOps tools like Airflow, MLflow, or similar. • Experience with Big Data processing (Spark, Pandas, or Dask). • Help deploy AI/ML solutions on AWS, GCP, or Azure. • Collaborate with engineers to integrate AI models into production systems. • Expertise in using SQL and Python to clean, preprocess, and analyze large datasets. • Learn & Innovate – Stay updated with the latest advancements in NLP, AI, and ML frameworks. • Strong analytical and problem-solving skills. • Willingness to learn, experiment, and take ownership in a fast-paced startup environment.
Nice to Have Requirements for the Candidate • Desire to grow within the company • Team player and quick learner • Performance-driven • Strong networking and outreach skills • Exploratory aptitude and a go-getter attitude • Ability to communicate and collaborate with the team at ease. • Drive to get results and not let anything get in your way. • Critical and analytical thinking skills, with keen attention to detail. • Demonstrate ownership and strive for excellence in everything you do. • Demonstrate a high level of curiosity and keep abreast of the latest technologies & tools. • Ability to pick up new software easily, represent yourself among peers, and coordinate during meetings with customers.
What We Offer: - We offer a market-leading salary along with a comprehensive benefits package to support your well-being. - Enjoy a hybrid or remote work setup that prioritizes work-life balance and personal wellbeing. - We invest in your career through continuous learning and internal growth opportunities. - Be part of a dynamic, inclusive, and vibrant workplace where your contributions are recognized and rewarded. - We believe in straightforward policies, open communication, and a supportive work environment where everyone thrives.
About the Company: https://predigle.com/ https://www.espergroup.com/ Predigle, an EsperGroup company, focuses on building disruptive technology platforms to transform daily business operations. Predigle has expanded rapidly to offer various products and services. 
Predigle Intelligence (Pi) is a comprehensive portable AI platform that offers a low-code/no-code AI design solution for solving business problems.
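For context on the semantic search and reranking skills listed above, here is a small, hypothetical sketch of a two-stage retrieve-then-rerank flow using the open-source sentence-transformers library. The model checkpoints are common public ones and the toy documents are invented; this is not the company's actual system.

```python
from sentence_transformers import SentenceTransformer, CrossEncoder, util

documents = [
    "Quarterly attrition rose in the sales organisation.",
    "Engineering headcount grew 12% year over year.",
    "Benefits enrolment deadlines were extended to March.",
]
query = "Which teams are losing people?"

# Stage 1: bi-encoder retrieval by cosine similarity over dense embeddings.
encoder = SentenceTransformer("all-MiniLM-L6-v2")
doc_emb = encoder.encode(documents, convert_to_tensor=True)
query_emb = encoder.encode(query, convert_to_tensor=True)
hits = util.semantic_search(query_emb, doc_emb, top_k=3)[0]

# Stage 2: cross-encoder reranking of the retrieved candidates.
reranker = CrossEncoder("cross-encoder/ms-marco-MiniLM-L-6-v2")
scores = reranker.predict([(query, documents[h["corpus_id"]]) for h in hits])
for score, hit in sorted(zip(scores, hits), key=lambda x: -x[0]):
    print(f"{score:.3f}  {documents[hit['corpus_id']]}")
```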
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Join Amgen’s Mission of Serving Patients At Amgen, if you feel like you’re part of something bigger, it’s because you are. Our shared mission—to serve patients living with serious illnesses—drives all that we do. Since 1980, we’ve helped pioneer the world of biotech in our fight against the world’s toughest diseases. With our focus on four therapeutic areas – Oncology, Inflammation, General Medicine, and Rare Disease – we reach millions of patients each year. As a member of the Amgen team, you’ll help make a lasting impact on the lives of patients as we research, manufacture, and deliver innovative medicines to help people live longer, fuller, happier lives. Our award-winning culture is collaborative, innovative, and science-based. If you have a passion for challenges and the opportunities that lie within them, you’ll thrive as part of the Amgen team. Join us and transform the lives of patients while transforming your career. Data Engineer What You Will Do Let’s do this. Let’s change the world. In this vital role you will be responsible for designing, building, maintaining, analyzing, and interpreting data to provide actionable insights that drive business decisions. This role involves working with large datasets, developing reports, supporting and implementing data initiatives, and visualizing data to ensure data is accessible, reliable, and efficiently managed. The ideal candidate has strong technical skills, experience with big data technologies, and a deep understanding of data architecture and ETL processes. Design, develop, and maintain data solutions for data generation, collection, and processing Be a key team member that assists in design and development of the data pipeline Create data pipelines and ensure data quality by implementing ETL processes to migrate and deploy data across systems Contribute to the design, development, and implementation of data pipelines, ETL/ELT processes, and data integration solutions Develop and maintain data models, data dictionaries, and other documentation to ensure data accuracy and consistency Implement data security and privacy measures to protect sensitive data Leverage cloud platforms (AWS preferred) to build scalable and efficient data solutions Collaborate and communicate effectively with product teams Collaborate with Data Architects, Business SMEs, and Data Scientists to design and develop end-to-end data pipelines to meet fast-paced business needs across geographic regions Identify and resolve complex data-related challenges Adhere to standard processes for coding, testing, and designing reusable code/components Explore new tools and technologies that will help to improve ETL platform performance Participate in sprint planning meetings and provide estimations on technical implementation What We Expect Of You We are all different, yet we all use our unique contributions to serve patients. Basic Qualifications: Master’s degree / bachelor’s degree and 5 to 9 years’ experience in Computer Science, IT or related field Must Have: Hands-on experience with big data technologies and platforms, such as Databricks, Apache Spark (PySpark, SparkSQL), Snowflake, workflow orchestration, performance tuning on big data processing Proficiency in data analysis tools (e.g.,
SQL) Proficient in SQL for extracting, transforming, and analyzing complex datasets from relational data stores Experience with ETL tools such as Apache Spark, and various Python packages related to data processing, machine learning model development Strong understanding of data modeling, data warehousing, and data integration concepts Proven ability to optimize query performance on big data platforms Experience in Real World Data/ Health Care Preferred Qualifications: Experience with Software engineering best-practices, including but not limited to version control, infrastructure-as-code, CI/CD, and automated testing Knowledge of Python/R, Databricks, SageMaker, cloud data platforms Strong understanding of data governance frameworks, tools, and best practices. Knowledge of data protection regulations and compliance requirements (e.g., GDPR, CCPA) Professional Certifications: Databricks Certificate preferred AWS Data Engineer/Architect Soft Skills: Excellent critical-thinking and problem-solving skills Strong communication and collaboration skills Demonstrated awareness of how to function in a team setting Demonstrated presentation skills What You Can Expect Of Us As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we’ll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards. Apply now and make a lasting impact with the Amgen team. careers.amgen.com As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease. Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
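As a small illustration of the Spark SQL and performance-tuning skills this role calls for, the sketch below deduplicates a claims-style dataset with a window function and writes it partitioned so downstream queries can prune files. The table names, columns, and partitioning key are assumptions for illustration only and not a description of Amgen's actual pipeline.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("claims-curation-sketch").getOrCreate()

# Keep only the latest version of each claim using a window function in Spark SQL.
latest_claims = spark.sql("""
    SELECT * FROM (
        SELECT c.*,
               ROW_NUMBER() OVER (PARTITION BY claim_id ORDER BY updated_at DESC) AS rn
        FROM raw.claims c
    ) WHERE rn = 1
""").drop("rn")

# Partition by service month so queries filtered on recent months scan fewer files.
(latest_claims
 .withColumn("service_month", F.date_format("service_date", "yyyy-MM"))
 .write.format("delta")
 .mode("overwrite")
 .partitionBy("service_month")
 .saveAsTable("curated.claims_latest"))
```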
Posted 1 week ago
5.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Job Title - Principal / Senior Software Engineer (Java) Job Location - Baner, Pune, Maharashtra Domain - Security About the Role - Are you a passionate Software Engineer who has a proven track record of solving complex problems and being at the forefront of innovation? Pursuing a career at our client will allow you to write code and manipulate data in ways that have never been done before, driving automation of threat detection and response for one of the world’s fastest-growing industries. You will lead the creation, testing, and deployment of cutting-edge security technology to enterprise customers across the globe. Above all else, this role will allow you to work with and learn from some of the most talented people in the business, as well as make a direct contribution to the growth and success of the company. The everyday hustle: Research and develop creative solutions across a wide range of cutting-edge technologies to continuously evolve our client’s platform. Create REST APIs and integrations between various products to improve and automate our customer’s threat detection. Manage the continuous integration and deployment processes of complex technologies. Perform code reviews to ensure consistent improvement. Proactively automate and improve all stages of the software development lifecycle. Interface closely with various parts of the business, both internally and externally, to ensure all users are leveraging the product with ease and to its full potential. Provide training and support to other team members as well as cultivate a culture of constant collaboration. Do you have what it takes? 5+ years of software development experience in Java, Spring Boot, and microservices. Must be proficient in the English language, both written and verbal. Knowledge of and ability to apply application security principles to our software process. What makes you uncommon? Hands-on experience with one or more of the following technologies (Elasticsearch, Kafka, Apache Spark, Logstash, Hadoop/Hive, TensorFlow, Kibana, Athena/Presto/BigTable, Angular, React). Experience with cloud platforms such as AWS, GCP, or Azure. Solid understanding of unit testing, continuous integration and deployment practices. Experience with Agile Methodology. Higher education/relevant certifications.
Posted 1 week ago
0.0 years
0 Lacs
Pune, Maharashtra
On-site
Job description We are looking for skilled Data Engineering Trainees to work on some complex big data projects. This is an internship program which will eventually convert into employment if the candidate’s performance is found excellent. Job Terms This is a work-from-office internship. Job location is Pune, Maharashtra Internship duration will be 6 months A bond of 30 months, including the internship period, applies if the internship is converted to a job Must have taken formal training in Python & Spark Work Experience No experience is expected Candidates who graduated in 2023 or earlier will be preferred. Job Responsibilities Build scalable streaming data pipelines Write complex SQL queries to transform the source data Write enterprise-grade code which is stable and does not break Data pipeline deployment by working closely with the DevOps team Build automated job scheduling and monitoring scripts Skills Exceptional programming skills in Python, Spark, Kafka, PySpark and C++ Very good knowledge of SQL and complex query writing Very good knowledge of Pandas and NumPy Must be familiar with exploratory data analysis and data pre-processing Knowledge of Databricks will be an advantage Education B.Tech/BE graduates who passed out in 2023 or earlier Job Types: Full-time, Permanent Pay: ₹5,000.00 per month Schedule: Day shift Job Types: Full-time, Fresher, Internship Contract length: 4 months Pay: ₹5,000.00 per month Work Location: In person
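For applicants wondering what a scalable streaming pipeline of the kind described above can look like, here is a minimal, hypothetical Kafka-to-Spark Structured Streaming sketch. The broker address, topic, and event schema are assumptions, and the spark-sql-kafka connector package is assumed to be available on the classpath.

```python
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType

spark = SparkSession.builder.appName("kafka-stream-sketch").getOrCreate()

schema = StructType([
    StructField("event_id", StringType()),
    StructField("user_id", StringType()),
    StructField("amount", DoubleType()),
])

# Read a JSON-encoded topic and parse the value column into typed fields.
events = (spark.readStream.format("kafka")
          .option("kafka.bootstrap.servers", "localhost:9092")  # assumed broker
          .option("subscribe", "transactions")                  # assumed topic
          .load()
          .select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
          .select("e.*"))

# A simple stateful transformation: running per-user totals.
totals = events.groupBy("user_id").agg(F.sum("amount").alias("total_amount"))

query = (totals.writeStream.outputMode("complete")
         .format("console")                                     # swap for a Delta or database sink in practice
         .option("truncate", "false")
         .start())
query.awaitTermination()
```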
Posted 1 week ago
4.0 years
0 Lacs
Mumbai Metropolitan Region
On-site
Established in 2004, OLIVER is the world’s first and only specialist in designing, building, and running bespoke in-house agencies and marketing ecosystems for brands. We partner with over 300 clients in 40+ countries and counting. Our unique model drives creativity and efficiency, allowing us to deliver tailored solutions that resonate deeply with audiences. As a part of The Brandtech Group, we’re at the forefront of leveraging cutting-edge AI technology to revolutionise how we create and deliver work. Our AI solutions enhance efficiency, spark creativity, and drive insightful decision-making, empowering our teams to produce innovative and impactful results. Role: Account Lead Location: Mumbai, India About the role: Working in true collaboration with our client, we have one goal in mind: ‘to be the leading agency partner for the development of stunning and effective needs-based content and digital media campaigns’ and ‘to be the core craft and digital partner powering global and regional creative excellence, achieved by elevating our people, expertise and culture to deliver shared success’. This is a great opportunity for a level-headed, strategic thinker who has gravitas with a client and would like to play a big part in defining and advancing a positive and highly productive design culture for OLIVER. The Account Lead is the client’s partner to bounce things around with and to own the relationship, and they are the agency leader responsible for projects, the team resource required and the work to be delivered. They think ahead, spotting problems before they occur, managing client and internal expectations, supporting their team both managing up and down, and contingency planning as they go. They develop close relationships and a sense of partnership with their clients and foster trust based on their ability to deliver. This trust enables them to develop, nurture and protect the best work possible through the client process. Internally they have strong relationships with all team members and a thorough understanding of how to lead their team to create great work and help problem-solve when it’s needed. They work in partnership with both Strategy and Creative, and are fluent in the strategic debates regarding their brands. They are passionate about our creative product and know what best-in-class looks like both in terms of creativity and results. What you will be doing: Ability to manage a project with the client from beginning to end To provide leadership and expertise to the onsite creative team at Unilever Identify the best resource within your team to deliver the brief. Schedule and manage team priorities and deadlines across client projects Championing a lasting and strategic partnership that cultivates a client experience to engage and delight Financial accountability. Stellar project management, fully responsible for financial management regarding jobs/accounts including forecasting. Process Development and fulfilment – maintaining ongoing communications, both internal and external, to keep processes and resources streamlined Brand guardianship Presenting your work internally and to clients and manage workloads within agreed timings. 
Resource Management - Working alongside the Creative Director to ensure you have the right people at the right time to deliver to client/ project needs What you need to be great in this role: 4 to 6 years of experience working with major FMCG clients, as well as beauty or cosmetic brands a huge bonus Understanding of how to integrate with a client-side team whilst maintaining a top tier agency service. Excellent client engagement skills with the ability to proactively organise and influence clients and build strong and effective working relationships. Passion for deep-diving into a client’s business to get under the skin of it and fully understand their brands, products, and ways of working The ability to manage and filter workflow as well as organise and prioritise workloads to maximise productivity. A strong understanding and experience of working with end to end digital creative solutions; particularly across Social Media, eCommerce and Social Commerce Knowledge of account management, project management and invoicing. Highly creative with the ability to generate ideas and practically contribute to studio output. Ambition to push for the best and create award-winning work Embodies the “can-do attitude” and is seen as a constant positive force on the team Passion for and inquisitive about AI and new technologies Understanding and knowledge of AI tools is beneficial, but ability to learn and digest benefits and features of AI tools is critical Req ID: 13373 Our values shape everything we do: Be Ambitious to succeed Be Imaginative to push the boundaries of what’s possible Be Inspirational to do groundbreaking work Be always learning and listening to understand Be Results-focused to exceed expectations Be actively pro-inclusive and anti-racist across our community, clients and creations OLIVER, a part of the Brandtech Group, is an equal opportunity employer committed to creating an inclusive working environment where all employees are encouraged to reach their full potential, and individual differences are valued and respected. All applicants shall be considered for employment without regard to race, ethnicity, religion, gender, sexual orientation, gender identity, age, neurodivergence, disability status, or any other characteristic protected by local laws. OLIVER has set ambitious environmental goals around sustainability, with science-based emissions reduction targets. Collectively, we work towards our mission, embedding sustainability into every department and through every stage of the project lifecycle.
Posted 1 week ago
5.0 years
0 Lacs
Mumbai Metropolitan Region
On-site
Established in 2004, OLIVER is the world’s first and only specialist in designing, building, and running bespoke in-house agencies and marketing ecosystems for brands. We partner with over 300 clients in 40+ countries and counting. Our unique model drives creativity and efficiency, allowing us to deliver tailored solutions that resonate deeply with audiences. As a part of The Brandtech Group , we're at the forefront of leveraging cutting-edge AI technology to revolutionise how we create and deliver work. Our AI solutions enhance efficiency, spark creativity, and drive insightful decision-making, empowering our teams to produce innovative and impactful results. Role: Performance Marketing Manager (Marketplaces) Location: Mumbai, India About the role: We are looking for a Performance Marketing Manager (Marketplaces) to lead and grow our premium skin cleansing category online business. This role will be responsible for managing e-commerce platforms, global marketplaces, and ensuring the achievement of annual revenue and profitability targets. The ideal candidate will develop and execute strategies to enhance customer experience, optimize conversion rates, and drive online sales growth. What you will be doing: Ensure premium skin cleansing category is optimally positioned across various e-commerce platforms. Drive top-line revenue growth while maintaining profitability within the allocated budget. Develop and execute strategies to improve site performance, including conversion rates, AOV, and other key metrics. Manage website content and merchandising to align with brand campaigns, seasonal promotions, and marketing strategies. Negotiate budgets, set performance goals, and report on key financial and operational metrics. Plan and execute brand campaigns across digital platforms, optimizing for performance and engagement. Build brand awareness and recognition through targeted online initiatives. Develop and refine processes to enhance customer retention and loyalty. Utilize web analytics to analyze user experience across touchpoints and implement improvements. Drive revenue through audience segmentation, list growth, and optimized marketing efforts. What you need to be great in this role: Bachelor’s degree in Marketing, Business Administration, or a related field. 5+ years of experience in e-commerce management, preferably in the beauty or skincare industry. Strong understanding of D2C e-commerce frameworks and digital marketing strategies. Ability to manage multiple projects in a fast-paced, high-growth environment. High attention to detail with the capability to balance multiple priorities effectively. Creative and strategic thinker with a customer-first approach. Strong interpersonal and communication skills, with experience collaborating across teams and functions. Passion for and inquisitive about AI and new technologies Understanding and knowledge of AI tools is beneficial, but ability to learn and digest benefits and features of AI tools is critical This is an exciting opportunity to shape and grow premium skin cleansing category’s D2C business , driving innovation and success in the online beauty space. 
Req ID: 12580 Our values shape everything we do: Be Ambitious to succeed Be Imaginative to push the boundaries of what’s possible Be Inspirational to do groundbreaking work Be always learning and listening to understand Be Results-focused to exceed expectations Be actively pro-inclusive and anti-racist across our community, clients and creations OLIVER, a part of the Brandtech Group, is an equal opportunity employer committed to creating an inclusive working environment where all employees are encouraged to reach their full potential, and individual differences are valued and respected. All applicants shall be considered for employment without regard to race, ethnicity, religion, gender, sexual orientation, gender identity, age, neurodivergence, disability status, or any other characteristic protected by local laws. OLIVER has set ambitious environmental goals around sustainability, with science-based emissions reduction targets. Collectively, we work towards our mission, embedding sustainability into every department and through every stage of the project lifecycle.
Posted 1 week ago
20.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Job Description Job Title: Senior Software Engineer Location: Bangalore Position Type: Full-time Position Level: 3 Who We Are Xactly is a leader in Sales Performance Management Solutions and has been part of the Vista Equity Partners portfolio of companies since 2017. The Xactly Intelligent Revenue Platform helps businesses improve go-to-market outcomes through increased collaboration, greater efficiencies, and connecting data from all critical functions of the revenue lifecycle on a single platform. Born in the cloud almost 20 years ago, Xactly provides customers with extensive experience in solving the most challenging problems customers of all sizes face, backed by almost 20 years of proprietary data and award-winning AI. Named among the best workplaces in the U.S. by Great Place to Work six times, honored on FORTUNE Magazine’s inaugural list of the 100 Best Workplaces for Millennials, and chosen as the "Market Leader in Incentive Compensation" by CRM magazine. We’re building a culture of success and are looking for motivated professionals to join us! The Team The Xactly development team is full of smart engineers from top companies and universities, and they execute quickly! In order to build and ship high-quality products extremely fast, efficiently and in a continuous manner, Xactly engineers rely on their leaders to remove any obstacles and guide them through engineering practices. At Xactly, we build teams that are helpful, respect each other, maintain a high level of customer focus, are inclusive of everyone, and strive for strong product ownership by the team. The Opportunity The ideal candidate will be extremely proficient and proven in the design and implementation of modern web application architectures. You must be strong in all aspects of microservices, data access layers, API-first design, single-click deployment and technologies such as Scala, NoSQL key-value data stores, Hadoop, Spark, Chef and containers. Not only do we offer strong growth opportunities for top performers, but we also have a top-notch culture, benefits and more. Our strong C.A.R.E. values – Customer Focus, Accountability, Respect & Excellence – guide our every move, allowing us to be a leader in the incentive compensation & performance management market. We set the example with excellent customer experience and deliver an award-winning SaaS (Software-as-a-Service) product! At Xactly, we believe everyone has a unique story to tell, and these small differences between us have a big impact. When bright, diverse minds come together, we’re challenged to think in different ways, generate creative ideas, be more innovative, and take on new perspectives. Our customers come from different cultures and walks of life all around the world, and we believe our teams should reflect that to build strong and lasting relationships. Required Skills Master’s plus 5 years or bachelor’s plus 8 years of experience in web application development and architecture. Extensive experience using open-source software libraries Strong experience in at least one MVC architecture or application of the pattern Solid hands-on experience with Java Strong experience with SQL (Oracle, MySQL, Postgres) Strong experience in Spring Boot and REST Services Must have built end-to-end continuous integration and deployment infrastructure for microservices Strong commitment to good engineering discipline and process including code reviews and delivering unit tests in conjunction with feature delivery Must possess excellent communication and teamwork skills. 
Strong presentation and facilitation skills are required. Self-starter that is results focused with the ability to work independently and in teams. Good To Have Prior experience building modular, common and scalable services Experience using chef, puppet or other deployment automation tools Experience working within a distributed engineering team including offshore Bonus points if you have contributed to an open source project Familiarity and experience with agile (scrum) development process Proven track record of identifying and championing new technologies that enhance the end-user experience, software quality, and developer productivity WITHIN ONE MONTH, YOU’LL Become familiar with the code base, development processes, and deployments. Become familiar with the product as customers will use it. You may even have your first PR approved and in production. WITHIN THREE MONTHS, YOU’LL Become a contributor to the overall code base. Have PRs approved and deployed to production Contribute to design WITHIN SIX MONTHS, YOU’LL Working more autonomously and closer with product Helping troubleshoot issues Contribute new ideas to the product and development WITHIN TWELVE MONTHS, YOU’LL Become a UI expert for your project. Take full ownership of features and processes of the product Benefits and Perks Comprehensive Insurance Coverage Tuition Reimbursement XactlyFit Gym/Fitness Program Reimbursement Kitchen Stocked Daily with Tasty Snacks, Fruit and Drinks Free Parking and Subsidized Bus Pass (a go-green initiative!) About Xactly Corporation Xactly is a leading provider of enterprise-class, cloud-based, incentive compensation solutions for employee and sales performance management. We address a critical business need: To incentivize employees and align their behaviors with company goals. Our products allow organizations to make more strategic decisions, increase employee performance, improve margins, and mitigate risk. Our core values are key to our success, and each day we’re committed to upholding them by delivering the best we can to our customers. Xactly is proud to be an Equal Opportunity Employer. Xactly provides equal employment opportunities to all employees and applicants without regard to race, color, religion, sex, age, national origin, disability, veteran status, pregnancy, sexual orientation, or any other characteristic protected by law. This means we believe in celebrating diversity and creating an inclusive workplace environment, where everyone feels valued, heard, and has a sense of belonging. By doing this, everyone in the Xactly family has the power to make a difference and unleash their full potential. We do not accept resumes from agencies, headhunters, or other suppliers who have not signed a formal agreement with us.
Posted 1 week ago
0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Location: Bengaluru We are looking for outstanding talent for an individual contributor role to help build a new data lakehouse. Proven experience as a data engineer (IC, hands-on role) or in a similar role, with a focus on cloud distributed data processing platforms for Spark and modern open table formats like Delta/Iceberg. Solid experience with Azure: Synapse Analytics, Data Factory, Data Lake, Databricks, Microsoft Purview, Monitor, SQL Database, SQL Managed Instance, Stream Analytics, Cosmos DB, Storage Services, ADLS, Azure Functions, Log Analytics, Serverless Architecture, ARM Templates. Strong proficiency in Spark, SQL, and Python/Scala/Java. Must-have Skills: Python, Spark, Azure Data Factory, Azure Fabric, Azure Functions, ADLS, SQL, Azure SQL, Log Analytics Experience in building lakehouse architecture using open-source table formats like Delta and Parquet and tools like Jupyter Notebook. Strong notions of security standard methodologies (e.g., using Azure Key Vault, IAM, RBAC, Monitor etc.). Proficient in integrating, redefining, and consolidating data from various structured and unstructured data systems into a structure that is suitable for building analytics solutions. Understand the data through exploration; experience with processes related to data retention, validation, visualization, preparation, matching, fragmentation, segmentation, and improvement. Demonstrates ability to understand business requirements. Agile development processes (SCRUM and Kanban). Good communication, presentation, documentation, and social skills. Able to self-manage and work independently in a fast-paced environment with multifaceted requirements and priorities. LSEG is a leading global financial markets infrastructure and data provider. Our purpose is driving financial stability, empowering economies and enabling customers to create sustainable growth. Our purpose is the foundation on which our culture is built. Our values of Integrity, Partnership, Excellence and Change underpin our purpose and set the standard for everything we do, every day. They go to the heart of who we are and guide our decision making and everyday actions. Working with us means that you will be part of a dynamic organisation of 25,000 people across 65 countries. However, we will value your individuality and enable you to bring your true self to work so you can help enrich our diverse workforce. You will be part of a collaborative and creative culture where we encourage new ideas and are committed to sustainability across our global business. You will experience the critical role we have in helping to re-engineer the financial ecosystem to support and drive sustainable economic growth. Together, we are aiming to achieve this growth by accelerating the just transition to net zero, enabling growth of the green economy and creating inclusive economic opportunity. LSEG offers a range of tailored benefits and support, including healthcare, retirement planning, paid volunteering days and wellbeing initiatives. We are proud to be an equal opportunities employer. This means that we do not discriminate on the basis of anyone’s race, religion, colour, national origin, gender, sexual orientation, gender identity, gender expression, age, marital status, veteran status, pregnancy or disability, or any other basis protected under applicable law. Conforming with applicable law, we can reasonably accommodate applicants' and employees' religious practices and beliefs, as well as mental health or physical disability needs. 
Please take a moment to read this privacy notice carefully, as it describes what personal information London Stock Exchange Group (LSEG) (we) may hold about you, what it’s used for, and how it’s obtained, your rights and how to contact us as a data subject. If you are submitting as a Recruitment Agency Partner, it is essential and your responsibility to ensure that candidates applying to LSEG are aware of this privacy notice.
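To give a concrete flavour of the lakehouse work this role describes, here is a hypothetical sketch that lands a Parquet extract from ADLS Gen2 into a Delta table with schema evolution enabled and registers it for SQL access. The storage account, container, paths, and table names are invented, and authentication is assumed to come from the cluster or managed identity configuration rather than being shown here.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("adls-delta-sketch").getOrCreate()

source_path = "abfss://landing@examplelakestore.dfs.core.windows.net/trades/2024/"
target_path = "abfss://lakehouse@examplelakestore.dfs.core.windows.net/delta/trades/"

trades = spark.read.parquet(source_path)

# mergeSchema lets new optional columns flow through without a manual DDL change.
(trades.write.format("delta")
 .mode("append")
 .option("mergeSchema", "true")
 .save(target_path))

# Register the path-based table (assumes the 'lakehouse' database exists) so analysts can query it with SQL.
spark.sql(f"CREATE TABLE IF NOT EXISTS lakehouse.trades USING DELTA LOCATION '{target_path}'")
```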
Posted 1 week ago
3.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Description About Amazon.com: Amazon.com strives to be Earth's most customer-centric company where people can find and discover virtually anything they want to buy online. By giving customers more of what they want - low prices, vast selection, and convenience - Amazon.com continues to grow and evolve as a world-class e-commerce platform. Amazon's evolution from Web site to e-commerce partner to development platform is driven by the spirit of innovation that is part of the company's DNA. The world's brightest technology minds come to Amazon.com to research and develop technology that improves the lives of shoppers and sellers around the world. Overview of the role The Business Research Analyst will be responsible for the data and machine learning part of continuous improvement projects across the compatibility and basket-building space. This will require collaboration with local and global teams, which have process and technical expertise. Therefore, the RA should be a self-starter who is passionate about discovering and solving complicated problems, learning complex systems, working with numbers, and organizing and communicating data and reports. In the compatibility program, the RA performs big data analysis to identify patterns and trains models to generate product-to-product and product-to-brand & model relationships. The RA also continuously improves the ML solution for higher accuracy, efficiency and scalability. The RA writes clear and detailed functional specifications based on business requirements, and writes and reviews business cases. Key job responsibilities Scoping, driving and delivering complex projects across multiple teams. Performing root cause analysis by understanding the data need, pulling the data, analyzing it to form a hypothesis, and validating the hypothesis with data. Conducting a thorough analysis of large datasets to identify patterns, trends, and insights that can inform the development of NLP applications. Developing and implementing machine learning models and deep learning architectures to improve NLP systems. Designing and implementing core NLP tasks such as named entity recognition, classification and part-of-speech tagging. Diving deep to drive product pilots, building and analyzing large data sets, and constructing problem hypotheses that help steer the product feature roadmap, using Python, database tools (e.g., SQL, Spark) and ML platforms (TensorFlow, PyTorch). Conducting regular code reviews and implementing quality assurance processes to maintain high standards of code quality and performance optimization. Providing technical guidance and mentorship to junior team members and collaborating with external partners to integrate cutting-edge technologies. Finding scalable solutions for business problems by executing pilots and building deterministic and ML models (plugging into ready-made ML models with Python). Performing supporting research, conducting analysis for larger parts of projects, and effectively interpreting reports to identify opportunities, optimize processes, and implement changes within their part of the project. Coordinating design efforts between internal and external teams to develop optimal solutions for their part of the project for Amazon’s network. Ability to convince and interact with stakeholders at all levels, either to gather data and information or to execute and implement according to the plan. 
About The Team Amazon.com operates in a virtual, global eCommerce environment without boundaries, and operates a diverse set of businesses in 14 countries, including Retail, third-party marketplaces, eCommerce platforms, and web services for developers. The Retail Business Service (RBS) organization is a core part of leading customer experience and selling partner experience optimization. This team is part of the RBS Customer Experience business unit. The team’s primary role is to create and enhance retail selection on the worldwide Amazon online catalog. The compatibility program handled by this team has a direct impact on customer buying decisions and online user experience. The compatibility program aims to address customer purchase questions about whether two products work together, as well as reduce returns due to incompatibility. Basic Qualifications Ability to analyse and then articulate business issues to a wide range of audiences using strong data, written and verbal communication skills Good mastery of BERT and other NLP frameworks such as GPT-2, XLNet, and Transformer models Experience in NLP techniques such as tokenization, parsing, lexing, named entity recognition, sentiment analysis and spellchecking Strong problem-solving skills, creativity and ability to overcome challenges SQL/ETL, Automation Tools Relevant bachelor’s degree or higher 3+ years combined of relevant work experience in a related field/s (project management, customer advocate, product owner, engineering, business analysis) - Diverse experience will be favored, e.g., a mix of experience across different roles Be self-motivated and autonomous, with an ability to prioritize well and remain focused when working within a team located across several countries and time zones Preferred Qualifications 3+ years combined of relevant work experience in a related field/s (project management, customer advocate, product owner, engineering, business analysis) - Diverse experience will be favored, e.g., a mix of experience across different roles Understanding of machine learning concepts including developing models and tuning the hyper-parameters, as well as deploying models and building ML services Experience with computer vision algorithms and libraries such as OpenCV, TensorFlow, Caffe or PyTorch. Technical expertise, experience in data science and ML Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner. Company - ADCI - BLR 14 SEZ Job ID: A3031496
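For a sense of the core NLP tasks this role mentions (named entity recognition and classification in a product-compatibility setting), here is a small, hypothetical sketch using off-the-shelf Hugging Face pipelines. The model checkpoints are common public ones and the example sentence is invented; this is not the team's actual model.

```python
from transformers import pipeline

text = "The Anker 65W charger is compatible with the Lenovo ThinkPad X1 Carbon."

# Named entity recognition to pull out brand and product mentions.
ner = pipeline("ner", model="dslim/bert-base-NER", aggregation_strategy="simple")
for ent in ner(text):
    print(ent["entity_group"], ent["word"], round(ent["score"], 3))

# Zero-shot classification to judge whether the sentence asserts compatibility.
classifier = pipeline("zero-shot-classification", model="facebook/bart-large-mnli")
result = classifier(text, candidate_labels=["compatible", "incompatible", "unrelated"])
print(result["labels"][0], round(result["scores"][0], 3))
```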
Posted 1 week ago