5.0 - 12.0 years
0 Lacs
Coimbatore, Tamil Nadu, India
On-site
Data Software Engineer | Chennai & Coimbatore | Walk-in on 2 Aug 25 | Hybrid Role

Requirements:
• 5-12 years of experience in Big Data and data-related technologies
• Expert-level understanding of distributed computing principles
• Expert-level knowledge of and experience with Apache Spark
• Hands-on programming with Python
• Proficiency with Hadoop v2, MapReduce, HDFS, Sqoop
• Experience building stream-processing systems using technologies such as Apache Storm or Spark Streaming
• Good understanding of Big Data querying tools such as Hive and Impala
• Experience integrating data from multiple sources such as RDBMS (SQL Server, Oracle), ERP, and files
• Good understanding of SQL queries, joins, stored procedures, and relational schemas
• Experience with NoSQL databases such as HBase, Cassandra, MongoDB
• Knowledge of ETL techniques and frameworks
• Performance tuning of Spark jobs
• Experience with Azure Databricks
• Ability to lead a team efficiently
• Experience designing and implementing Big Data solutions
• Practitioner of Agile methodology
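The distributed computing principle behind Hadoop MapReduce and Spark that this role expects candidates to understand is the map, shuffle, reduce pattern. A conceptual pure-Python sketch (an illustration of the idea only, not a distributed implementation):

```python
# Word count expressed as map -> shuffle -> reduce, the pattern MapReduce
# frameworks distribute across a cluster. Function names are illustrative.
from collections import defaultdict
from itertools import chain

def map_phase(line):
    # Emit (word, 1) pairs, as a mapper task would.
    return [(word.lower(), 1) for word in line.split()]

def shuffle_phase(pairs):
    # Group values by key, as the framework's shuffle stage would.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Sum the counts for each word, as a reducer task would.
    return {key: sum(values) for key, values in groups.items()}

def word_count(lines):
    mapped = chain.from_iterable(map_phase(line) for line in lines)
    return reduce_phase(shuffle_phase(mapped))

counts = word_count(["spark makes big data simple", "big data big results"])
```

In a real cluster, the map and reduce phases run in parallel across nodes and the shuffle moves data between them; Spark's RDD and DataFrame APIs build on the same decomposition.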
Posted 1 day ago
0 years
0 Lacs
Mumbai Metropolitan Region
On-site
Intuitive Apps Inc. is one of the fastest-growing consulting companies, on a mission to provide the best digital transformation and intuitive experience for our customers.

The Role: Roles and Responsibilities
• Adherence to ISO 9001:2008, ISO 27001, policies, and procedures
• Proven experience troubleshooting security issues across various technologies
• Customer-centric career experience and excellent time-management skills
• Ability to work within a customer-focused team and excellent communication skills
• Take ownership of customer issues reported and see problems through to resolution
• Troubleshoot and resolve issues by sharing best practices and through direct resolution
• Excellent written and verbal communication and effective organizational and multitasking skills
• Proven ability to quickly learn new technical domains and train others
• Flexibility to work in an operational environment with rotational shifts and an on-call schedule
• Other general responsibilities as instructed by management

Ideal Profile:
• ITIL Framework knowledge; adherence to ISO 9001:2008, ISO 27001, policies, and procedures
• In-depth knowledge of SQL and PL/SQL; hands-on Oracle Database knowledge
• Well versed in shell scripting on Linux and Windows platforms
• Must accept rotational shifts (24x7) and banking working days
• Application support profile; knowledge of the banking domain and products
• IBM MQ support; JBoss, Apache Tomcat, and Java knowledge is desirable
• Hands-on with SWIFT, SFMS, NEFT/RTGS, Export, Import

What's on Offer?
• Leadership role
• Great work environment
• Attractive salary and benefits
Posted 1 day ago
0 years
0 Lacs
Navi Mumbai, Maharashtra, India
On-site
Intuitive Apps Inc. is one of the fastest-growing consulting companies, on a mission to provide the best digital transformation and intuitive experience for our customers.

The Role: Roles and Responsibilities
• Adherence to ISO 9001:2008, ISO 27001, policies, and procedures
• Proven experience troubleshooting security issues across various technologies
• Customer-centric career experience and excellent time-management skills
• Ability to work within a customer-focused team and excellent communication skills
• Take ownership of customer issues reported and see problems through to resolution
• Troubleshoot and resolve issues by sharing best practices and through direct resolution
• Excellent written and verbal communication and effective organizational and multitasking skills
• Proven ability to quickly learn new technical domains and train others
• Flexibility to work in an operational environment with rotational shifts and an on-call schedule
• Other general responsibilities as instructed by management

Ideal Profile:
• ITIL Framework knowledge; adherence to ISO 9001:2008, ISO 27001, policies, and procedures
• In-depth knowledge of SQL and PL/SQL; hands-on Oracle Database knowledge
• Well versed in shell scripting on Linux and Windows platforms
• Must accept rotational shifts (24x7) and banking working days
• Application support profile; knowledge of the banking domain and products
• IBM MQ support; JBoss, Apache Tomcat, and Java knowledge is desirable
• Hands-on with SWIFT, SFMS, NEFT/RTGS, Export, Import

What's on Offer?
• Leadership role
• Great work environment
• Attractive salary and benefits
Posted 1 day ago
0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Experience: 4 to 8 years

Key requirements:
· Managing database servers, storage, and network components
· Conducting regular database health checks and implementing proactive measures
· Designing, installing, upgrading, patching, tuning, monitoring, and troubleshooting databases with large amounts of data (terabytes) and high transaction rates
· Experience with Data Guard setup and management
· Experience defining security standards and supporting security audits
· Experience with Perl, Unix shell scripting, and Python for custom monitoring and automation
· Experience working with the development team during design sessions to provide performance and scaling inputs
· Experience troubleshooting Oracle database problems that may originate or exist in the supporting infrastructure, including the application server, network, storage, and operating system layers
· Exadata feature enablement, monitoring, and review and update of placement strategy
· Working knowledge of Exadata 7 and above platforms
· Monitoring resource utilization and Oracle Compute Unit (OCPU) enablement
· Good exposure to and hands-on experience with Oracle ExaCC (Exadata Cloud@Customer) architecture; managing and creating resources using the cloud dashboard
· Creating and managing Oracle SRs and supporting the team in issue resolution
· Strong experience in Oracle core database administration and monitoring
· Exposure to Exadata performance
· Experience migrating Oracle databases from on-premises to Oracle Exadata Cloud@Customer
· Experience with Oracle Enterprise Manager (OEM)
· Experience with patch/upgrade and release management (database, storage node, server, network, Oracle Exadata)
· Developing and maintaining system documentation related to your Oracle Exadata and software configuration
· Applying process and architectural improvements to continually improve the availability, capacity, and performance of database systems
· Communicating appropriately and efficiently with management, customers, and vendors
· Sustaining a team-oriented, fast-paced environment
· On-call rotation is required to support a 24x7 environment; the candidate is also expected to work outside business hours to support global delivery and must be capable of providing off-hours support to avoid impacting database availability during normal business hours
· Experience with RMAN database backup and recovery using ZDLRA
· Proficiency with real-time monitoring tools like Grafana

Background: The candidate should be degree-educated and have at least 6 years of solid experience as a Production Support DBA. German-speaking candidates have an added advantage.

Technology exposure:
· Monitoring and reporting tools: Tivoli Performance Viewer, Remedy, Word, Visio, Excel, etc.
· Servers: AIX, Solaris, Linux, and Windows Server
· Backup products: TSM, NetBackup, Networker
· Database stacks: Oracle, MS SQL, Informix
· Web and middleware stacks: WebLogic Application Server, MQ, Apache, Tomcat, and IIS
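The proactive health checks and custom monitoring mentioned above could be automated along these lines. A minimal pure-Python sketch of a threshold-based tablespace check; the threshold, names, and figures are illustrative sample data, not the output of a real query:

```python
# Hypothetical proactive health check: flag tablespaces whose usage crosses
# a warning threshold, as a custom monitoring/automation script might.
WARN_PCT = 85.0  # illustrative warning threshold

def usage_pct(used_mb, total_mb):
    # Percentage of allocated space in use, rounded to one decimal place.
    return round(100.0 * used_mb / total_mb, 1)

def check_tablespaces(stats, warn_pct=WARN_PCT):
    """Return (name, pct) for every tablespace at or above the threshold.

    stats: iterable of (tablespace_name, used_mb, total_mb) tuples,
    e.g. as fetched from DBA_* views in a real script.
    """
    alerts = []
    for name, used_mb, total_mb in stats:
        pct = usage_pct(used_mb, total_mb)
        if pct >= warn_pct:
            alerts.append((name, pct))
    return alerts

# Sample data standing in for a real query result.
sample = [("USERS", 900, 1000), ("SYSTEM", 400, 1000), ("DATA01", 860, 1000)]
alerts = check_tablespaces(sample)
```

A production version would fetch the figures from the database and feed alerts into the monitoring stack rather than returning a list.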
Posted 1 day ago
2.0 years
0 Lacs
Greater Bengaluru Area
On-site
About the Company 6thStreet.com is an omnichannel fashion & lifestyle destination that offers 1400+ fashion & beauty brands in the UAE, KSA, Kuwait, Oman, Bahrain & Qatar. Customers can shop the latest on-trend outfits, shoes, bags, beauty essentials and accessories from international brands such as Tommy Hilfiger, Calvin Klein, Hugo, Marks & Spencers, Dune London, Charles & Keith, Aldo, Crocs, Birkenstock, Skechers, Levi’s, Nike, Adidas, Loreal and Inglot amongst many more. 6thStreet.com recently opened GCC’s first phygital store at Dubai Hills Mall; an innovative tech-led space which combines the best of both online & offline shopping with online browsing & smart fitting rooms. Overview The ML Engineer will extract insights and build models that will drive key business decisions. The candidate will work closely with other data scientists, software engineers and product managers to design, build, optimize and deploy machine learning systems and solutions. This role is ideal for someone with a strong analytical mindset, a passion for data, and a desire to grow in a fast-paced e-commerce environment. 
Necessary Skills:
• Python: proficiency in Python, with knowledge of popular libraries like pandas, NumPy, SciPy, scikit-learn, TensorFlow, PyTorch
• SQL: strong ability to write and optimize complex SQL queries to extract and manipulate large datasets from relational databases
• Data Analysis & Visualization: ability to work with large datasets, extract meaningful insights, and leverage data visualization tools and libraries
• Data Wrangling & Preprocessing: expertise in cleaning and transforming raw data into structured formats
• Statistical Analysis: a solid understanding of descriptive and inferential statistics, including hypothesis testing and probability theory
• Machine Learning & Deep Learning: familiarity with supervised and unsupervised learning algorithms such as regression, tree-based methods, clustering, boosting, and bagging methodologies
• Machine Learning Workflows: feature engineering, model training, model optimization, validation, and evaluation
• ML Deployment: deploying machine learning models to production environments, ensuring they meet scalability, reliability, and performance requirements
• DevOps: Git, CI/CD pipelines, Dockerization, model versioning (MLflow), monitoring platforms
• Cloud Platforms: experience with cloud platforms like AWS, Google Cloud, or Azure for deploying models
• Problem-Solving & Analytical Thinking: ability to approach complex problems methodically and implement robust solutions
• Collaboration & Communication: strong ability to work with cross-functional teams and communicate technical concepts to non-technical stakeholders
• Adaptability & Learning: willingness to quickly learn new tools, technologies, and algorithms
• Attention to Detail: ability to carefully test and validate models, ensuring they work as intended in production

Good to Have:
• Familiarity with big data technologies such as Spark or Hadoop
• Object-oriented programming (OOP)
• Knowledge of data privacy and security practices when working with sensitive data
• Experience with big data tools (e.g., Apache Kafka, Apache Flink) for streaming data processing
• Familiarity with feature stores like Feast
• Experience working with e-commerce data

Responsibilities:
• Design and implement machine learning models, algorithms, and systems
• Build and maintain end-to-end machine learning pipelines: model training, validation, and deployment
• Experiment with different algorithms and approaches to optimize model performance
• Collaborate with software engineers, product managers, and analysts to build scalable, production-ready solutions
• Communicate complex technical concepts to non-technical stakeholders
• Stay updated with the latest advancements in machine learning and deep learning; evaluate and experiment with new tools, libraries, and algorithms that could improve model performance
• Collaborate on proof-of-concept (POC) projects to validate new approaches and techniques

Benefits:
• Full-time role
• Competitive salary
• Company employee discounts across all brands
• Medical and health insurance
• Collaborative work environment
• Good-vibes work culture

Qualifications:
• Bachelor's degree or equivalent experience in a quantitative field (Statistics, Mathematics, Computer Science, Engineering, etc.)
• At least 2 years of experience in quantitative analytics or data modeling and development
• Deep understanding of predictive modeling, machine learning, clustering and classification techniques, and algorithms
• Fluency in a programming language (Python, C, C++, Java, SQL)
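The model training, validation, and evaluation workflow listed above can be sketched end to end with a toy classifier. This is pure Python with illustrative data; a real pipeline would use scikit-learn estimators and proper cross-validation:

```python
# Holdout validation of a toy nearest-centroid classifier: fit on a training
# split, score on a held-out validation split. Data is illustrative only.
import math

def centroid(points):
    # Component-wise mean of a list of equal-length tuples.
    n = len(points)
    return tuple(sum(p[i] for p in points) / n for i in range(len(points[0])))

def fit(X, y):
    # "Training": compute one centroid per class label.
    by_label = {}
    for x, label in zip(X, y):
        by_label.setdefault(label, []).append(x)
    return {label: centroid(pts) for label, pts in by_label.items()}

def predict(model, x):
    # Assign x to the class whose centroid is nearest (Euclidean distance).
    return min(model, key=lambda label: math.dist(model[label], x))

def accuracy(model, X, y):
    hits = sum(predict(model, x) == label for x, label in zip(X, y))
    return hits / len(y)

# Holdout split: train on one portion, evaluate on unseen points.
X_train = [(0, 0), (0, 1), (5, 5), (6, 5)]
y_train = ["a", "a", "b", "b"]
X_val = [(1, 0), (5, 6)]
y_val = ["a", "b"]

model = fit(X_train, y_train)
val_acc = accuracy(model, X_val, y_val)
```

The point is the shape of the workflow, separate fit and evaluation data with a scalar metric at the end, which carries over unchanged to scikit-learn, TensorFlow, or PyTorch models.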
Posted 1 day ago
5.0 years
0 Lacs
Ahmedabad, Gujarat, India
Remote
Urgent Hiring | Freelancers
Position: Senior Full-Stack Developer (Java/Kotlin + Vanilla JavaScript)
Location: Remote
Employment Type: Freelancer
Experience Required: 5+ years in full-stack development

Must-Have Skills
Backend:
• Minimum 5 years of hands-on experience with Java (Spring Framework)
• Experience working with Kotlin and transitioning from Java
• Proficient with Gradle, Tomcat, and Apache server setup
• Strong SQL skills using MariaDB
• Understanding of monolithic architecture
• Experience with TeamCity CI/CD or similar tools
Frontend:
• Strong proficiency in vanilla JavaScript (5+ years)
• Ability to build high-performance UIs without frameworks
• Open to working with React in future developments
Testing & Tools:
• UI and API testing experience using Cypress
• Familiar with Git, IntelliJ IDEA, Slack, and Trello

Nice-to-Have:
• Experience in native Android (Kotlin) or iOS (Swift/Flutter) development
• Exposure to desktop app development (Electron, Swift, Kotlin Multiplatform)
• Familiarity with server scaling and performance optimization

Tech Stack Overview:
• Languages: Java, Kotlin, vanilla JS
• Backend framework: Spring
• Database: MariaDB
• Frontend frameworks: none currently; React planned for the future
• Infrastructure: Debian Linux server (no containers or virtualization)
• CI/CD: Gradle + TeamCity
• Testing: Cypress
• Version control & tools: Git, IntelliJ IDEA, Slack, Trello

Note: We prefer immediate joiners. If interested, kindly share your updated CV at hr@hashtechy.com

Thanks & Regards,
Yamini Patel
HR Manager
8511190784
Posted 1 day ago
7.0 - 10.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
We’re looking for a Cloud Architect / Lead to design, build, and manage scalable AWS infrastructure that powers our analytics and data product initiatives. This role focuses on automating infrastructure provisioning, application/API hosting, and enabling data and GenAI workloads through a modern, secure cloud environment.

Key Responsibilities:
• Design and provision AWS infrastructure using Terraform or AWS CloudFormation to support evolving data product needs
• Develop and manage CI/CD pipelines using Jenkins, AWS CodePipeline, CodeBuild, or GitHub Actions
• Deploy and host internal tools, APIs, and applications using ECS, EKS, Lambda, API Gateway, and ELB
• Provision and support analytics and data platforms using S3, Glue, Redshift, Athena, Lake Formation, and orchestration tools like Step Functions or Apache Airflow (MWAA)
• Implement cloud security, networking, and compliance using IAM, VPC, KMS, CloudWatch, CloudTrail, and AWS Config
• Collaborate with data engineers, ML engineers, and analytics teams to align infrastructure with application and data product requirements
• Support GenAI infrastructure, including Amazon Bedrock, SageMaker, or integrations with APIs like OpenAI

Requirements:
• 7-10 years of experience in cloud engineering, DevOps, or cloud architecture roles
• Strong hands-on expertise with the AWS ecosystem and tools listed above
• Proficiency in scripting (e.g., Python, Bash) and infrastructure automation
• Experience deploying containerized workloads using Docker, ECS, EKS, or Fargate
• Familiarity with data engineering and GenAI workflows is a plus
• AWS certifications (e.g., Solutions Architect, DevOps Engineer) are preferred
Posted 1 day ago
6.0 years
0 Lacs
Jaipur, Rajasthan, India
On-site
JOB DESCRIPTION: DATA ENGINEER (Databricks & AWS)

Overview:
As a Data Engineer, you will work with multiple teams to deliver solutions on the AWS Cloud using core cloud data engineering tools such as Databricks on AWS, AWS Glue, Amazon Redshift, Athena, and other Big Data-related technologies. This role focuses on building the next generation of application-level data platforms and improving recent implementations. Hands-on experience with Apache Spark (PySpark, SparkSQL), Delta Lake, Iceberg, and Databricks is essential.

Locations: Jaipur, Pune, Hyderabad, Bangalore, Noida

Responsibilities:
• Define, design, develop, and test software components/applications using AWS-native data services: Databricks on AWS, AWS Glue, Amazon S3, Amazon Redshift, Athena, AWS Lambda, Secrets Manager
• Build and maintain ETL/ELT pipelines for both batch and streaming data
• Work with structured and unstructured datasets at scale
• Apply data modeling principles and advanced SQL techniques
• Implement and manage pipelines using Apache Spark (PySpark, SparkSQL) and Delta Lake/Iceberg formats
• Collaborate with product teams to understand requirements and deliver optimized data solutions
• Utilize CI/CD pipelines with DBX and AWS for continuous delivery and deployment of Databricks code
• Work independently with minimal supervision and strong ownership of deliverables

Must Have:
• 6+ years of experience in Data Engineering on AWS Cloud
• Hands-on expertise in:
  o Apache Spark (PySpark, SparkSQL)
  o Delta Lake / Iceberg formats
  o Databricks on AWS
  o AWS Glue, Amazon Athena, Amazon Redshift
• Strong SQL skills and performance-tuning experience on large datasets
• Good understanding of CI/CD pipelines, especially using DBX and AWS tools
• Experience with environment setup, cluster management, user roles, and authentication in Databricks
• Databricks Certified Data Engineer – Professional certification (mandatory)
Good to Have:
• Experience migrating ETL pipelines from on-premises or other clouds to AWS Databricks
• Experience with Databricks ML or Spark 3.x upgrades
• Familiarity with Airflow, Step Functions, or other orchestration tools
• Experience integrating Databricks with AWS services in a secured, production-ready environment
• Experience with monitoring and cost optimization in AWS

Key Skills:
• Languages: Python, SQL, PySpark
• Big Data tools: Apache Spark, Delta Lake, Iceberg
• Databricks on AWS
• AWS services: AWS Glue, Athena, Redshift, Lambda, S3, Secrets Manager
• Version control & CI/CD: Git, DBX, AWS CodePipeline/CodeBuild
• Other: data modeling, ETL methodology, performance optimization
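The ETL pipelines above often apply change data capture to Delta Lake or Iceberg tables. The core upsert semantics resemble a Delta Lake MERGE INTO, and can be sketched with a pure-Python stand-in; the keys and event shapes here are illustrative, not a real Spark API:

```python
# Apply a batch of CDC events (insert/update/delete) to a target table,
# analogous in effect to a Delta Lake MERGE INTO with matched/not-matched
# clauses. The target is a dict keyed by primary key for illustration.
def apply_cdc(target, events):
    """target: dict {pk: row}; events: list of (op, pk, row) tuples."""
    for op, key, row in events:
        if op == "delete":
            target.pop(key, None)           # WHEN MATCHED ... DELETE
        else:
            # "insert" and "update" both upsert, like MERGE's
            # WHEN MATCHED UPDATE / WHEN NOT MATCHED INSERT clauses.
            target[key] = row
    return target

table = {1: {"name": "alice", "city": "Pune"}}
events = [
    ("insert", 2, {"name": "bob", "city": "Jaipur"}),
    ("update", 1, {"name": "alice", "city": "Noida"}),
    ("delete", 2, None),
]
table = apply_cdc(table, events)
```

In a real pipeline, the events would come from a CDC feed and the merge would run as a set-based Spark operation rather than a row-at-a-time loop, but the matched/not-matched logic is the same.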
Posted 2 days ago
0 years
0 Lacs
India
Remote
Company Description:
ThreatXIntel is a startup cybersecurity company specializing in protecting businesses and organizations from cyber threats. Our tailored services include cloud security, web and mobile security testing, cloud security assessment, and DevSecOps. We prioritize delivering affordable solutions that cater to the specific needs of our clients, regardless of their size. Our proactive approach to security involves continuous monitoring and testing to identify vulnerabilities before they can be exploited.

Role Description:
We are seeking an experienced GCP Data Engineer for a contract engagement focused on building, optimizing, and maintaining high-scale data processing pipelines using Google Cloud Platform services. You’ll work on designing robust ETL/ELT solutions, transforming large datasets, and enabling analytics for critical business functions. This role is ideal for a hands-on engineer with strong expertise in BigQuery, Cloud Composer (Airflow), Python, and Cloud SQL/PostgreSQL, with experience in distributed data environments and orchestration tools.
Key Responsibilities:
• Design, develop, and maintain scalable data pipelines and ETL/ELT workflows using GCP Composer (Apache Airflow)
• Work with BigQuery, Cloud SQL, and PostgreSQL to manage and optimize data storage and retrieval
• Build automation scripts and data transformations using Python (PySpark knowledge is a strong plus)
• Optimize queries for large-scale, distributed data processing systems
• Collaborate with cross-functional teams to translate business and analytics requirements into scalable technical solutions
• Support data ingestion from multiple structured and semi-structured sources, including Hive, MySQL, and NoSQL databases
• Apply HDFS and distributed file system experience where necessary
• Ensure data quality, reliability, and consistency across platforms
• Provide ongoing maintenance and support for deployed pipelines and services

Required Qualifications:
• Strong hands-on experience with GCP services, particularly BigQuery, Cloud Composer (Apache Airflow), and Cloud SQL/PostgreSQL
• Proficiency in Python for scripting and data pipeline development
• Experience designing and optimizing high-volume data processing workflows
• Good understanding of distributed systems, HDFS, and parallel processing frameworks
• Strong analytical and problem-solving skills
• Ability to work independently and collaborate across remote teams
• Excellent communication skills for technical and non-technical audiences

Preferred Skills:
• Knowledge of PySpark for big data processing
• Familiarity with Hive, MySQL, and NoSQL databases
• Experience with Java in a data engineering context
• Exposure to data governance, access control, and cost optimization on GCP
• Prior experience in a contract or freelance capacity with enterprise clients
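An orchestrator such as Cloud Composer (Airflow) derives a valid run order for pipeline tasks from their declared dependencies. The underlying idea is a topological sort of the task graph, which can be sketched with the standard library's graphlib; the task names are illustrative, not a real DAG definition:

```python
# Resolve a valid execution order for pipeline tasks from a dependency
# graph, the way an orchestrator schedules upstream tasks before
# downstream ones. Task names are hypothetical.
from graphlib import TopologicalSorter

# task -> set of upstream tasks it depends on
dag = {
    "extract": set(),
    "transform": {"extract"},
    "load_bigquery": {"transform"},
    "data_quality_check": {"load_bigquery"},
}

order = list(TopologicalSorter(dag).static_order())
```

Airflow additionally runs independent tasks in parallel and retries failures; `static_order()` only shows the dependency-respecting sequence, which is the invariant every scheduler must preserve.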
Posted 2 days ago
7.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Job Title: Data Engineer
Experience: 7+ years
Location: Gurugram (on-site)
Job Type: Full-time

Job Description:
Skills: data warehousing, architectural patterns, modern data engineering tools and frameworks, AWS, SQL, file formats

• A seasoned Data Engineer with a minimum of 7+ years of experience
• Deep experience designing and building robust, scalable data pipelines, both batch and real-time, using modern data engineering tools and frameworks
• Proficiency in AWS data services (S3, Glue, Athena, EMR, Kinesis, etc.)
• Strong grip on SQL queries, file formats such as Apache Parquet, Delta Lake, Apache Iceberg, or Hudi, and CDC patterns
• Experience with stream-processing frameworks such as Apache Flink or Kafka Streams, or other distributed data processing frameworks such as PySpark
• Expertise in workflow orchestration using Apache Airflow
• Strong analytical and problem-solving skills, with the ability to work independently in a fast-paced environment
• In-depth knowledge of database systems (both relational and NoSQL) and experience with data warehousing concepts
• Hands-on experience with data integration tools; strong familiarity with cloud-based data warehousing and processing is highly desirable
• Excellent communication and interpersonal skills, facilitating effective collaboration with both technical and non-technical stakeholders
• A strong desire to stay current with emerging technologies and industry best practices in the data landscape
Posted 2 days ago
0.0 - 6.0 years
0 Lacs
Indore, Madhya Pradesh
On-site
Become a part of Belgium Webnet, where work and fun go hand in hand. Belgium Webnet is looking for a Full Stack Developer to join our Indore office: an exceptionally good full-stack product developer with experience in MVC, CI, Angular, React JS, and Node JS with MongoDB and MySQL databases. We have been providing technical services to our USA clients, delivering a powerful and flexible e-commerce website platform, for the last 4 years. We have also been engaged in wholesale diamond trading since 1998, and we are located in the heart of New York City’s famed Diamond District on 47th Street and in India’s cleanest city, Indore, Madhya Pradesh.

Job Location: Indore
Salary: as per company standards (between Rs. 4,00,000 and 9,00,000 p.a.)
Experience: minimum 2 to 6 years of experience required

Job Summary:
• Knowledge of PHP, CI, HTML/CSS, jQuery, and JavaScript frameworks such as Angular.js, Node.js, OpenCart, Ajax
• Good proficiency in database modeling and design (MySQL and SQL are a must; MongoDB is an added advantage), web servers (e.g., Apache), and UI/UX design
• Expertise in developing e-commerce websites
• Data migration, transformation, and scripting
• Integrate data from various back-end services and databases
• Expertise in developing REST APIs with any back-end framework
• Exposure to AWS services like S3, CloudFront, CloudWatch, Lambda, and API Gateway
• Familiarity with the whole web stack, including protocols and web server optimization techniques
• Highly motivated, with experience in Java/Node.js/Python-based microservices/backends and AngularJS-based frontends
• Expertise in handling payment systems, especially payment gateway integrations with PayPal and Stripe/Braintree, is a plus
• Server-side languages like PHP, Python, Ruby, Java, JavaScript, and .NET
• Good understanding of MVC design patterns and frameworks
• Proficient in web services: REST/SOAP/XML
• Experience with third-party APIs like Google, Facebook, Twitter
• Strong debugging skills and the ability to understand and work on existing code
• Understanding of client requirements and functional specifications
• Good with logical problem solving
• Good written and verbal communication skills in English
• Great problem-solving skills and the ability to abstract functional requirements

Skills Required: e-commerce, website development, PHP, CI, Angular, Node, React, HTML, CSS, jQuery, JavaScript, MySQL, MongoDB, AWS and Google Cloud Platform, team lead, full-stack developer

Job Type: Full-time
Job Location: Indore
Experience: 2 to 6 years
Posted 2 days ago
0.0 - 9.0 years
0 Lacs
Hyderabad, Telangana
On-site
General Information:
Country: India
State: Telangana
City: Hyderabad
Job ID: 45479
Department: Development

Description & Requirements:
The Senior Java Developer is responsible for architecting and developing advanced Java solutions. This role involves leading the design and implementation of microservice architectures with Spring Boot, optimizing services for performance and scalability, and ensuring code quality. The Senior Developer will also mentor junior developers and collaborate closely with cross-functional teams to deliver comprehensive technical solutions.

Essential Duties:
• Lead the development of scalable, robust, and secure Java components and services
• Architect and optimize microservice solutions using Spring Boot
• Translate customer requirements into comprehensive technical solutions
• Conduct code reviews and maintain high code-quality standards
• Optimize and scale microservices for performance and reliability
• Collaborate effectively with cross-functional teams to innovate and develop solutions
• Lead projects and mentor engineers in best practices and innovative solutions
• Coordinate with customer- and client-facing teams for effective solution delivery

Basic Qualifications:
• Bachelor’s degree in Computer Science or a related field
• 7-9 years of experience in Java development
• Expertise in designing and implementing microservices with Spring Boot
• Extensive experience applying design patterns and system design principles; expertise in event-driven and domain-driven design methodologies
• Extensive experience with multithreading and asynchronous and defensive programming
• Proficiency in MongoDB, SQL databases, and S3 data storage
• Experience with Kafka, Kubernetes, AWS services, and the AWS SDK
• Hands-on experience with Apache Spark
• Strong knowledge of Linux, Git, and Docker
• Familiarity with Agile methodologies and tools like Jira and Confluence
• Excellent communication and leadership skills
Preferred Qualifications:
• Experience with Spark using Spring Boot
• Familiarity with the C4 Software Architecture Model
• Experience using tools like Lucidchart for architecture and flow diagrams

About Infor:
Infor is a global leader in business cloud software products for companies in industry-specific markets. Infor builds complete industry suites in the cloud and efficiently deploys technology that puts the user experience first, leverages data science, and integrates easily into existing systems. Over 60,000 organizations worldwide rely on Infor to help overcome market disruptions and achieve business-wide digital transformation. For more information, visit www.infor.com.

Our Values:
At Infor, we strive for an environment that is founded on a business philosophy called Principle Based Management™ (PBM™) and eight Guiding Principles: integrity, stewardship & compliance, transformation, principled entrepreneurship, knowledge, humility, respect, and self-actualization. Increasing diversity is important to reflect our markets, customers, partners, and the communities we serve, now and in the future. We have a relentless commitment to a culture based on PBM. Informed by the principles that allow a free and open society to flourish, PBM™ prepares individuals to innovate, improve, and transform while fostering a healthy, growing organization that creates long-term value for its clients and supporters and fulfillment for its employees.

Infor is an Equal Opportunity Employer. We are committed to creating a diverse and inclusive work environment. Infor does not discriminate against candidates or employees because of their sex, race, gender identity, disability, age, sexual orientation, religion, national origin, veteran status, or any other protected status under the law.
If you require accommodation or assistance at any time during the application or selection processes, please submit a request by following the directions located in the FAQ section at the bottom of the infor.com/about/careers webpage.
Posted 2 days ago
3.0 - 7.0 years
0 Lacs
Haryana
On-site
As a self-driven support engineer, you thrive in fast-paced environments and excel at adapting to changing priorities. Your experience in directly interfacing with customers via phone and email equips you to provide technical support through advanced troubleshooting, root cause analysis, and code fixes. Your knack for creative problem-solving extends to designing and creating tools that enhance both internal operations and customer experiences. Your expertise spans web services, databases, networking, and coding, making you a well-rounded lover of technology. In your role, you will act as a trusted advisor to customers across various engagements, offering technical development, product support, and business analysis. You will be the go-to subject matter expert for resolving broad and complex technical issues through first-call resolution via phone and email. Additionally, you will design, write, and enhance tools for internal and external users, while also training customers on best practices for platform usage. Your ability to leverage customer insights will empower other teams to better serve clients and improve the platform. You will contribute to team enhancement by identifying process and technical gaps, all while ensuring high customer satisfaction levels through surveys and feedback. The ideal candidate for this role will possess a B.S. degree in Computer Science or Engineering from a leading university and be comfortable with a 24/7 support role. Proficiency in web stack and web services applications, relational and no-SQL database concepts, as well as basic object-oriented programming and scripting knowledge are essential. Exceptional troubleshooting and analytical skills, along with effective verbal and written communication skills for technical and non-technical audiences, are key requirements. 
Nice-to-have skills include experience with Java, servlets, or J2EE framework support; Apache, nginx, or Tomcat administration; Oracle PL/SQL; building RESTful services in Node.js; front-end technologies such as JavaScript, jQuery, CSS, and HTML; Docker containerization; virtualization; basic networking knowledge; and proficiency in Linux and command-line environments. If you value autonomy, a flexible work environment, work-life balance, motivating conditions, and a supportive management team, this opportunity offers competitive remuneration, an environment conducive to learning new technologies daily, and the chance to work on cutting-edge solutions.
Posted 2 days ago
5.0 - 9.0 years
0 Lacs
karnataka
On-site
The purpose of this role is to provide significant technical expertise in architecture planning and design of the concerned tower (platform, database, middleware, backup etc) as well as managing its day-to-day operations. You will be responsible for advanced JBoss architecture and internals, including installation and configuration, migration from JBoss EAP to WildFly, JDK/OpenJDK upgrades for use with JBoss and WildFly, application deployment and management (Jenkins, CI/CD), clustering and high availability, performance tuning and optimization, advanced troubleshooting techniques, backup, recovery, and patch management, scripting and automation, disaster recovery and failover strategies, and JBoss upgrades, migrations, and patch management. Key skills required for this role include JBoss EAP, WildFly, Java/OpenJDK, Apache, and Linux. In addition to technical responsibilities, you will also be involved in team management, including resourcing, talent management, performance management, and employee satisfaction and engagement. Your performance will be measured based on operations of the tower (SLA adherence, knowledge management, CSAT/customer experience, identification of risk issues and mitigation plans) and new projects (timely delivery, no unauthorized changes, no formal escalations). Mandatory skills for this role include JBoss Admin. Experience required for this position is 5-8 years. Join a business powered by purpose and a place that empowers you to design your own reinvention. Come to Wipro. Realize your ambitions. Applications from people with disabilities are explicitly welcome.
Posted 2 days ago
4.0 years
0 Lacs
Greater Kolkata Area
On-site
Job Title: Software Developer Location: Sector 5, Salt Lake, Kolkata Shift Timings: Flexible day shift or afternoon shift Week Offs: Saturday and Sunday Employment Type: Full-time, on-site or hybrid Industry: Telecommunication, IT and Security Salary: Up to 18 LPA Who We Are Salescom Services Private Limited is a one hundred percent subsidiary of a British technology business. We provide IT, security and telecommunication products and services to enterprises and SMEs. As an organization, we value people who bring a combination of talent, proactiveness and a never-say-never attitude! We enable you with the right kind of knowledge and skills that will help you develop into a productive and outstanding professional. Our expertise lies in 360-degree project management, customer success, revenue assurance, account management, billing & analytics, quality and compliance, web security and IT helpdesk in the space of technology and telecommunications. We are backed by the combined experience of over two decades that the board members have in this space, operating successful ventures and acquisitions over the years. The founding members of Salescom have operated in Australia and the United Kingdom, running successful and widely known technology and telecommunication ventures, and in Dec-2019 decided to launch their first captive unit in the heart of the IT workforce space: Sector V, Kolkata, West Bengal. Job Overview We are looking for an experienced Software Developer specializing in ASP.NET to build software using languages and technologies of the .NET framework. You should be well versed in third-party API integrations and user application programming journeys. In this role, you should be able to write smooth, functional code with a sharp eye for spotting defects. You should be a team player and an excellent communicator. If you are also passionate about the .NET framework and software design/architecture, we’d like to meet you.
Your goal will be to work with internal teams to design, develop and maintain functional software of all kinds. Key Responsibilities Design and Develop Web Applications Build robust and scalable web-based solutions using ASP.NET and C#. Optimize database interactions using SQL and NoSQL technologies like Microsoft SQL Server, PostgreSQL, and SQLite. Front-End Implementation Develop interactive user interfaces using modern frameworks (Blazor, React). Implement responsive design using Bootstrap, HTML, CSS, JavaScript, and jQuery. API Integration & Management Integrate and maintain third-party SOAP and RESTful APIs. Ensure secure and efficient data exchanges across external systems. Testing & Quality Assurance Use CI tools such as Jenkins to automate testing processes. Write and maintain unit and integration tests for consistent performance. Troubleshooting & Optimization Identify and resolve software bugs and performance bottlenecks. Analyse prototype feedback and iterate quickly to improve solutions. Collaboration & Communication Work closely with cross-functional teams to understand requirements. Document development progress and articulate technical solutions effectively. Continuous Improvement Stay up to date with emerging technologies and coding practices. Contribute to code reviews and mentor junior developers. Prerequisites At least 4 years of software development using ASP.NET, C#, and SQL/NoSQL databases (Microsoft SQL Server, PostgreSQL, SQLite, etc.) Experience with modern front-end frameworks (Blazor, React, etc.) Hands-on experience with third-party SOAP and REST API integrations. Experience with Bootstrap, jQuery, HTML, CSS and JavaScript. Knowledge of CI and automated testing tools such as Jenkins. Excellent troubleshooting skills in software prototypes. Excellent verbal and written communication skills.
BSc/B.Tech/BCA in Computer Science, Engineering, or a related field Good-to-Have Skill Set Knowledge of .NET MVC Knowledge of .NET MAUI (Xamarin) Experience with CRM development Experience working in ISP, telephony and MSP environments Experience with Apache HTTP Server & Nginx Experience with Debian and Debian-based Linux server distributions (e.g., Ubuntu) What's In It For You Competitive salary, periodic reviews and performance-based bonuses. Comprehensive health insurance coverage for self and chosen family dependents. Professional development opportunities, including training and company-funded certifications. Collaborative and inclusive work environment that values diversity and creativity. Café facilities. Free drop services back home. Businesses We Own & Operate https://www.v4consumer.co.uk/ https://www.v4one.co.uk How To Apply Interested candidates are invited to submit their resume and cover letter to hr@salescom.in in confidence. Please label “Senior Software Developer Application” in the email subject line. All candidates will be treated equally, and we will base decisions on appointments on the merits of the candidates. We welcome applications from all candidates, regardless of any protected characteristic, and are an equal opportunity employer.
Posted 2 days ago
0 years
0 Lacs
Gurugram, Haryana, India
On-site
About Us Welcome to FieldAssist, where Innovation meets excellence!! We are a top-tier SaaS platform that specializes in optimizing Route-to-Market strategies and enhancing brand relationships within the CPG partner ecosystem. With over 1,00,000 sales users representing 600+ CPG brands across 10+ countries in South East Asia, the Middle East, and Africa, we reach 10,000 distributors and 7.5 million retail outlets every day. FieldAssist is a 'Proud Partner to Great Brands' like Godrej Consumers, Saro Africa, Danone, Tolaram, Haldiram’s, Eureka Forbes, Bisleri, Nilon’s, Borosil, Adani Wilmar, Henkel, Jockey, Emami, Philips, Ching’s and Mamaearth among others. Do you crave a dynamic work environment where you can excel and enjoy the journey? We have the perfect opportunity for you!! Responsibilities Build and maintain robust backend services and REST APIs using Python (Django, Flask, or FastAPI). Develop end-to-end ML pipelines including data preprocessing, model inference, and result delivery. Integrate and scale AI/LLM models, including RAG (Retrieval Augmented Generation) and intelligent agents. Design and optimize ETL pipelines and data workflows using tools like Apache Airflow or Prefect. Work with Azure SQL and Cosmos DB for transactional and NoSQL workloads. Implement and query vector databases for similarity search and embedding-based retrieval (e.g., Azure Cognitive Search, FAISS, or Pinecone). Deploy services on Azure Cloud, using Docker and CI/CD practices. Collaborate with cross-functional teams to bring AI features into product experiences. Write unit/integration tests and participate in code reviews to ensure high code quality. Who we're looking for: Strong command of Python 3.x, with experience in Django, Flask, or FastAPI. Experience building and consuming RESTful APIs in production systems.
Solid grasp of ML workflows, including model integration, inferencing, and LLM APIs (e.g., OpenAI). Familiarity with RAG, vector embeddings, and prompt-based workflows. Proficient with Azure SQL and Cosmos DB (NoSQL). Experience with vector databases (e.g., FAISS, Pinecone, Azure Cognitive Search). Proficiency in containerization using Docker, and deployment on Azure Cloud. Experience with data orchestration tools like Apache Airflow. Comfortable working with Git, CI/CD pipelines, and observability tools. Strong debugging, testing (pytest/unittest), and optimization skills. Good to Have: Experience with LangChain, transformers, or LLM fine-tuning. Exposure to MLOps practices and Azure ML. Hands-on experience with PySpark for data processing at scale. Contributions to open-source projects or AI toolkits. Background working in startup-like environments or cross-functional product teams. FieldAssist on the Web: Website: https://www.fieldassist.com/people-philosophy-culture/ Culture Book: https://www.fieldassist.com/fa-culture-book CEO's Message: https://www.youtube.com/watch?v=bl_tM5E5hcw LinkedIn: https://www.linkedin.com/company/fieldassist/
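The similarity-search piece of a RAG stack like the one described in this role can be reduced to a few lines. Below is a minimal, library-free sketch of embedding-based retrieval; a production system would use FAISS, Pinecone, or Azure Cognitive Search instead, and the document IDs and tiny 3-dimensional "embeddings" here are purely hypothetical:

```python
import math

def cosine_similarity(a, b):
    # Cosine similarity between two equal-length vectors.
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def top_k(query_vec, index, k=2):
    # index: list of (doc_id, embedding) pairs; return the k closest docs.
    scored = [(doc_id, cosine_similarity(query_vec, vec)) for doc_id, vec in index]
    return sorted(scored, key=lambda t: t[1], reverse=True)[:k]

# Toy 3-dimensional "embeddings", invented for illustration only.
index = [("doc-a", [1.0, 0.0, 0.0]),
         ("doc-b", [0.9, 0.1, 0.0]),
         ("doc-c", [0.0, 1.0, 0.0])]
results = top_k([1.0, 0.05, 0.0], index, k=2)
```

In a real RAG pipeline the retrieved documents would then be stuffed into the LLM prompt; the vector store only changes how `top_k` is served, not the shape of the idea.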
Posted 2 days ago
10.0 years
0 Lacs
Greater Kolkata Area
Remote
Java Back End Engineer with AWS Location : Remote Experience : 10+ Years Employment Type : Full-Time Job Overview We are looking for a highly skilled Java Back End Engineer with strong AWS cloud experience to design and implement scalable backend systems and APIs. You will work closely with cross-functional teams to develop robust microservices, optimize database performance, and contribute across the tech stack, including infrastructure automation. Core Responsibilities Design, develop, and deploy scalable microservices using Java, J2EE, Spring, and Spring Boot. Build and maintain secure, high-performance APIs and backend services on AWS or GCP. Use JUnit and Mockito to ensure test-driven development and maintain code quality. Develop and manage ETL workflows using tools like Pentaho, Talend, or Apache NiFi. Create High-Level Design (HLD) and architecture documentation for system components. Collaborate with cross-functional teams (DevOps, Frontend, QA) as a full-stack contributor when needed. Tune SQL queries and manage performance on MySQL and Amazon Redshift. Troubleshoot and optimize microservices for performance and scalability. Use Git for source control and participate in code reviews and architectural discussions. Automate infrastructure provisioning and CI/CD processes using Terraform, Bash, and pipelines. 
Primary Skills Languages & Frameworks : Java (v8/17/21), Spring Boot, J2EE, Servlets, JSP, JDBC, Struts Architecture : Microservices, REST APIs Cloud Platforms : AWS (EC2, S3, Lambda, RDS, CloudFormation, SQS, SNS) or GCP Databases : MySQL, Redshift Secondary Skills (Good To Have) Infrastructure as Code (IaC) : Terraform Additional Languages : Python, Node.js Frontend Frameworks : React, Angular, JavaScript ETL Tools : Pentaho, Talend, Apache NiFi (or equivalent) CI/CD & Containers : Jenkins, GitHub Actions, Docker, Kubernetes Monitoring/Logging : AWS CloudWatch, DataDog Scripting : Bash, Shell scripting Nice To Have Familiarity with agile software development practices Experience in a cross-functional engineering environment Exposure to DevOps culture and tools (ref:hirist.tech)
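The ETL responsibilities above (Pentaho, Talend, Apache NiFi) all share the same extract/transform/load shape. As an illustration only, not any specific tool's API, here is a pure-Python sketch of the three stages; the field names and sample rows are made up:

```python
import csv
import io

def extract(raw_csv):
    # Extract: parse CSV text into dict rows.
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(rows):
    # Transform: normalize names and cast amounts, dropping bad records.
    out = []
    for row in rows:
        try:
            out.append({"name": row["name"].strip().title(),
                        "amount": float(row["amount"])})
        except (KeyError, ValueError):
            continue  # skip malformed rows rather than failing the batch
    return out

def load(rows, target):
    # Load: append validated rows into an in-memory "table" stand-in.
    target.extend(rows)
    return len(rows)

raw = "name,amount\n alice ,10.5\nbob,notanumber\nCAROL,3\n"
table = []
loaded = load(transform(extract(raw)), table)
```

Real ETL tools add scheduling, retries, and lineage on top, but the stage boundaries stay the same.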
Posted 2 days ago
1.0 - 5.0 years
0 Lacs
kochi, kerala
On-site
As a Python Developer at Yatnam, you will be responsible for developing user-friendly websites and applications that are visually appealing and functional across all devices. You should have a strong proficiency in Python and its frameworks such as Django and Flask. Your expertise in web development will be crucial in turning designs into actual web pages using HTML and ensuring seamless connectivity to other systems through formats like JSON and XML. Your role will also involve understanding and implementing software design patterns, conducting automated testing using tools like pytest and unittest, and effectively debugging code using tools like pdb. You should have experience with Continuous Integration and Continuous Deployment pipelines, as well as familiarity with cloud platforms like AWS, Azure, and Google Cloud. Additionally, knowledge of microservices architecture, containerization tools like Docker and Kubernetes, and security best practices in software development will be essential in this role. You should also be comfortable using project management and collaboration tools such as Jira and Trello, and possess strong analytical and problem-solving skills. Being a team player with excellent communication skills, the ability to mentor junior developers, and a commitment to continuous learning and professional development are qualities we are looking for in potential candidates. If you are someone who thrives in a collaborative environment and enjoys taking on new challenges, we encourage you to apply by sending your resume to careers@yatnam.com.
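One recurring task in a role like this, accepting JSON from another system and failing gracefully on bad input, can be sketched in a few lines. The payload shape and the required fields below are hypothetical:

```python
import json

REQUIRED_FIELDS = {"id", "email"}

def parse_user(payload: str) -> dict:
    """Parse a JSON user payload and validate required fields.

    Raises ValueError on malformed JSON or missing fields, so callers
    can turn it into a clean 400 response instead of a stack trace.
    """
    try:
        data = json.loads(payload)
    except json.JSONDecodeError as exc:
        raise ValueError(f"invalid JSON: {exc}") from exc
    missing = REQUIRED_FIELDS - data.keys()
    if missing:
        raise ValueError(f"missing fields: {sorted(missing)}")
    return data

user = parse_user('{"id": 7, "email": "a@example.com"}')
```

The same validate-then-raise pattern carries over to XML payloads with `xml.etree.ElementTree`, and the ValueError path is exactly what a pytest or unittest case would assert on.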
Posted 2 days ago
2.0 - 6.0 years
0 Lacs
punjab
On-site
We are searching for an experienced Python Developer to become a part of our dynamic development team. The ideal candidate should possess 2 to 5 years of experience in constructing scalable backend applications and APIs using contemporary Python frameworks. This position necessitates a solid foundation in object-oriented programming, web technologies, and collaborative software development. Your responsibilities will involve close collaboration with the design, frontend, and DevOps teams to deliver sturdy and high-performance solutions. Your key responsibilities will include developing, testing, and maintaining backend applications utilizing Django, Flask, or FastAPI. You will also be responsible for building RESTful APIs and incorporating third-party services to enrich platform capabilities. Utilization of data handling libraries such as Pandas and NumPy for efficient data processing is essential. Additionally, writing clean, maintainable, and well-documented code conforming to industry best practices, participating in code reviews, and mentoring junior developers are part of your role. You will collaborate within Agile teams using Scrum or Kanban workflows and troubleshoot and debug production issues proactively and analytically. Required qualifications for this position include 2 to 5 years of backend development experience with Python, proficiency in core and advanced Python concepts, strong command over at least one Python framework (Django, Flask, or FastAPI), experience with data libraries like Pandas and NumPy, understanding of authentication/authorization mechanisms, middleware, and dependency injection, familiarity with version control systems like Git, and comfort working in Linux environments. 
Must-have skills for this role consist of expertise in backend Python development and web frameworks, experience with Generative AI frameworks (e.g., LangChain, Transformers, OpenAI APIs), strong debugging, problem-solving, and optimization skills, experience with API development and microservices architecture, and a deep understanding of software design principles and security best practices. Good-to-have skills include exposure to Machine Learning libraries (e.g., Scikit-learn, TensorFlow, PyTorch), knowledge of containerization tools (Docker, Kubernetes), familiarity with web servers (e.g., Apache, Nginx) and deployment architectures, understanding of asynchronous programming and task queues (e.g., Celery, AsyncIO), familiarity with Agile practices and tools like Jira or Trello, and exposure to CI/CD pipelines and cloud platforms (AWS, GCP, Azure). In return, we offer competitive compensation based on your skills and experience, generous time off with 18 annual holidays to maintain a healthy work-life balance, continuous learning opportunities while working on cutting-edge projects, and valuable experience in client-facing roles to enhance your professional growth.
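For the asynchronous-programming and task-queue items above, here is a minimal sketch of a worker-pool pattern using only the standard library's asyncio; Celery provides the same idea with a durable broker and distributed workers. The doubling "work" is a stand-in for a real job:

```python
import asyncio

async def worker(name, queue, results):
    # Pull jobs off the queue until cancelled.
    while True:
        job = await queue.get()
        try:
            results.append((name, job * 2))  # stand-in for real work
        finally:
            queue.task_done()

async def main(jobs, n_workers=3):
    queue: asyncio.Queue = asyncio.Queue()
    results = []
    workers = [asyncio.create_task(worker(f"w{i}", queue, results))
               for i in range(n_workers)]
    for job in jobs:
        queue.put_nowait(job)
    await queue.join()          # block until every job is marked done
    for w in workers:
        w.cancel()              # workers loop forever; cancel once drained
    return results

results = asyncio.run(main([1, 2, 3, 4]))
```

The `queue.join()` / `task_done()` pairing is the key design point: it lets the producer wait for completion without tracking individual tasks.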
Posted 2 days ago
0 years
0 Lacs
Navi Mumbai, Maharashtra, India
On-site
Job Responsibilities Collaborate with data scientists, software engineers, and business stakeholders to understand data requirements and design efficient data models. Develop, implement, and maintain robust and scalable data pipelines, ETL processes, and data integration solutions. Extract, transform, and load data from various sources, ensuring data quality, integrity, and consistency. Optimize data processing and storage systems to handle large volumes of structured and unstructured data efficiently. Perform data cleaning, normalization, and enrichment tasks to prepare datasets for analysis and modelling. Monitor data flows and processes, and identify and resolve data-related issues and bottlenecks. Contribute to the continuous improvement of data engineering practices and standards within the organization. Stay up-to-date with industry trends and emerging technologies in data engineering, artificial intelligence, and dynamic pricing. Candidate Profile Strong passion for data engineering, artificial intelligence, and problem-solving. Solid understanding of data engineering concepts, data modeling, and data integration techniques. Proficiency in programming languages such as Python and SQL, plus web scraping. Understanding of NoSQL, relational, and in-memory databases, and technologies like MongoDB, Redis, and Apache Spark would be an advantage. Knowledge of distributed computing frameworks and big data technologies (e.g., Hadoop, Spark) is a plus. Excellent analytical and problem-solving skills, with a keen eye for detail. Strong communication and collaboration skills, with the ability to work effectively in a team-oriented environment. Self-motivated, quick learner, and adaptable to changing priorities and technologies. (ref:hirist.tech)
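The data cleaning and normalization tasks described above often come down to normalizing keys and dropping duplicates. A small illustrative sketch follows; the `sku` key field, sample records, and the deliberately naive keep-first dedup policy are all assumptions for the example (real pipelines frequently merge duplicates instead):

```python
def clean_records(records):
    """Deduplicate and normalize raw records for downstream analysis.

    Normalizes whitespace/case on the key field, drops rows with no key,
    and keeps the first occurrence of each key.
    """
    seen = set()
    cleaned = []
    for rec in records:
        key = (rec.get("sku") or "").strip().lower()
        if not key or key in seen:
            continue
        seen.add(key)
        cleaned.append({**rec, "sku": key})
    return cleaned

raw = [{"sku": " ABC-1 ", "price": 10},
       {"sku": "abc-1", "price": 12},   # duplicate after normalization
       {"sku": "", "price": 5},         # missing key: dropped
       {"sku": "XYZ-9", "price": 7}]
cleaned = clean_records(raw)
```

The same normalize-then-dedupe logic scales up directly to Spark (`dropDuplicates` after a normalization step) once volumes outgrow a single machine.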
Posted 2 days ago
5.0 - 9.0 years
0 Lacs
kochi, kerala
On-site
As a Lead Backend Developer at Admaren Tech Private Limited, you will play a crucial role in designing, developing, and maintaining back-end systems that support our maritime operations platform. Your responsibility will include collaborating closely with cross-functional teams to ensure seamless integration of backend services with frontend applications, databases, and other core systems. Your expertise in backend technologies will be instrumental in delivering high-performance and scalable solutions that cater to the evolving needs of the maritime sector. In this role, you will lead and manage a team of software developers, providing guidance, mentorship, and support to enhance the team's effectiveness and productivity. You will contribute to the design and development of scalable, maintainable, and efficient software architecture. Taking full ownership of the product lifecycle, from conceptualization through development and deployment, will be part of your responsibilities. You will be responsible for implementing Test-Driven Development practices by writing comprehensive unit tests to ensure the delivery of high-quality, reliable, and maintainable code. Building reusable, modular code and libraries to promote efficiency and consistency across projects will be essential. Additionally, you will continuously optimize applications for maximum performance, speed, and scalability while enforcing security best practices to safeguard sensitive information and ensure compliance with data protection regulations. Maintaining clear and comprehensive technical documentation for all systems and processes related to the products you own will also be part of your duties. The ideal candidate for this role should have proven experience of 5+ years in software development, leadership, or management roles. Strong programming foundation in Python, expertise in web applications and APIs, and knowledge of frameworks such as Django or Flask are required. 
Experience in secure software development practices, and in the system design, implementation, testing, deployment, and maintenance of enterprise-wide application systems, is essential. Proficiency with source code repositories (Git), DB design & architecture, caching techniques, web servers, containerization tools like Docker, and enterprise-level deployment and scaling is necessary for this role. Strong analytical and critical thinking skills, along with excellent leadership, communication, and interpersonal abilities, are desired qualities. A degree in Computer Science, Information Technology, or a related field, or equivalent practical experience, is required. Joining Admaren Tech Private Limited will offer you the opportunity to work on cutting-edge technology that has a real-world impact on the maritime industry. You will collaborate with a dynamic and supportive team focused on continuous learning, providing ample opportunities for career growth, advancement, and skill development.
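Among the caching techniques mentioned, in-process memoization is the simplest to show. Below is a sketch using the standard library's `functools.lru_cache`; the vessel-profile lookup and the `CALLS` counter are hypothetical stand-ins for a slow database or API call:

```python
from functools import lru_cache

CALLS = {"lookups": 0}

@lru_cache(maxsize=256)
def get_vessel_profile(vessel_id: int) -> dict:
    # Stand-in for an expensive DB or API lookup; the decorator memoizes
    # results so repeated calls with the same id skip the slow path.
    CALLS["lookups"] += 1
    return {"id": vessel_id, "name": f"vessel-{vessel_id}"}

first = get_vessel_profile(42)
second = get_vessel_profile(42)   # served from cache; no second lookup
```

One caveat worth noting in reviews: the cache hands back the same mutable dict object, so callers must not mutate it. Cross-process caching (Redis, memcached) follows the same get-or-compute shape but with explicit expiry.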
Posted 2 days ago
0 years
0 Lacs
Navi Mumbai, Maharashtra, India
On-site
Job Description The primary role of the Software Developer will be to carry out a variety of software/web application development activities to support internal and external projects, including but not limited to: Job Responsibilities Facilitate solution efficiencies, scalability and technology stack leadership. Ensure foolproof and robust applications through unit tests and other quality control measures. Follow an agile development process, and enable rapid solutions to business challenges. Take inputs from internal and external clients and constantly strive to improve solutions. Follow software design, development, testing and documentation best practices. Data engineering: Extract and parse data from online and local data sources; clean up data, audit data for accuracy, consistency and completeness. Use tools such as (but not limited to) Excel, SQL, Python, SAS, R, and MATLAB to extract valuable and actionable insights. Data processing and visualization: Summarize insights in simple yet powerful charts, reports, slides, etc. Data storage and management: MySQL (AWS RDS), MongoDB. Application frameworks: React, React Native, Django. Data integration technologies: RESTful APIs, AWS S3 and UI data uploads. Project operations: For internal and external client projects, use our proprietary tools for performing data engineering, analytics and visualization activities. Responsible for project deliveries, escalation, continuous improvement, and customer success. Modify the software to fix errors, adapt it to new hardware, improve its performance, or upgrade interfaces. Follow the system testing and validation procedures. Candidate Profile Strong knowledge of MVC frameworks, SQL coding, and at least one of the following: AngularJS, Django, Flask, Ruby on Rails, or NodeJS is a must. Proficiency in software development. Strong in algorithm design, database programming (RDBMS), and text analytics.
Knowledge of NoSQL and Big Data technologies like MongoDB, Apache Spark, the Hadoop stack, and the Python data science stack is a plus. Strong problem-solving skills: able to logically break down problems into incremental milestones, prioritize high-impact deliverables first, and identify bottlenecks and work around them. Self-learner: highly curious, a self-starter who can work with minimal supervision and guidance. An entrepreneurial mindset with a positive attitude is a must. Track record of excellence in academic or non-academic areas, with significant accomplishments. Excellent written & oral communication and interpersonal skills, with a high degree of comfort working in teams and making teams successful. (ref:hirist.tech)
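The text-analytics requirement above can be illustrated with a few lines of standard-library Python: tokenize, drop stopwords, count. The stopword list and sample sentence are invented for the example:

```python
import re
from collections import Counter

STOPWORDS = {"the", "a", "and", "of", "to", "is"}

def top_terms(text, k=3):
    # Tokenize, drop stopwords, and return the k most frequent terms.
    tokens = re.findall(r"[a-z']+", text.lower())
    counts = Counter(t for t in tokens if t not in STOPWORDS)
    return counts.most_common(k)

sample = "Spark and Hadoop: Spark scales, Hadoop stores, and Spark wins."
terms = top_terms(sample, k=2)
```

Term frequency like this is the first step toward TF-IDF and beyond; the same counting logic is what a Spark word-count job distributes across a cluster.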
Posted 2 days ago
7.0 years
0 Lacs
Greater Kolkata Area
On-site
Line of Service Advisory Industry/Sector Not Applicable Specialism Data, Analytics & AI Management Level Manager Job Description & Summary At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions. Why PWC At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us. At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations. 
Responsibilities:
Analyses current business practices, processes, and procedures as well as identifying future business opportunities for leveraging Microsoft Azure Data & Analytics Services.
Provide technical leadership and thought leadership as a senior member of the Analytics Practice in areas such as data access & ingestion, data processing, data integration, data modeling, database design & implementation, data visualization, and advanced analytics.
Engage and collaborate with customers to understand business requirements/use cases and translate them into detailed technical specifications.
Develop best practices including reusable code, libraries, patterns, and consumable frameworks for cloud-based data warehousing and ETL.
Maintain best practice standards for the development of cloud-based data warehouse solutioning, including naming standards.
Designing and implementing highly performant data pipelines from multiple sources using Apache Spark and/or Azure Databricks.
Integrating the end-to-end data pipeline to take data from source systems to target data repositories, ensuring the quality and consistency of data is always maintained.
Working with other members of the project team to support delivery of additional project components (API interfaces).
Evaluating the performance and applicability of multiple tools against customer requirements.
Working within an Agile delivery / DevOps methodology to deliver proof of concept and production implementation in iterative sprints.
Integrate Databricks with other technologies (ingestion tools, visualization tools).
Proven experience working as a data engineer.
Highly proficient in using the Spark framework (Python and/or Scala).
Extensive knowledge of data warehousing concepts, strategies, and methodologies.
Direct experience of building data pipelines using Azure Data Factory and Apache Spark (preferably in Databricks).
Hands-on experience designing and delivering solutions using Azure, including Azure Storage, Azure SQL Data Warehouse, Azure Data Lake, Azure Cosmos DB, and Azure Stream Analytics.
Experience in designing and hands-on development of cloud-based analytics solutions.
Expert-level understanding of Azure Data Factory, Azure Synapse, Azure SQL, Azure Data Lake, and Azure App Service is required.
Designing and building data pipelines using API ingestion and streaming ingestion methods.
Knowledge of DevOps processes (including CI/CD) and Infrastructure as Code is essential.
Thorough understanding of Azure cloud infrastructure offerings.
Strong experience in common data warehouse modelling principles, including Kimball.
Working knowledge of Python is desirable.
Experience developing security models.
Databricks & Azure Big Data Architecture certification would be a plus.
Must be team oriented with strong collaboration, prioritization, and adaptability skills.
Mandatory skill sets: Azure Databricks
Preferred skill sets: Azure Databricks
Years of experience required: 7-10 years
Education qualification: BE, B.Tech, MCA, M.Tech
Degrees/Field of Study required: Bachelor of Technology, Bachelor of Engineering
Required Skills: Databricks Platform
Optional Skills: Accepting Feedback, Active Listening, Agile Scalability, Amazon Web Services (AWS), Analytical Thinking, Apache Airflow, Apache Hadoop, Azure Data Factory, Coaching and Feedback, Communication, Creativity, Data Anonymization, Data Architecture, Database Administration, Database Management System (DBMS), Database Optimization, Database Security Best Practices, Databricks Unified Data Analytics Platform, Data Engineering, Data Engineering Platforms, Data Infrastructure, Data Integration, Data Lake, Data Modeling
Posted 2 days ago
2.0 - 6.0 years
0 Lacs
raipur
On-site
You are an experienced Laravel and core PHP developer with 2-3 years of experience in a similar position. Your responsibilities will include designing and implementing high-performance, scalable web applications using the Laravel framework. You will deliver robust solutions while collaborating closely with a team of designers and developers. You will design, develop, and manage scalable web applications using the Laravel framework. Additionally, you should implement and optimize Laravel application performance, security, and database interactions. Writing robust, effective, and scalable code for both front-end and back-end components of Laravel-based applications will be part of your role. Monitoring and optimizing the performance and responsiveness of Laravel applications across various devices and platforms is essential. Integrating third-party services, tools, and APIs into Laravel applications will also be a key responsibility. Collaborating with designers, developers, and stakeholders to ensure seamless integration of components and user experiences is crucial. Running unit, integration, and performance tests to verify the stability and functionality of Laravel applications is required. You will also conduct code reviews and maintain adherence to security, scalability, and best practices in Laravel web development. Participation in agile development processes and contributing to improving the Laravel development workflow is expected.
Skills required include 2-3+ years of work experience in a similar position; proficiency with front-end technologies such as HTML5, CSS3, and JavaScript; a strong understanding of PHP programming, Laravel ORM, and RESTful API development; experience with databases such as MySQL, PostgreSQL, or SQLite in the context of Laravel applications; familiarity with integrating third-party services, APIs, and libraries into Laravel applications; understanding of version control systems such as Git; deployment and management of Laravel/PHP applications on AWS services; configuration and maintenance of web servers (Apache/Nginx) and PHP environments; knowledge of performance optimization and debugging tools for Laravel applications; experience with performance optimization, security, and monitoring tools for web applications; excellent communication and collaboration skills; strong analytical and problem-solving skills; and proficiency in English and Hindi. Desired Candidate Profile: - Education Qualification: Bachelor of Engineering/Technology - Computer Science, Master of Computer Application - Computer Application - Work Experience: Minimum 2-3 years - Job Location: Raipur (Chhattisgarh) - Salary Package: Best as per industry standards.
Posted 2 days ago
20.0 years
0 Lacs
Mumbai Metropolitan Region
On-site
Who is Forcepoint? Forcepoint simplifies security for global businesses and governments. Forcepoint’s all-in-one, truly cloud-native platform makes it easy to adopt Zero Trust and prevent the theft or loss of sensitive data and intellectual property no matter where people are working. 20+ years in business. 2.7k employees. 150 countries. 11k+ customers. 300+ patents. If our mission excites you, you’re in the right place; we want you to bring your own energy to help us create a safer world. All we’re missing is you!
What You Will Be Doing
Helping drive innovation through improvements to the CI/CD pipeline
Automating appropriately and efficiently with tools such as Terraform, Ansible, and Python
Supporting teams who use our tools, such as Engineering, Professional Services, and Sales Engineering
Running our infrastructure as code via Git and webhooks
Building a strong DevOps culture while fostering strong collaboration with all areas of development, product, and QA
Documenting all the things!
Mentoring junior team members and fostering a collaborative, process-mature team
Championing new tools and processes and helping drive adoption across our organization
Championing DevSecOps best practices across the team and organization
Responsibilities Include
Develop and support automated, scalable solutions to deploy and manage our global infrastructure.
Implement effective CI/CD processes and pipelines.
Build integrations between services to create fully automated processes.
Maintain and improve automation tools for infrastructure provisioning, configuration, and deployment.
Identify opportunities to optimize current solutions and perform hands-on troubleshooting of problems related to systems and performance in Prod/QA/Dev environments.
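The responsibilities above center on declarative, automated infrastructure. The core idea behind tools like Terraform and Ansible is idempotency: compute the difference between desired and current state, and apply only that difference, so re-running the automation is a no-op. A simplified sketch of that plan/apply model (the dict-based "state" and field names are purely illustrative, not any tool's actual API):

```python
# Simplified sketch of idempotent, declarative configuration: diff desired
# state against current state and apply only the changes. This mirrors the
# plan/apply workflow of tools like Terraform in a few lines of plain Python.

def plan(current, desired):
    """Return only the keys whose current value differs from the desired one."""
    return {k: v for k, v in desired.items() if current.get(k) != v}

def apply(current, changes):
    """Apply planned changes without mutating the original state dict."""
    new_state = dict(current)
    new_state.update(changes)
    return new_state

current = {"instance_type": "t3.micro", "count": 2}
desired = {"instance_type": "t3.small", "count": 2}

changes = plan(current, desired)   # only instance_type needs to change
state = apply(current, changes)
# Idempotency check: a second plan against the new state finds nothing to do.
assert plan(state, desired) == {}
```

This diff-then-apply discipline is what makes automated runs safe to repeat in Prod/QA/Dev environments.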
Preferred Skills
BS in Computer Science or similar degree and 5+ years of related experience, or equivalent work experience
Experience working with automated server configuration and deployment tools
Strong working knowledge of AWS; certifications preferred
Proficiency working in Linux environments, particularly with customer-facing systems
Knowledge of IP networking, VPNs, DNS, load balancing, and firewalling
Strong working knowledge of Infrastructure as Code (Terraform/CloudFormation)
Nginx, Apache, HAProxy
Object-oriented programming experience in a language such as Java, Python, or C++
Deployment and support of containers and orchestration, such as Docker and Kubernetes
GitHub, Jenkins, Artifactory
Minimum Qualifications
Strong practical Linux-based systems administration skills in a cloud or virtualized environment
Scripting (Bash/Python)
Automation (Ansible/Chef/Puppet)
CI/CD tools, primarily Jenkins
Familiarity with continuous integration and development pipeline processes
AWS, Google Cloud, or Azure (at least one)
Prior success in automating a real-world production environment
Experience in implementing monitoring tools and fine-tuning metrics for optimal monitoring
Excellent written and oral communication skills; ability to communicate effectively with technical and non-technical staff
Don’t meet every single qualification? Studies show people are hesitant to apply if they don’t meet all requirements listed in a job posting. Forcepoint is focused on building an inclusive and diverse workplace – so if there is something slightly different about your previous experience, but it otherwise aligns and you’re excited about this role, we encourage you to apply. You could be a great candidate for this or other roles on our team.
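The qualifications above mention implementing monitoring tools and fine-tuning metrics. One common tuning technique is smoothing a noisy metric with an exponential moving average before comparing it to an alert threshold, so a single transient spike does not page anyone. A minimal sketch, where the alpha value, threshold, and sample data are hypothetical rather than recommendations:

```python
# Sketch of alert-threshold tuning: smooth a noisy metric with an
# exponential moving average (EMA) so transient spikes do not trigger alerts.
# alpha controls responsiveness: closer to 1 tracks the raw signal, closer
# to 0 smooths more aggressively. Values here are illustrative only.

def ema(values, alpha=0.3):
    """Exponentially weighted moving average of a metric series."""
    smoothed = []
    avg = values[0]
    for v in values:
        avg = alpha * v + (1 - alpha) * avg
        smoothed.append(avg)
    return smoothed

def should_alert(values, threshold, alpha=0.3):
    """Alert only if the smoothed metric crosses the threshold."""
    return ema(values, alpha)[-1] > threshold

cpu = [20, 25, 95, 22, 24, 23]  # one transient spike at index 2
# Raw-value alerting would have fired on the 95% sample; the smoothed
# series never crosses 80%, so no alert is raised.
assert max(ema(cpu)) < 80
assert not should_alert(cpu, 80)
```

A sustained breach still alerts: if the metric stays above the threshold, the EMA converges toward it and crosses the line, which is exactly the behavior fine-tuning aims for.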
The policy of Forcepoint is to provide equal employment opportunities to all applicants and employees without regard to race, color, creed, religion, sex, sexual orientation, gender identity, marital status, citizenship status, age, national origin, ancestry, disability, veteran status, or any other legally protected status and to affirmatively seek to advance the principles of equal employment opportunity. Forcepoint is committed to being an Equal Opportunity Employer and offers opportunities to all job seekers, including job seekers with disabilities. If you are a qualified individual with a disability or a disabled veteran, you may request a reasonable accommodation if you are unable or limited in your ability to use or access the Company’s career webpage as a result of your disability. You may request reasonable accommodations by sending an email to recruiting@forcepoint.com. Applicants must have the right to work in the location to which they have applied.
Posted 2 days ago