4.0 - 8.0 years
0 Lacs
Karnataka
On-site
We are seeking full-stack engineers with a minimum of 4 years of experience, preferably within the product domain. We are interested in individuals who are continuously evolving, learning, and adept at solving challenges. Your role will involve refining software products to meet user expectations and company goals. By exploring new technologies and collaborating with a dynamic team, you will play a pivotal role in the team's growth and overall success.

As a Software Engineer II in the Product Development department based in Bangalore (in office), you will contribute to the technical architecture, write clean and scalable code, and work in agile teams for continuous improvement. You will collaborate with QA to maintain high-quality standards, manage data effectively, and implement security practices. Additionally, you will optimize product scalability and performance, collaborate across teams, drive innovation, and mentor team members, fostering a culture of knowledge sharing and development.

In terms of technology requirements, you must have expertise in Python development, AWS services, React and front-end development, as well as full-stack capabilities. Knowledge of security and compliance, version control, CI/CD, cross-functional collaboration, problem-solving, and agile methodologies is crucial. Experience in user-centric design, quality assurance, data management, and security fundamentals is highly desirable.

Softway offers a cross-functional team structure, varied domains, and a flat hierarchy that provides exposure and learning opportunities. You will have the chance to interact with talented individuals, engage in continuous learning, and work in a supportive environment. We value open communication, encourage the sharing of opinions, and have an ego-less workforce focused on bringing solutions to life.
In addition to a competitive salary and a great work culture, Softway prioritizes inclusion, empathy, vulnerability, trust, empowerment, and forgiveness within the workplace. Established 21 years ago, Softway has been on a mission since 2015 to bring humanity back to the workplace. We aim to create a work environment where individuals can bring their whole selves and look forward to work each day. Our core values emphasize these principles as the foundation of a successful team and business.
Posted 4 days ago
6.0 - 10.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
Experience:

With over 6 years of experience in a techno-functional role, you are sought after for your expertise in AWS services. You have a proven track record of managing AWS partnerships and building strong collaborations. Your skills in technology modernization and data engineering on AWS enable you to streamline workflows. Hands-on experience with a wide range of AWS services such as EC2, S3, RDS, Redshift, Lambda, Glue, and SageMaker gives you the toolkit to excel in this role. Your deep understanding of cloud architecture, DevOps principles, and data engineering frameworks allows you to build efficient solutions. Exposure to Generative AI and its applications is a plus in today's AI-driven world. Your excellent communication and relationship-building skills make teamwork and partnerships seamless. As a problem-solver with a solution-driven mindset, you are always looking for the best approach to tackle challenges.

Responsibilities:

- Build and maintain strong relationships with AWS teams to ensure smooth collaboration and drive better partnerships.
- Leverage your connections within AWS to identify new business opportunities and promote innovation.
- Understand customer challenges, technical needs, and goals in order to offer the right AWS solutions.
- Propose and validate high-level AWS architectures that align with both business and technical requirements.
- Lead modernization efforts focused on scalability, efficiency, and cost savings using AWS services.
- Advocate for best practices in data management and analytics to make AWS-powered solutions more effective.
- Explore and integrate Generative AI into business use cases to unlock new possibilities.
- Act as a bridge between internal teams, AWS, and customers, providing guidance, training, and insights when needed.
Benefits:

By joining our team, you will gain deep expertise in AWS services and learn how to apply them effectively. You will level up in cloud architecture, data engineering, and DevOps, making workflows smoother and more efficient. You will get hands-on experience with Generative AI and explore its real-world business applications. You will build strong relationships and sharpen your communication skills, making teamwork effortless. You will work in a collaborative, innovation-driven environment, solving problems with the latest technology. Additionally, other level-based benefits are available.

Company Culture:

OptiSol is your answer to a stress-free, balanced lifestyle. Certified as a GREAT PLACE TO WORK for four consecutive years, we are known for a culture that emphasizes open communication and accessible leadership. We celebrate diversity and promote work-life balance through flexible policies, allowing you to thrive personally and professionally. At OptiSol, you will be part of the future of AI and innovation, where you can live and learn together. Join us and find out why we are made for each other.
Posted 1 week ago
5.0 - 10.0 years
10 - 20 Lacs
Chennai, Coimbatore, Bengaluru
Hybrid
Job Summary:

We are looking for a highly skilled Senior AWS Data Engineer to design, develop, and lead enterprise-grade data solutions on the AWS cloud. This position requires a blend of deep AWS technical proficiency, hands-on PySpark experience, and the ability to engage with business stakeholders in solution design. The ideal candidate will build scalable, secure, and high-performance data platforms using AWS-native tools and best practices.

Role & responsibilities:

- Design and implement scalable AWS cloud-native data architectures, including data lakes, warehouses, and streaming pipelines
- Develop ETL/ELT pipelines using AWS Glue (PySpark/Scala), Lambda, and Step Functions
- Optimize Redshift-based data warehouses, including schema design, data distribution, and materialized views
- Leverage Athena, the Glue Data Catalog, and S3 for efficient serverless query patterns
- Implement IAM-based data access control, lineage tracking, and encryption for secure data workflows
- Automate infrastructure and data deployments using CDK, Terraform, or CloudFormation
- Drive data modelling standards (Star/Snowflake, 3NF, Data Vault) and ensure data quality and governance
- Collaborate with data scientists, DevOps, and business stakeholders to deliver end-to-end data solutions
- Mentor junior engineers and lead code reviews and architecture discussions
- Participate in client-facing activities, including requirements gathering, technical proposal preparation, and solution demos

Must-Have Qualifications:

- AWS Expertise: Proven hands-on experience with AWS Glue, Redshift, Athena, S3, Lake Formation, Kinesis, Lambda, Step Functions, EMR, and CloudWatch
- PySpark & Big Data: Minimum 2 years of hands-on PySpark/Spark experience for large-scale data processing
- ETL/ELT Engineering: Expertise in Python, dbt, or similar automation frameworks
- Data Modelling: Proficiency in designing and implementing normalized and dimensional models
- Performance Optimization: Ability to tune Spark jobs with custom partitioning, broadcast joins, and memory management
- CI/CD & Automation: Experience with GitHub Actions, CodePipeline, or similar tools
- Consulting & Pre-sales: Prior exposure to client-facing roles, including proposal drafting and cost estimation

Good-to-Have Skills:

- Knowledge of Iceberg, Hudi, or Delta Lake table formats
- Experience with Athena Federated Queries and Amazon OpenSearch
- Familiarity with Amazon DataZone, Glue DataBrew, and data profiling tools
- Understanding of compliance frameworks such as GDPR, HIPAA, and SOC 2
- BI integration skills using Power BI, QuickSight, or Tableau
- Knowledge of event-driven architectures (e.g., Kinesis, MSK, Lambda)
- Exposure to lakehouse or data mesh architectures
- Experience with Lucidchart, Miro, or other documentation/storyboarding tools

Why Join Us?

- Work on cutting-edge AWS data platforms
- Collaborate with a high-performing team of engineers and architects
- Opportunity to lead key client engagements and shape large-scale solutions
- Flexible work environment and strong learning culture
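The qualifications above mention tuning Spark jobs with broadcast joins. As an illustrative sketch only (plain Python rather than PySpark, with hypothetical table and column names), the core idea of a broadcast join is to materialize the small side of the join into an in-memory lookup and stream the large side past it, avoiding a shuffle of both tables:

```python
def broadcast_join(fact_rows, dim_rows, fact_key, dim_key):
    """Join a large 'fact' iterable against a small 'dim' table.

    The small side is materialized once into a dict (the 'broadcast'),
    so the large side can be streamed row by row without a shuffle.
    Inner-join semantics: fact rows with no matching dim row are dropped.
    """
    lookup = {row[dim_key]: row for row in dim_rows}  # the broadcast side
    for row in fact_rows:
        dim = lookup.get(row[fact_key])
        if dim is not None:
            # merge dim columns in, skipping the duplicate join key
            yield {**row, **{k: v for k, v in dim.items() if k != dim_key}}

# Hypothetical example tables
sales = [{"order_id": 1, "cust_id": "a"}, {"order_id": 2, "cust_id": "b"}]
customers = [{"cust_id": "a", "region": "south"},
             {"cust_id": "b", "region": "north"}]

joined = list(broadcast_join(sales, customers, "cust_id", "cust_id"))
```

In PySpark itself the analogous tool is the `pyspark.sql.functions.broadcast()` hint applied to the small DataFrame in a `join`, which asks the optimizer to replicate that side to every executor instead of shuffling it.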
Posted 1 month ago