
Systems Engineer

About Intellicar Telematics Pvt Ltd
Intellicar Telematics Private Limited is a vehicular telematics organization founded in 2015 with the vision of connecting businesses and customers to their vehicles in a meaningful way. We provide vehicle owners with the ability to connect and diagnose vehicles remotely in real time. Our team consists of individuals with in-depth knowledge and understanding of automotive engineering, driver analytics and information technology. By leveraging our expertise in the automotive domain, we have created solutions to reduce operational and maintenance costs of large fleets, and ensure safety at all times.

Solutions:
- Enterprise fleet management, GPS tracking
- Remote engine diagnostics, driver behavior & training
- Technology integration: GIS, GPS, GPRS, OBD, WEB, accelerometer, RFID, on-board storage

Intellicar's team of accomplished automotive engineers, hardware manufacturers, software developers and data scientists has developed the best solutions to track vehicles and drivers, and ensure optimum performance, utilization and safety at all times.

We cater to the needs of our clients across various industries such as: self-drive cars, taxi cab rentals, taxi cab aggregators, logistics, driver training, bike rentals, construction, e-commerce, armored trucks, manufacturing, dealerships and more.
Desired skills as a developer:
- Education: BE/B.Tech in Computer Science or related field.
- 4+ years of experience with scalable distributed systems applications and building scalable multi-threaded server applications.
- Strong programming skills in Java or Scala on Linux or a Unix-based OS.
- Understanding of distributed systems like Hadoop, Spark, Cassandra, Kafka.
- Good understanding of HTTP, SQL and database internals.
- Good understanding of the Internet and how it works.
- Create new features from scratch, enhance existing features and optimize existing functionality, from conception and design through testing and deployment.
- Work on projects that make our network more stable, faster and more secure.
- Work with our development QA and system QA teams to come up with regression tests that cover new changes to our software.

Role:
The role of lead is not a textbook checklist. However, there are technical responsibilities that a team must fulfil, and we expect the tech lead to ensure these responsibilities are covered and to be able to cover them themselves if needed.
We expect tech leads to take a collaborative approach to leading their team. This is especially important when considering the amount of experience that each of our consultants brings. Given this, we expect you to:
● Respect the other members of your team and recognise you don’t always know best.
● Spot gaps in team capability and figure out how to fix them as a team.
● Be hands-on, able and willing to contribute to development; however, don't expect to be coding all of the time.
● Encourage the team to be proactive, give them responsibility.
Responsibilities
● Have a clear understanding of the deployment architecture
● Have a clear understanding of the build pipeline
● Understand how you get changes into production
● Understand how all parts of the system work together
● Facilitate technical communication with other teams, both within your engagement and across other EE clients.
● Actively seek to remove knowledge silos within the team
● Ensure you have a release / branching strategy in place
● Act as the primary point of contact for your team when communicating with other teams
● Ensure there is a technical vision for the team
● Liaise with environment specialists to ensure smooth deployments to production
● Encourage the team to follow good development practices aligned to EE technical values
● Give feedback to the delivery lead or engagement manager on the quality of your team (good and bad)
● Recognise team members that have the potential to grow into team leads
● Ensure the use of new technologies or dependencies does not block the team.
● Ensure the team keeps necessary architectural documents up to date
● Keep an eye on the long term consequences of architectural choices, and remind others when necessary
● Build good relationships with your team members. Act as a mentor when required
● Keep the client informed and engaged in the technical side of the project
● Build relationships across your client community
Technologies / Experience
The successful candidate must have the following experience:
● Worked as the tech lead of a development/delivery team in a large organisation
● Have worked with a variety of different technical architect roles
● Be deeply proficient in at least one programming language
● Be comfortable using other languages, with evidence of having used multiple languages
● Have hands-on experience with some form of configuration management tooling, e.g. Ansible, Chef, Puppet
● Have hands-on experience of at least one continuous integration and continuous delivery technology, e.g. Jenkins, Go, Team City or Bamboo.
● Full stack development experience from user interface through to data persistence
● A strong proponent of XP practices such as TDD
● Working with a delivery team to formulate an automated test strategy
● Worked as part of a number of agile delivery teams and seen a number of different approaches to delivery
● Good appreciation of secure coding practices and end to end system security
The following exposure will also be looked on favorably:
● Performing an ‘architect’ role, while retaining hands-on involvement
● Working with cloud hosting platforms such as AWS, Rackspace, Azure etc.
● Infrastructure management technologies such as Cloud Formation or Terraform

We are looking for a passionate Software Engineer to design, develop and scale software solutions.
Software Engineer responsibilities include gathering user requirements, defining system functionality and writing code in various languages, like JavaScript, Python, Scala, Java.
Our ideal candidates are familiar with the software development lifecycle (SDLC) from preliminary system analysis to tests and deployment.
Ultimately, the role of the Software Engineer is to build high-quality, innovative and fully performing software that complies with coding standards and technical design.
RESPONSIBILITIES
Execute full software development life cycle
Develop flowcharts, layouts, and documentation to identify requirements and solutions
Write well-designed, testable code
Produce specifications and determine operational feasibility
Integrate software components into a fully functional software system
Develop software verification plans and quality assurance procedures
Document and maintain software functionality
Troubleshoot, debug and upgrade existing systems
Deploy programs and evaluate user feedback
Comply with project plans and industry standards
Ensure software is updated with the latest features
REQUIREMENTS
Proven work experience as a Software Engineer or Software Developer
Prior work experience of 1-2 years is welcome
Experience designing interactive applications
Understanding of algorithms and data structures
Ability to develop software in JavaScript, Python, Scala, Java or other programming languages
Excellent knowledge of databases; SQL and NoSQL technologies are a plus
Experience in developing web applications using at least one popular web-framework is a plus
Experience with test-driven development
Proficiency in software engineering tools
Ability to document requirements and specifications
Experience with Data Science is a plus
University/college degree in Computer Science, Engineering or relevant field
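The test-driven development requirement above can be illustrated with a minimal sketch. The `slugify` function and its test cases are invented for this example and are not part of the role; in the red-green cycle, the test methods would be written first and fail, and `slugify` would then be implemented to make them pass.

```python
import re
import unittest

def slugify(title):
    """Lowercase a title and collapse runs of non-alphanumeric
    characters into single hyphens."""
    return re.sub(r"[^a-z0-9]+", "-", title.lower()).strip("-")

class TestSlugify(unittest.TestCase):
    # In TDD, these tests exist before slugify() is implemented.
    def test_lowercases(self):
        self.assertEqual(slugify("Hello"), "hello")

    def test_collapses_separators(self):
        self.assertEqual(slugify("Hello,  World!"), "hello-world")

# Run the suite programmatically so the result can be inspected
suite = unittest.defaultTestLoader.loadTestsFromTestCase(TestSlugify)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```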

Greetings from Unify Technologies!
We are looking for great talent with hands-on Scala (or Python or Java) programming experience with Spark and Big Data, for one of our top projects with a leading product development company in technology. We are sure that this experience will help scale up your career as well!
Please find below the JD for your quick reference.
What are we looking for: We are looking for candidates who have a keen interest in security, privacy, scalability and performance, cater to customer experience and pay attention to details. You'll be part of the team that develops software, builds automated tests, and does release and reliability engineering for extraordinary frontend and backend systems scaling to billions of users and devices.
Key Qualifications:
Scala Programming or Excellent Java Programming including Web Services
Hadoop Big Data development, with data sizes of 500 to 800 TB
Individual contributors with strong coding skills
Location of work: Hyderabad
Member of HackerRank, HackerEarth, Stack Overflow, etc.
Proficiency with Big Data processing technologies (Hadoop, Spark, Oozie).
Experience in building data pipelines and analysis tools using Scala, Java, Python
Experience building large-scale server-side systems with distributed processing algorithms.
Aptitude to independently learn new technologies.
Strong problem solving skills
Excellent oral and written English communication skills
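The data-pipeline experience listed above can be sketched in miniature with plain Python. The tab-delimited log format and field names are invented for this illustration; a production pipeline would express the same extract-transform-aggregate shape in Spark or Hadoop.

```python
from collections import Counter

def count_events_per_user(lines):
    """Tiny extract-transform-aggregate pipeline: parse tab-delimited
    log lines of the form "user<TAB>event" and count events per user."""
    counts = Counter()
    for line in lines:
        user, _event = line.rstrip("\n").split("\t")  # extract fields
        counts[user.lower()] += 1                      # normalize + aggregate
    return dict(counts)

logs = ["U1\tclick", "u1\tview", "U2\tclick"]
per_user = count_events_per_user(logs)  # {"u1": 2, "u2": 1}
```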
Our Company: Unify Technologies
Our Website: http://unifytech.com/
LinkedIn: https://www.linkedin.com/company/9206998
Offices in: Gurgaon, Pune, Hyderabad - India, and Seattle-USA
Industry/Domain: Cloud/Product - Cloud Automation, Data Engineering, Mobile
Few words about Unify Technologies: Unify is a pioneer in developing technology solutions towards imparting greater value and creating collaboration amongst global businesses. Unify leads the way in changing conventional wisdom to assure greater returns on investments made. Unify helps customers focus on their business while taking care of their software needs, with a global strategy to transform their company.
Employment Type: Full-Time
Joining time: Immediate to 30 days
Work Location: Hyderabad - India
Education: Bachelor's degree or equivalent in Computer Science or other related fields from reputed colleges
Job Summary: A job at Unify is inspired and innovative. Do you enjoy working on unique and challenging problems? Our project, Apple's Enterprise Technology Services (ETS), needs engineers to be part of internet-scale systems and platforms that power all of Apple's enterprise applications and customer-facing products, including iCloud, iTunes, Retail and Online stores. Kindly check the detailed JD and company details below. Let us know your interest in pursuing this position.
Thanks,
Sudheendra Srinivasan
Lead Recruiter | 91 333 73693
unifytech.com

Work with developers to design algorithms and flowcharts
Prepare GUI dummy screens for proposed software development using Excel VBA (to give an overview of how the software buttons and flow of information should work)
Coordinate with the software developer team to explain the criteria
Produce clean, efficient code based on specifications
Integrate software components and third-party programs
Verify and deploy programs and systems
Troubleshoot, debug and upgrade existing software
Gather and evaluate user feedback
Recommend and execute improvements
Create technical documentation for reference and reporting
Proven experience as a Software Developer, Software Engineer or similar role
Familiarity with development methodologies
Experience with software design and development in a test-driven environment
Knowledge of coding languages (e.g. C#, C++) and frameworks/systems
Ability to learn new languages and technologies
Excellent communication skills
Resourcefulness and troubleshooting aptitude
Attention to detail
Sound technical knowledge, with thorough knowledge of all related codes and section details, is desired.
Thorough knowledge of the design of components of residential/commercial structures is desired.
Accuracy in following the process and jobs is required.
Experience in interaction with international clients will be preferred.

Greetings from Intelliswift!
Intelliswift Software Inc. is a premier software solutions and services company headquartered in Silicon Valley, with offices across the United States, India, and Singapore. The company has a proven track record of delivering results through its global delivery centers and flexible engagement models for over 450 brands, ranging from Fortune 100 to growing companies. Intelliswift provides a variety of services including Enterprise Applications, Mobility, Big Data/BI, Staffing Services, and Cloud Solutions. Growing at an outstanding rate, it has been recognized as the second-largest private IT company in the East Bay.
Domains: IT, Retail, Pharma, Healthcare, BFSI, and Internet & E-commerce
Website: https://www.intelliswift.com/
Experience: 4-8 Years
Job Location: Chennai
Job Description:
Skills: Spark, Scala, Big data, Hive
· Strong working experience in Spark, Scala, big data, HBase and Hive.
· Should have good working experience in SQL and Spark SQL.
· Good to have knowledge of or experience in Teradata.
· Familiar with general engineering tools: Git, Jenkins, sbt, Maven.

Interested in building high-performance search systems to handle petabytes of retail data, while working in an agile, small-company environment? At CodeHall Technologies, you will have the opportunity to work with the newest technology in Search and Browse. We are working on systems that power and personalize site search, considering the user intent for every query and providing a wholly unique search experience that is engaging, designed to display the most relevant results through findability.

Primary responsibilities:
- Building high-performance search systems for personalization, optimization and targeting
- Building systems with Hadoop, Solr, Cassandra, Flink, Spark, MongoDB
- Deep understanding of HTTP and REST principles
- Good diagnostic and troubleshooting skills
- Unit testing with JUnit, performance testing and tuning
- Working with rapid and innovative development methodologies like Kanban, continuous integration and daily deployments
- Highly proficient software engineering skills in Java
- Coordination with internal and external teams
- Mentoring junior engineers
- Participating in product design discussions and decisions

Minimum requirements:
- BS/MS in CS, Electrical Engineering or foreign equivalent, plus relevant software development experience
- At least 5-8 years of software development experience
- Expert in Java, Scala or any other object-oriented language
- Proficient in SQL concepts (HiveQL or Postgres a plus)
- Additional language skills for scripting and rapid application development

Desired skills and experience:
- Working with large data sets in the PBs
- Familiarity with UNIX (systems skills a plus)
- Working experience in Solr, Cassandra, MongoDB and Hadoop
- Working in a distributed environment, having dealt with challenges around scaling and performance
- Proven ability to project and meet scheduled deadlines
- Self-driven, quick learner with attention to detail and quality

• 2-4 years of strong experience in Java, Scala, JavaScript, Node.js or Python, writing performant, scalable and unit-tested code
• Good object-oriented design skills, and knowledge of design patterns
• Working knowledge of Angular JS
• Proven commitment to quality and an ability to create maintainable and extensible code
• Experience in working with Agile software methodologies
• Good experience working with relational databases such as MariaDB/MySQL
• Proficient working in a Linux or UNIX environment
• Ability to maintain a balance between working independently and in collaboration with internal stakeholders
• Experience with Nginx, Tomcat, Redis, Cassandra, Zookeeper, ActiveMQ and Hadoop is a plus
• Bachelor's or Master's in Computer Science engineering or a related discipline

At Equal Experts we are a network of talented, experienced software consultants specialising in Agile delivery.
So, what do we do in our regular day at EE? We indulge in all things that would excite you!
Like:
● Work on large-scale, custom distributed software systems using Java, Scala, C#/.NET, MongoDB, Neo4j, Groovy, AngularJS, ReactJS, Cucumber and the like
● Be responsible for the quality of software and resolving any issues regarding client satisfaction
● Employ Agile development including task estimation, test automation, deployment automation and Continuous Integration to improve overall execution speed and product quality
● Work in a dynamic, collaborative, transparent, non-hierarchal, and ego-free culture where your talent is valued over a role title
● Spread the word about best practices in software development inside and outside Equal Experts community
● Speak at conferences like Experts Talk and others
● Learn something new every day, write blogs
● We work almost exclusively on customer site providing a mix of delivery and consulting services, so you'll be flexible about travel.
Here is what we would like you to bring:
● Development and delivery experience with Java, .NET, Scala and the like
● Passion for software engineering and craftsman-like coding prowess
● Great OO skills, including strong design patterns knowledge
● Experience working with Agile, Lean and/or Continuous Delivery approaches and best practices, such as Extreme Programming (XP)
● Keen to work collaboratively with people, sharing your ideas to solve real business problems.

Job Description
In this role you will help us build, improve and maintain our huge data infrastructure, where we collect TBs of logs daily. Data-driven decision-making is crucial to the success of our customers, and this role is central to ensuring we have a cutting-edge data infrastructure to do things faster, better, and cheaper!
Experience
1 - 3 Years
Required Skills
- Must be a polyglot with good command over Java, Scala and a scripting language
- Non-trivial project experience in distributed computing frameworks like Apache Spark/Hadoop/Pig/Kafka/Storm, with sound knowledge of their internals
- Expert knowledge of relational databases like MySQL, and in-memory data stores like Redis
- Regular participation in coding/hacking contests like TopCoder, Code Jam and Hacker Cup is a huge plus
Prerequisites
- Strong analytical skills and a solid foundation in Computer Science fundamentals, especially data structures/algorithms, object-oriented principles, operating systems and computer networks
- Ability and willingness to take ownership and work under minimum supervision, independently or as part of a team
- Passion for innovation and a "never say die" attitude
- Strong verbal and written communication skills
Education
B.Tech/M.Tech/MS/dual degree in Computer Science with above-average academic credentials

Responsibilities:
You will interact directly with colleagues across all responsibility areas and with the Director of Engineering. The successful candidate for this position:
- Designs and implements well-architected and scalable solutions
- Collaborates with various teams in releasing high-quality software
- Performs code reviews and contributes to healthy coding conventions
- Assists in integration with customer systems
- Provides timely responses to internal technical questions
- Demonstrates leadership skills in navigating through tense periods and keeping calm
Our Culture:
- Integrity and motivation are more important than skill and experience
- Cross-company team building and collaboration
- Diverse background and highly talented & passionate group of individuals
Ideal Candidate:
The ideal candidate is a senior engineer having substantial development experience and high standards for code quality & maintainability.
Basic Qualifications:
- 4-year degree in Computer Science or Computer Engineering
Preferred Qualifications:
- 5+ years of development experience
- Experience in Java or Scala
- Experience with all parts of SDLC including CI/CD and testing methodologies
- Experience in working with NoSQL technologies and message queue management
- Self-motivated and able to work with minimum guidance.
- Experience in a startup or rapid-growth product or project
- Comfortable with modern version control, and agile development
Bonus Points:
- Experience in working with micro-services, containers or big data technologies
- Working knowledge of cloud technologies like GCE and AWS
- Writes blog posts and has a strong record on StackOverflow and similar sites

Kreyon Systems is looking for software developers for amazing international projects. Candidates should be able to develop custom applications on ASP.NET 4.0 and C#. Good DB knowledge of SQL Server 2008 and reporting on RDLC etc. will be preferred. Hands-on knowledge of MVC, Entity Framework 4+, code-first, MVVM, WCF, design patterns, jQuery and JavaScript.
Exposure to / working knowledge of cloud-based applications. Must be creative and should produce outstanding designs.
Basic Qualifications:
Desired Candidate Profile:
• Ability to develop end-to-end solutions.
• Good work experience with ASP.NET 4.0+, SQL Server, jQuery/JavaScript, and strong programming fundamentals.
• Azure and mobile programming experience will be helpful.
• Candidates with the ability to solve problems will be preferred.
• This will be a promising opportunity for the right candidates.
Skills:
• ASP.NET 4.0+, SQL Server, LINQ, MVC, code-first, jQuery and JavaScript
Note:
o This position is based in Jabalpur development center.
o You should have a great passion for your work for this position.
o Only candidates who are willing to work at Jabalpur Development Center for Kreyon Systems will be screened for interviews.

It is my pleasure to introduce you to IDEAS2IT Technologies, Chennai. If you are looking for a challenging position as a Big Data engineer, solving complex business problems by applying the latest in data science, machine learning and AI, read on.

Ideas2IT is a high-end product engineering firm that rolls out its own products and also helps Silicon Valley firms with their product engineering. We are looking for above-average programmers to be part of our Data Science Lab. You will be working on projects like:
- An AI platform built using Google TensorFlow for a predictive hiring product
- A betting odds platform that matches offered odds to leverage spreads
- A PPO platform for predictive pricing and promotions for enterprise eCommerce

Part of your tool set will be Google TensorFlow, Python ML frameworks, Apache Spark, R, Google BigQuery, Scala/Octave, Kafka and so on. If you have any relevant experience, great! If not, it doesn't matter. We believe in hiring people with high IQ and the right attitude over ready-made skills. As long as you are passionate about building world-class enterprise products and understand whatever technology you are working on in depth, we will bring you up to speed on all the technologies we use. Oh, BTW, did we mention that you need to be super smart?

Sounds interesting? Ideas2IT is a high-end product firm. Started by an ex-Googler, Murali Vivekanandan, we count Siemens, Motorola, eBay, Microsoft and Zynga among our clients. We solve some very interesting problems in the USA startup ecosystem and have created great products in the process. When we build, we build great! We actively contribute to open source projects. We've built our own frameworks. We're betting the house on Big Data, and with a Stanford grad leading the team, we're sure to win. We rolled two of our products into separate companies last year and raised institutional funds: PipeCandy and Idearx.

ITTStar Global Services is a subsidiary unit in Bengaluru, with its head office in Atlanta, Georgia. We are primarily into data management and data life cycle solutions, which include machine learning and artificial intelligence. For further info, visit ITTstar.com.
As discussed over the call, I am forwarding the job description.
We are looking for enthusiastic and experienced data engineers to be part of our bustling team of professionals for our Bengaluru location.
JOB DESCRIPTION:
1. Experience in Spark & Big Data is mandatory.
2. Strong programming skills in Python/Java/Scala/Node.js.
3. Hands-on experience handling multiple data types: JSON/XML/delimited/unstructured.
4. Hands-on experience working with at least one relational and/or NoSQL database.
5. Knowledge of SQL queries and data modeling.
6. Hands-on experience working on ETL use cases, either on-premise or in the cloud.
7. Experience with any cloud platform (AWS, Azure, GCP, Alibaba).
8. Knowledge of one or more AWS services like Kinesis, EC2, EMR, Hive integration, Athena, Firehose, Lambda, S3, Glue Crawler, Redshift, RDS is a plus.
9. Good communication skills and self-driven: should be able to deliver projects with minimum instructions from the client.
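Item 3 above, handling JSON, XML and delimited data, can be sketched with Python's standard library alone. The record fields and sample values here are invented for illustration:

```python
import csv
import io
import json
import xml.etree.ElementTree as ET

def parse_json(text):
    # JSON: one record as an object
    return json.loads(text)

def parse_delimited(text, delimiter=","):
    # Delimited: first row is the header, one dict per data row
    return list(csv.DictReader(io.StringIO(text), delimiter=delimiter))

def parse_xml(text):
    # XML: one child element per field under the root
    root = ET.fromstring(text)
    return {child.tag: child.text for child in root}

json_rec = parse_json('{"id": 1, "city": "Bengaluru"}')
csv_recs = parse_delimited("id,city\n1,Bengaluru")
xml_rec = parse_xml("<rec><id>1</id><city>Bengaluru</city></rec>")
```

Note that the delimited parser yields strings for every field, while JSON preserves numeric types; in a real ETL job the transform step would normalize these into one schema.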

All of our team members are expected to learn, learn, and learn! We are working on cutting edge technologies and areas of artificial intelligence that have never been explored before. We are looking for motivated software engineers with strong coding skills that want to work on problems and challenges they have never worked on before. All of our team members wear multiple hats so you will be expected to simultaneously work on multiple aspects of the products we ship.
Responsibilities
* Participate heavily in brainstorming of system architecture and feature design
* Interface with external customers and key stakeholders to understand and document design requirements
* Work cross functionally with Engineering, Data Science, Product, UX, BD, and Infrastructure teams
* Drive best coding practices across the company (e.g. documentation, code reviews, coding standards, etc.)
* Perform security, legal, and license reviews of committed code
* Complete projects with little or no supervision from senior leadership
Required Qualifications
* Built and deployed customer-facing services and products at scale
* Developed unit and integration tests
* Worked on products where experimentation and data science are core to development
* Experience with large-scale distributed systems that have thousands of microservices and manage millions of transactions per day
* Solid instruction-level understanding of Object Oriented design, data structures, and software engineering principles
* Must have 4+ years of experience in front-end and back-end web development with the following tools: Python, Scala, Apache Tomcat, Django, HTML5, CSS, NodeJS, AWS or Azure, Java or C/C++, MySQL
Desired Experience/Skills
* You have a strong background in natural language processing, statistical modeling, and/or machine learning
* Experience with the following tools: Google Cloud Platform, Objective C/Swift
* Experience with open source projects in a startup environment
* BE, B.Tech or M.Tech in Computer Science, Information Technology, or E&C

RESPONSIBILITIES:
1. Full ownership of tech, right from driving product decisions to architecture to deployment.
2. Develop cutting-edge user experiences and build cutting-edge technology solutions like instant messaging in poor networks, live discussions, live videos and optimal matching.
3. Use billions of data points to build a user personalization engine.
4. Build a data network effects engine to increase engagement & virality.
5. Scale the systems to billions of daily hits.
6. Deep dive into performance, power management, memory optimization & network connectivity optimization for the next billion Indians.
7. Orchestrate complicated workflows, asynchronous actions and higher-order components.
8. Work directly with Product and Design teams.

REQUIREMENTS:
1. Should have hacked some (computer or non-computer) system to your advantage.
2. Built and managed systems with a scale of 10Mn+ daily hits.
3. Strong architectural experience.
4. Strong experience in memory management, performance tuning and resource optimization.
5. PREFERENCE: if you are a woman, an ex-entrepreneur, or have a CS bachelor's degree from IIT/BITS/NIT.

P.S. If you don't fulfil one of the requirements, you need to be exceptional in the others to be considered.

Description
Deep experience and understanding of Apache Hadoop and surrounding technologies required; experience with Spark, Impala, Hive, Flume, Parquet and MapReduce.
- Strong understanding of development languages including Java, Python, Scala and shell scripting
- Expertise in Apache Spark 2.x framework principles and usage
- Should be proficient in developing Spark batch and streaming jobs in Python, Scala or Java
- Should have proven experience in performance tuning of Spark applications, both from an application code and a configuration perspective
- Should be proficient in Kafka and its integration with Spark
- Should be proficient in Spark SQL and data warehousing techniques using Hive
- Should be very proficient in Unix shell scripting and in operating on Linux
- Should have knowledge of cloud-based infrastructure
- Good experience in tuning Spark applications and performance improvements
- Strong understanding of data profiling concepts and the ability to operationalize analyses into design and development activities
- Experience with best practices of software development: version control systems, automated builds, etc.
- Experienced in, and able to lead, the following phases of the Software Development Life Cycle on any project: feasibility planning, analysis, development, integration, test and implementation
- Capable of working within a team or as an individual
- Experience creating technical documentation

Simplilearn.com is the world’s largest professional certifications company and an Onalytica Top 20 influential brand. With a library of 400+ courses, we've helped 500,000+ professionals advance their careers, delivering $5 billion in pay raises. Simplilearn has over 6500 employees worldwide and our customers include Fortune 1000 companies, top universities, leading agencies and hundreds of thousands of working professionals. We are growing over 200% year on year and having fun doing it.
Description
We are looking for candidates with strong technical skills and a proven track record in building predictive solutions for enterprises. This is a very challenging role and provides an opportunity to work on developing insights-based Ed-Tech software products used by a large set of customers across the globe. It provides an exciting opportunity to work on various advanced analytics & data science problem statements using cutting-edge modern technologies, collaborating with product, marketing & sales teams.
Responsibilities
• Work on enterprise level advanced reporting requirements & data analysis.
• Solve various data science problems such as customer engagement, dynamic pricing, lead scoring, NPS improvement, optimization, chatbots etc.
• Work on data engineering problems utilizing our tech stack - S3 Datalake, Spark, Redshift, Presto, Druid, Airflow etc.
• Collect relevant data from source systems, and use crawling and parsing infrastructure to put together data sets.
• Craft, conduct and analyse A/B experiments to evaluate machine learning models/algorithms.
• Communicate findings and take algorithms/models to production with ownership.
Desired Skills
• BE/BTech/MSc/MS in Computer Science or related technical field.
• 2-5 years of experience in advanced analytics discipline with solid data engineering & visualization skills.
• Strong SQL skills and BI skills using Tableau & ability to perform various complex analytics in data.
• Ability to propose hypothesis and design experiments in the context of specific problems using statistics & ML algorithms.
• Good overlap with modern data processing frameworks such as AWS Lambda and Spark, using Scala or Python.
• Dedication and diligence in understanding the application domain, collecting/cleaning data and conducting various A/B experiments.
• A Bachelor's degree in Statistics, or prior experience in Ed-Tech, is a plus.

Couture.ai is building a patent-pending AI platform targeted at vertical-specific solutions. The platform is already licensed by Reliance Jio and a few European retailers to power real-time experiences for their combined base of more than 200 million end users.
For this role, credible display of innovation in past projects (or academia) is a must. We are looking for a candidate who lives and breathes data and algorithms, loves big data engineering, and is hands-on with Apache Spark, Kafka, RDBMS/NoSQL databases, big data analytics, and Unix-based production servers.
A Tier-1 college background (BE from the IITs, BITS Pilani, top NITs or IIITs, or an MS from Stanford, Berkeley, CMU or UW–Madison) or an exceptionally strong work history is a must. Let us know if you are interested in exploring this profile further.

Couture.ai is building a patent-pending AI platform targeted at vertical-specific solutions. The platform is already licensed by Reliance Jio and a few European retailers to power real-time experiences for their combined base of more than 200 million end users.
The founding team consists of BITS Pilani alumni with experience creating global startup success stories. The core team we are building consists of some of the best minds in India in artificial intelligence research and data engineering.
We are looking to fill multiple roles requiring 2-7 years of research or large-scale production implementation experience, with:
- Rock-solid algorithmic capabilities.
- Production deployments for massively large-scale systems, real-time personalization, big data analytics, and semantic search.
- Or credible research experience in innovating new ML algorithms and neural nets.
A GitHub profile link is highly valued.
For the right fit into the Couture.ai family, compensation is no bar.

Couture.ai is building a patent-pending AI platform targeted at vertical-specific solutions. The platform is already licensed by Reliance Jio and a few European retailers to power real-time experiences for their combined base of more than 200 million end users.
For this role, credible display of innovation in past projects is a must.
We are looking for hands-on leaders in data engineering with 5-11 years of research or large-scale production implementation experience, with:
- Proven expertise in Spark, Kafka, and Hadoop ecosystem.
- Rock-solid algorithmic capabilities.
- Production deployments for massively large-scale systems, real-time personalization, big data analytics and semantic search.
- Expertise in Containerization (Docker, Kubernetes) and Cloud Infra, preferably OpenStack.
- Experience with Spark ML, Tensorflow (& TF Serving), MXNet, Scala, Python, NoSQL DBs, Kubernetes, ElasticSearch/Solr in production.
A Tier-1 college background (BE from the IITs, BITS Pilani, IIITs, top NITs, DTU or NSIT, or an MS from Stanford, UC, MIT, CMU, UW–Madison, ETH or other top global schools) or an exceptionally strong work history is a must.
Let us know if you are interested in exploring this profile further.

Job Title:
Distributed Systems Engineer - SDET
Job Location:
Pune, India
Job Description:
Are you looking to put your computer science skills to use? Are you looking to work for one of the hottest start-ups in Silicon Valley? Are you looking to define the next generation data management platform based on Apache Spark? Are you excited by the idea of being a Spark committer?
If you answered yes to all of the questions above, we definitely want to talk to you. We are looking to add highly motivated engineers to work as a QE software engineer in our product development team in Pune. We work on cutting edge data management products that transform the way businesses operate.
As a distributed systems engineer (if you are good), you will get to work on defining key elements of our real-time analytics platform, including:
1. Distributed in memory data management
2. OLTP and OLAP querying in a single platform
3. Approximate Query Processing over large data sets
4. Online machine learning algorithms applied to streaming data sets
5. Streaming and continuous querying
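Item 4 above, online machine learning applied to streaming data, can be illustrated with a tiny single-feature linear regression updated one event at a time by stochastic gradient descent. This is a generic sketch of the idea, not SnappyData's implementation; all names and numbers are illustrative:

```java
// Illustrative sketch of an online (streaming) learner: one-feature linear
// regression y = w*x + b, updated per event with stochastic gradient descent.
// A toy example of the concept, not SnappyData's actual streaming ML code.
public class OnlineRegression {
    double w = 0.0, b = 0.0;
    final double lr; // learning rate

    OnlineRegression(double lr) { this.lr = lr; }

    // Consume one (x, y) event from the stream and update the model in place.
    void observe(double x, double y) {
        double err = (w * x + b) - y;   // prediction error on this event
        w -= lr * err * x;              // gradient step for the slope
        b -= lr * err;                  // gradient step for the intercept
    }

    double predict(double x) { return w * x + b; }

    public static void main(String[] args) {
        OnlineRegression model = new OnlineRegression(0.01);
        // Simulated stream drawn from y = 3x + 1, no noise.
        for (int i = 0; i < 10000; i++) {
            double x = (i % 100) / 10.0;   // x cycles through [0, 9.9]
            model.observe(x, 3 * x + 1);
        }
        System.out.printf("w = %.3f, b = %.3f%n", model.w, model.b);
    }
}
```

The key property for streaming is that each event is processed once, in constant time and memory, with no need to retain the raw history.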
Requirements:
1. Experience in testing modern SQL and NewSQL products is highly desirable
2. Experience with SQL language, JDBC, end to end testing of databases
3. Hands on Experience in writing SQL queries
4. Experience on database performance benchmarks like TPC-H, TPC-C and TPC-E a plus
5. Prior experience in benchmarking against Cassandra or MemSQL is a big plus
6. You should be able to program in Java or have some exposure to functional programming in Scala
7. You should care about performance, and by that, we mean performance optimizations in a JVM
8. You should be self-motivated and driven to succeed
9. If you are an open source committer on any project, especially an Apache project, you will fit right in
10. Experience working with Spark, SparkSQL, Spark Streaming is a BIG plus
11. Plan and author test plans, and ensure testability is considered by development at all stages of the life cycle.
12. Plan, schedule and track the creation of test plans and automation scripts, using defined methodologies for manual and/or automated tests.
13. Work as a QE team member in troubleshooting, isolating, reproducing and tracking bugs, and verifying fixes.
14. Analyze test results to ensure existing functionality still works, and recommend corrective action. Document test results, and manage and maintain the defect and test case databases to assist in process improvement and estimation of future releases.
15. Assess and plan the test effort required to automate new functions/features under development. Influence design changes to improve quality and feature testability.
16. If you have solved big complex problems, we want to talk to you
17. If you are a math geek, with a background in statistics, mathematics and you know what a linear regression is, this just might be the place for you
18. Exposure to stream processing systems such as Storm or Samza is a plus
Open source contributors: send us your GitHub ID
Product:
SnappyData is a new real-time analytics platform that combines probabilistic data structures, approximate query processing and in memory distributed data management to deliver powerful analytic querying and alerting capabilities on Apache Spark at a fraction of the cost of traditional big data analytics platforms.
SnappyData fuses the Spark computational engine with a highly available, multi-tenanted in-memory database to execute OLAP and OLTP queries on streaming data. Further, SnappyData can store data in a variety of synopsis data structures to provide extremely fast responses with fewer resources. Finally, applications can either submit Spark programs or connect using JDBC/ODBC to run interactive or continuous SQL queries.
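The synopsis data structures mentioned above trade exactness for space. A classic example is the count-min sketch, which answers approximate frequency queries over a stream using sublinear memory. The code below is a generic illustration of the idea, not SnappyData's implementation; all names are hypothetical:

```java
import java.util.Random;

// Generic count-min sketch: approximate per-item counts in sublinear space.
// Illustrates the kind of synopsis structure described above; this is NOT
// SnappyData's actual implementation.
public class CountMinSketch {
    final int depth, width;
    final long[][] table;   // depth rows of width counters
    final int[] seeds;      // one hash seed per row

    CountMinSketch(int depth, int width, long seed) {
        this.depth = depth;
        this.width = width;
        this.table = new long[depth][width];
        this.seeds = new int[depth];
        Random rnd = new Random(seed);
        for (int i = 0; i < depth; i++) seeds[i] = rnd.nextInt();
    }

    // Map an item to a counter index in the given row.
    int bucket(int row, Object item) {
        int h = item.hashCode() ^ seeds[row];
        h ^= (h >>> 16);                 // mix high bits into low bits
        return Math.floorMod(h, width);
    }

    void add(Object item) {
        for (int i = 0; i < depth; i++) table[i][bucket(i, item)]++;
    }

    // Estimated count: never below the true count, possibly above it.
    long estimate(Object item) {
        long min = Long.MAX_VALUE;
        for (int i = 0; i < depth; i++) min = Math.min(min, table[i][bucket(i, item)]);
        return min;
    }

    public static void main(String[] args) {
        CountMinSketch cms = new CountMinSketch(4, 1024, 42L);
        for (int i = 0; i < 100; i++) cms.add("spark");
        System.out.println("estimate(spark) = " + cms.estimate("spark") + " (true count 100)");
    }
}
```

Because hash collisions only ever increment counters, the estimate never undercounts; widening the table or adding rows tightens the overestimate.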
Skills:
1. Distributed Systems,
2. Scala,
3. Apache Spark,
4. Spark SQL,
5. Spark Streaming,
6. Java,
7. YARN/Mesos
What's in it for you:
1. Cutting edge work that is ultra meaningful
2. Colleagues who are the best of the best
3. Meaningful startup equity
4. Competitive base salary
5. Full benefits
6. Casual, Fun Office
Company Overview:
SnappyData is a Silicon Valley funded startup founded by engineers who pioneered the distributed in memory data business. It is advised by some of the legends of the computing industry who have been instrumental in creating multiple disruptions that have defined computing over the past 40 years. The engineering team that powers SnappyData built GemFire, one of the industry leading in memory data grids, which is used worldwide in mission critical applications ranging from finance to retail.

We are looking for a technically sound, excellent trainer on big data technologies. Get an opportunity to become well known in the industry and gain visibility. Host regular sessions on big data technologies and get paid to learn.

Scienaptic (www.scienaptic.com) is a new-age technology and analytics company based in NY and Bangalore. Our mission is to infuse robust decision science into organizations. Our mantra for achieving this mission is to reduce friction among technology, processes and humans. We believe that good design thinking needs to permeate all aspects of our activities so that our customers get the best possible aesthetic experience, with the least friction, from our software and services.
As a Principal Software Development Engineer, you will be responsible for the development and augmentation of the software components used to solve the analytics problems of large enterprises. These components are highly scalable, connect to multiple data sources and implement some complex algorithms.
We are funded by very senior and eminent business leaders in India and US. Our lead investor is Pramod Bhasin, who is known as a pioneer of ITES revolution. We have the working environment of a new age, cool startup. We are firm believers that the best talent grounds will be non-hierarchical in structure and spirit. We expect you to enjoy, thrive and empower others by progressing that culture.
Requirements :
- Candidate should have all round experience in developing and delivering large-scale business applications in scale-up systems as well as scale-out distributed systems.
- Identify the appropriate software technology / tools based on the requirements and design elements contained in a system specification
- Should implement complex algorithms in a scalable fashion.
- Work closely with product and Analytic managers, user interaction designers, and other software engineers to develop new product offerings and improve existing ones.
Qualifications/Experience :
- Bachelor's or Master's degree in computer science or related field
- 10 to 12 years of experience in core Java programming (JDK 1.7/1.8); familiarity with big data systems like Hadoop and Spark is an added bonus
- Familiarity with dependency injection (Guice/Spring) and concurrency
- Familiarity with JDBC API / Databases like MySQL, Oracle, Hadoop
- Knowledge of graph databases and traversal
- Knowledge of SOLR/ElasticSearch; cloud-based deployment experience preferred

We at InfoVision Labs are passionate about technology and what our clients would like to accomplish. We continuously strive to understand business challenges, the changing competitive landscape and how cutting-edge technology can help position our clients at the forefront of the competition. We are a fun-loving team of usability experts and software engineers, focused on mobile technology, responsive web solutions and cloud-based solutions.
Job Responsibilities:
◾Minimum 3 years of experience with Big Data skills required.
◾Complete life-cycle experience with Big Data is highly preferred.
◾Skills: Hadoop, Spark, R, Hive, Pig, HBase and Scala.
◾Excellent communication skills.
◾Ability to work independently with no supervision.

Crest (part of the Springer Nature group): Headquartered in Pune, Crest is a Springer Nature company that delivers cutting-edge IT and ITeS solutions to some of the biggest scientific content and database brands in the world. Our global teams work closely with our counterparts and clients in Europe, the USA and New Zealand, leveraging the latest technology, marketing intelligence and subject matter expertise. With handpicked SMEs in a range of sciences, and technology teams working on the latest ECM, Scala, SAP and MS Tech platforms, Crest not only develops quality STM content but continuously enhances the channels through which it is delivered to the world. Crest is an ISO 9001 certified company, driven by over 1000 professionals in Technology, Research & Analysis, and Marketing & BPM.
Specialties:
1. Technology
2. Research
3. Marketing Intelligence
4. Business Process Management

We are looking for entrepreneurial individuals who thrive on solving complex problems and want to build innovative products.
We are looking for candidates with the following qualifications:
Would prefer candidates with 4+ years of experience in development
Good knowledge of MVC and other design patterns
Good understanding and hands-on development experience in HTML/HTML5, CSS, Javascript, jQuery
Must be familiar with Object Oriented Design concepts
Experience working with AngularJS and/or Backbone, Ember, Knockout, RequireJS
2+ years of experience with relational/SQL databases like MySQL as well as NoSQL databases like MongoDB and CouchDB.
2+ years of experience with build and version-control tooling such as Maven, Ant, Git and Jenkins.
Well versed with Agile Development Methodologies including Scrum.
Hands on experience in BigData space (Hadoop Stack like M/R, HDFS, Pig, Hive, HBase, Flume, Sqoop, etc and NoSQL stores like Cassandra, HBase etc) is a Plus.
Experience with Adobe Photoshop/GIMP/Illustrator will be an advantage
Exposure to UI/JS/CSS frameworks such as LESS/Sass, Bootstrap, etc.
Expertise in producing cross browser compatible code and strong ability to troubleshoot JS / browser issues
Responsibilities
Develop clean, well-structured, easily maintainable code.
Design and develop front-end components in Java, JavaScript, HTML, CSS, iOS and Android.
Develop products that are reliable, scalable, and secure.
Fix bugs and troubleshoot complex problems in a timely and accurate manner.
Education Requirements
Bachelor's or Master's degree (Electrical & Computer Engineering or related fields) from reputed engineering colleges.

The Microsoft Office India team located in Hyderabad India (IDC) is building a set of next generation experiences.
• Are you fascinated by building highly scalable APIs on a reliable stack that can fall back from persistent connections to SMS?
• Can you build and run Services infrastructure that can scale to billions of transactions per day?
• Can you build UI infrastructure that can be extended in infinite ways?
We are part of the group whose mission is to reimagine productivity applications on mobile devices for emerging markets. A solid engineering culture, a fun set of people and solving tough problems are part of the deal and you will find it hard to say no to.
If you have the technical chops, we would love to hear from you.