There are a billion articles on ES6 at this point. What's one more? Here we discuss some emerging patterns and issues related to real world use of ES6 as well as how one can go about using it now via Babel. If you aren't yet familiar with the features and changes of ES6 itself, you'll probably want to check out the following links first:

MDN is invaluable. It provides systematic coverage of all JS, including ES6.

2ality isn't organized like MDN, but it boasts the finest collection of deep articles covering specific features and edge cases.

The online Babel REPL. This is fantastically useful to answer quick questions like ‘does this work?' and of course, ‘how?'

The final draft. Dry reading, but sometimes it's the only way to get an authoritative answer. Edifying if you stare at it long enough.

Where We're At

In April, the ES6 spec reached its final draft. Later this month, the Grand Council of Javascript Elders will shuffle into the silver sanctum to seal the document with unicorn wax. A glass bell will ring in the spire of the tallest tower in the City and leprechauns will be dispatched to carry the good news to the farthest corners of the Kingdom. ‘ES5 is dead, long live ES6!' they'll shout.

Unfortunately, that last part takes about ten years. Leprechauns aren't nearly as fast as Hollywood has led you to believe, and they're easily distracted.

On one hand, progress towards ES6 support in browsers has been rapid. If you've followed the Kangax Table for the last few months, that should be clear. Yeah, there's a lot of red yet, but look at Firefox 40 (66%), Chrome 45 (45%) and holy s- yes, that really is IE Edge, aka Project Spartan, at 63%. ‘Imagine there's no heaven...'

Note, these figures sometimes go down, too, as new tests are added to confirm implementation details. You can only precisely compare them at a given moment rather than over time.

On the other hand, we've all had too many work-years / firstborns stolen by spiteful, undying IE versions to place much stock in the idea that there's a corner we'll turn when suddenly it's totally cool to destructure an array in the browser. Well, you can polyfill Map, you can polyfill Array.from or whatever. But how do you polyfill syntax? ES6 isn't valid ES5 at a syntactic level. This is a new problem.

Age of Babel

You can't polyfill syntax - you can transpile it though. Babel has been tearing it up since the tail end of its days as ‘6to5' (when somebody must have realized it was too npm-big to not have a sexy name). Transpiling JS wasn't new (in fact we owe a lot of ES6 refinements to Coffeescript), nor was transpiling ES6 to ES5 (Traceur's been around a while). But npm download stats present a picture of Babel as now being the community's go-to ES6 transpiler. The line keeps angling upwards.

Babel might owe its popularity to good timing or the fact that they've hung onto the highest ‘Kangax Index' for a while. But that line probably owes its recent incline mostly to the fact that ES6 has been finalized. Suddenly it seems a lot less speculative to jump on board.

Now, many folks feel transpiling is icky:

1. Tougher to debug

2. Feels funny

But that said, some of those who were uneasy about Coffeescript seem to have fewer reservations about writing code that, in theory, won't need to be transpiled... someday. That may help with #2 anyway - but as for #1, even with sourcemaps, there's no denying that you're risking an extra maintenance burden by depending on a transpiler. Is it worth it?

Language Shapes Usage

The answer comes down to the ways ES6 could improve the quality of your code. An enumeration of new features / toys may not help much in determining that, because the real-world implications of those features aren't always immediately clear. It's through use that we develop a shared vocabulary, composed not of what the language's grammar dictates but of the patterns and preferences - the idiom - that makes it easier for us to work together and write reusable code. A language's features and syntax do shape the development of that idiom, and language designers take that into account: they're planting seeds. The available features and syntax will encourage or discourage particular habits and solutions.

At this point, a hazy image of how ES6 really gets used has begun to form. I've cataloged a handful of patterns I've seen emerging in the wild, tried to supply their rationales (as I see them), and supplemented these with some of the conclusions I've arrived at from working mainly in ES6 for a few months myself. As always: YMMV.

Variable Declaration

When first encountering let and const, a common reaction is to think of let as the new var - in fact, I've even seen an article called ‘Let Is The New Var.' The phrase shows up verbatim all over. But it isn't true: const is the new var.

Alright, that's not the truth, either. If we're talking about which is closer in behavior to var, indeed, that's let. Assuming your code isn't relying on hoisting and you don't declare variables in blocks, you could even switch them 1:1 and things would be fine, but that wouldn't be true with const.

You may wonder: where does var fit in? It doesn't. It's sort of de facto deprecated - like `with` was for years before they made it official. Hoisting was a language design error, and the benefits of lexical scope are the lure being used to guide us out of this particular problem zone.

The gut feeling for those of us used to var is that ‘vars' are variables and ‘constants' are ... not. True. But we never had constants, so we used (unsightly but distinctive) case conventions when we wanted to communicate that a particular identifier represented a "pre-supplied" value. It could be an enum value, a "plug in your string here to configure this" slot, etc. To Javascript devs, a ‘constant' was just a variable whose value was somehow hardcoded.

But the real ES const has nothing to do with ‘hardcodedness'. It just means that a binding is permanent for the duration of the scope in which it was declared. Go open some JS you wrote and review it, looking at variables. How many are ever reassigned after they're declared? And (here's the crux) - of those which aren't ever reassigned, for how many would such a reassignment, were it to be accidentally introduced, constitute an error? Right now that code doesn't say so. If it happens, there's no way to directly trace the problem back to a constraint that was never expressed.

With const, that code will fail if a reassignment occurs. The mistake can even be detected by static analysis, so you'll be told exactly where the offense occurred before the code even has a chance to run.
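
As a quick illustration (my own example, not from the article): at runtime, reassigning a const binding throws a TypeError, and any ES6-aware linter will flag the line statically.

```javascript
'use strict';

const limit = 10; // a binding we never intend to reassign

let caught;
try {
  limit = 20; // the accidental reassignment
} catch (err) {
  caught = err;
}

console.log(caught instanceof TypeError); // true
console.log(limit);                       // 10 - the binding is untouched
```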

So a simple pattern has appeared in real-world ES6, the logical consequence of these facts: use const except when let is expressly needed. It's a kind of defensive coding, which is something we don't see a lot of in fast and loose JS (or, as the Java dev behind me would say, ‘sloppy and inferior'). Fortunately it's a simple habit to pick up and it yields immediate benefits.

I suppose I should acknowledge that it's five characters. And that it therefore will not neatly align with four-space tabs. Once you've typed it that first time, it gets easier. I promise.

Lexical Scope, Blocks, and the End of the IIFE

Above we addressed lexical scope briefly. The largest idiomatic impact of lexical scope is that it makes an older idiomatic usage more or less obsolete: the IIFE.

The purpose of an IIFE (immediately-invoked function expression) was to provide a scope-for-hire. Before ES6, aside from a few odd edge cases, function scope was the only scope other than global. Node modules might be argued to afford a different kind of scope, but even the hidden innards of that system involve wrapping modules in IIFEs.

Some background if this is an unfamiliar term: there are function declarations and there are function expressions. A function declaration (hoisted, like var) is a type of statement. Expressions can be statements, but not the other way around. Any statement beginning with function will be a function statement, so it can't be anonymous and it can't be invoked in place. Since the object is to avoid polluting the current scope, you need to somehow ‘expressionize' the function. There are a variety of approaches to this. The most common is to parenthesize it; then it is a function expression inside a parenthesized expression, altogether being an expression statement. Other common choices are prefixing with the logical not operator ! (semantically abusive, aesthetically appealing) or the void operator (arguably more expressive, but relatively obscure).

The fact that there's no consensus about how to do this tells us a bit about IIFEs. The pattern is unavoidable, but it isn't really ‘acknowledged' by the language itself. And although we're accustomed to it, creating these functions, anonymous or not, is an indirect, unexpressive means to get some scope ‘real estate.' They're functions in name but not in, uh, spirit.

So in ES6, (function() { /*...*/ })(); becomes { /* ... */ }. Praise.

Block statements are familiar because we use them routinely as the ‘statement' part of control and loop statements like for and if, but it's easy to forget they are a type of statement in their own right. This is why a line starting with `{` begins a block statement, not an object literal.
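
A minimal sketch of a bare block used purely for scope (my own example): the const inside is invisible once the block closes, no function required.

```javascript
let status = 'outer';

{ // a bare block statement: a throwaway scope with no function in sight
  const secret = 42;
  status = `inner saw ${ secret }`;
}

console.log(status);        // 'inner saw 42'
console.log(typeof secret); // 'undefined' - the binding died with the block
```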

Perhaps this means that we'll see a return of the long-maligned (but harmless) statement label. A block statement (unless it belongs to a loop) can only use break with a label. I haven't actually seen this as a pattern in practice - just speculating.

A discussion of real usage should address common errors, too. As far as lexical scope goes, there's one I've seen a few times now. Sometimes people who follow the const-unless-let principle abandon it as soon as they get to for/of loops, apparently thinking the identifiers in the loop will be ‘reused' and are therefore let vars. This isn't the case - for/of and for/in loop scopes are unique per iteration, and they include their initializers. Thus, for (const char of str) console.log(char); is valid and, if char should be immutable (per-iteration), preferred. Note that that isn't true of the C-style for ;; loop, however.

Arrow as Default

Here's a snippet from a developer issue thread for V8. The title of the issue is ‘Implement arrow functions' and it dates from 2013, which is around 1838 AD in Javascript years:

What's wrong with the actual language construct? What benefit does it have to be able to write foo(x => x + 1); instead of foo(function(x) { return x + 1; });, other than saving a few bytes and losing verbosity (i.e. clarity) in the code?

The writer sort of had a good point. It just wasn't obvious to most folks what arrows were supposed to bring to the table. And they looked weird, which is what he or she is actually saying there. At this point, we're used to seeing them, and the argument now seems comically backwards (wait, which one has greater clarity?). But at the time, I'd have agreed.

Now, I consider arrow functions to be the "default" that one diverges from only as situationally required. I'll come clean here - the argument for this that I'm about to present is the product of my own experience, not an observed outside trend (which is what I've tried to stick to so far). Take it with a grain of salt.

The core behavioral difference between arrow-functions (AF) and function-functions (FF) concerns this. Many JS devs avoid using this because it's a pain. It was a pain - arrows fixed it.

Their utility as event handlers is pretty obvious - but that doesn't say much about why one might treat them as the norm. Well, I'd said that AFs fixed the this problem, but in truth, the choice between AFs and FFs is what's fixed it. We didn't just gain a way to express lexical this, we effectively gained a way to express variable this. After all, previously we would have used `function` for both, with lexical this approximated with aliases like `self`. Yet in the majority of cases, it simply doesn't matter: most functions in any given (average) project probably will make use of neither lexical nor variable this.

For functions where it doesn't matter, I find it reasonable to say one or the other should be ‘default' - otherwise you're choosing at random, and missing an opportunity to make your code clearer.

const says, "I am not redefinable". AF says, "my this is lexical" - but that's also a way of saying "my this is not redefinable." Since a contextual this is the special case, the thing you need to take care with and draw attention to, it stands to reason that the AF should be used for functions that don't make any use of this at all. Then function means a good deal more:
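
A sketch of how the convention reads in practice (my own example, not from the original): the FF marks the one place where this is contextual, while the inner arrow simply inherits it.

```javascript
const counter = {
  count: 0,
  // `function` is the flag: `this` matters here - it's the call receiver
  add: function (ns) {
    // the arrow has no `this` of its own; it sees add()'s receiver
    ns.forEach(n => { this.count += n; });
    return this.count;
  }
};

console.log(counter.add([ 1, 2, 3 ])); // 6
```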

To be fair, this isn't really that analogous to const/let. You won't get any benefit from static analysis or early errors; it's merely a convention. So you can just as readily argue that the reverse should be true - that lexical this should be considered the ‘special' case. Taking care to be consistent in this regard is more important than which way one chooses to be consistent.

Classes, Symbols and Object Literals

ES6 classes aren't really the totally new construct that they may appear to be (if they were, Babel wouldn't be able to transpile them). But it seems a bit much to just call them sugar. After all, they're doing a lot of (obnoxious) work for you and finally provide a singular consistent approach to defining constructors and their prototypes, along with inheritance, all at once and in a very clear way.

The usage trend of note - other than the fact that it's being used at all - is the use of symbolic property names to get something very close to private methods and properties. I think the jury is still out on whether this is something to be gung ho about. The benefit is encapsulation without having to create new scopes, but it can be argued that an overt concern with hiding things is best left to the sorts of languages where that's like, a thing.

The best use case for privacy-via-symbols, though, is to shadow accessors:

const $str = Symbol();

class ASCIIString {
  constructor(str = '') {
    this.content = str;
  }

  get content() {
    return this[$str];
  }

  set content(str) {
    str = String(str);

    for (const char of str) {
      if (!this.isValidChar(char))
        throw new Error(`Char "${ char }" is not valid.`);
    }

    this[$str] = str;
  }

  isValidChar(char) {
    return char.codePointAt(0) <= 0x7F;
  }
}

Unicode <3: In the above example, when we iterate over the characters and when we use codePointAt, astral plane characters work correctly.

Not the most realistic example, but you get the idea. There are a lot of great things you can do with accessors. I find them invaluable when writing libraries that benefit from a greater degree of opacity and need more aggressive guarding. However, you probably don't want to make getters too elaborate; accessors hide the fact that a property access may carry a real cost, so users of your API may not realize they ought to cache the value.

What class really delivers is slick, useful prototype inheritance. Making use of constructor inheritance is far more common in Node than in the browser, and that's partly because Node provided a consistent way to do it (util.inherits). You still had to expressly call the parent constructor by name and configure the prototype with Object.defineProperties, but it works and people used it. I believe that class and extends will have a similar effect, and that they also invite deeper inheritance patterns than we've been accustomed to, in particular because of the utility and clarity of super:
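
As a stand-in for the original embedded example, here's a minimal sketch of extends and super (class names are mine):

```javascript
class Animal {
  constructor(name) { this.name = name; }
  speak() { return `${ this.name } makes a sound`; }
}

class Dog extends Animal {
  // a same-name override; super reaches the parent's version with zero plumbing
  speak() { return `${ super.speak() } - a bark, specifically`; }
}

console.log(new Dog('Rex').speak()); // 'Rex makes a sound - a bark, specifically'
```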

I've found myself on occasion creating classes with inheritance chains three or four deep - something I never did before, mainly because the amount of boilerplate involved made it seem awkward, especially when a class only represented a small change from its parent. Now the syntax matches up with the reality of what we're doing, and it's turned out to be one of my favorite improvements.

With the exception of static, object literals now allow methods and accessors using the same syntax as class. It has a nice symmetry, and drives home the point that a class is really nothing more than a special sort of object in JS. If you want a ‘singleton' (without inheritance), the object literal remains a more direct means to implement that pattern than class.


Function Signatures and Binding Patterns

Destructuring has led to a number of new patterns. The first is the reimagining of the traditional ‘options object' argument:

constructor({ name, age, species='cat' }={}) { ... }

Default assignment in the options argument lets us drop a ton of awkward ‘this or this or this' variable assignments at the head of a function. Notice that the object has its own default there - you'll need to do this if you want the options argument itself to be optional.
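
Fleshing the signature above out into something runnable (the class and its properties are my own invention):

```javascript
class Pet {
  constructor({ name = 'unnamed', age = 0, species = 'cat' } = {}) {
    Object.assign(this, { name, age, species });
  }
}

console.log(new Pet({ name: 'Pixel' }).species); // 'cat'
console.log(new Pet().name); // 'unnamed' - the trailing `={}` made the whole argument optional
```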

Rest gets heavy use in method signatures when a child class method exists as a decorator of its super's same-name method - and is often paired with spread:
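
A sketch of that decorator-method pattern (hypothetical classes): rest collects the caller's arguments, spread forwards them to super untouched.

```javascript
class Logger {
  log(...parts) { return parts.join(' '); }
}

class TimestampedLogger extends Logger {
  // decorate the parent's same-name method without caring about its arity
  log(...parts) { return super.log('[ts]', ...parts); }
}

console.log(new TimestampedLogger().log('hello', 'world')); // '[ts] hello world'
```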

In any situation where one would have addressed a member by a predetermined index, destructured assignment proves to be more readable and direct. For regex pattern matching with multiple match groups, it's invaluable. Even for simple matches, I think it's clearer:
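
For instance (the pattern and names are mine), match groups can land straight in named bindings:

```javascript
// index 0 is the full match; the leading hole in the pattern skips it
const [ , year, month, day ] = /(\d{4})-(\d{2})-(\d{2})/.exec('2015-06-15');

console.log(year, month, day); // '2015' '06' '15'
```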

Functions with ‘multiple' return values were a pattern previously reserved for cases where there's no alternative. The function might return an object where the ‘main' result was one property and there were one or more properties with important metadata or something. You avoided it because it meant the caller would need to pick off bits of the result to use - extra steps. Destructuring makes this sort of return value so natural though that old reservations begin to fall away. It even lends itself to working with (untyped, but) ‘tuple-like' values.
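
A sketch of such a ‘tuple-like' return (my own example): the caller unpacks it in one step rather than picking properties off a result object.

```javascript
// return both the parsed value and a flag describing it
function toInteger(str) {
  const value = Number(str);
  return [ value, Number.isInteger(value) ];
}

const [ value, ok ] = toInteger('42');
console.log(value, ok); // 42 true
```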

One of the most important mechanisms for async control flow is Promise.all(), which accepts an array of promises (or non-promise values, which can be useful in cases where you don't know which values will be promises in advance). Its then() passes the matching array of resolved values to its callback. This is another key situation that demands destructuring of arguments for your sanity:
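
For example (the functions here are stand-ins for real async work):

```javascript
const getName = () => Promise.resolve('Ada');
const getAge  = () => Promise.resolve(36);

Promise.all([ getName(), getAge(), 'not-a-promise' ])
  .then(([ name, age, extra ]) => {
    // each resolved value lands in its own named binding - no results[0] juggling
    console.log(name, age, extra); // 'Ada' 36 'not-a-promise'
  });
```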

Iteration

If ES6 could be said to have a theme, it might be ‘iteration.' It also might be ‘expose everything' (see Proxy and Reflect). We're being given the tools to work with low level behaviors - nothing is magic anymore. In the case of iteration, this is achieved with the property Symbol.iterator.

Perhaps you want to subclass Array to create a Stack structure. It should probably iterate from last to first:
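
A sketch of that Stack (subclassing Array natively here; the transpiler caveat is discussed next):

```javascript
class Stack extends Array {
  // custom iteration order: top of the stack (last pushed) first
  *[Symbol.iterator]() {
    for (let i = this.length - 1; i >= 0; i--) yield this[i];
  }
}

const stack = new Stack();
stack.push('a', 'b', 'c');

console.log([ ...stack ]); // [ 'c', 'b', 'a' ]
```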

Note that truly subclassing Array remains impossible; it's not something that a transpiler can completely emulate or polyfill. Methods will work, but things will go weird if you assign directly to indices and you'll need to provide an explicit toString. Honestly I'm giving you a terrible example here.

Generators are special functions that return an ‘iterable' (like the method above). As with Promises, iterables need only conform to a particular pattern; you can make up your own and use them anywhere an ‘iterable' is expected. MDN has good coverage of this.
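
A minimal generator for illustration (mine, not MDN's): each yield supplies one value of the resulting iterable.

```javascript
function* countdown(n) {
  while (n > 0) yield n--;
}

console.log([ ...countdown(3) ]); // [ 3, 2, 1 ]

for (const x of countdown(2)) console.log(x); // 2, then 1
```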

Generators serve purposes other than iteration. The most significant trend in generator use is to wrap them with libraries like co to create Promise-driven coroutines. That's a lot of words. Well, it's a big thing on node right now, but rather than address it in that form, we'll address async and await, the formal ES7 proposal for adding this type of functionality at the language level, below.

There are two ways one will find themselves using iterables frequently: for (const x of iterable) {} and [ ...iterable ]. The latter effectively casts any iterable to an array, so you wouldn't want to use it with an infinitely yielding generator.

It really wasn't that long ago that we first got forEach() and the other Array.prototype iteration methods. It's still common to see classic C-style for ;; loops in places where something else would make more sense. When comparing for of loops with the forEach method, I think it usually comes down to a question of code reusability. It makes more sense to use forEach when a named function is involved; but if it would have been a lambda, I'd favor the statement. In particular consider that a return in forEach is equivalent to continue in a loop, but forEach has no equivalent for break; to achieve its effect, you'd need to use every or some in ‘off-label' ways.

Also note an obscure but potentially confusing gotcha: for of iteration includes the undefined indices in a sparse array, while forEach and other Array.prototype methods do not.

ES7

ES6 - or ES2015 as it's now called - is just the first wave (and almost certainly the largest) of what are to be incremental, perhaps annual, updates to EcmaScript. ES[YEAR] is to be a sort of rolling target, if I understand correctly, which is a way of acknowledging the reality of how engines end up implementing the new standards incrementally themselves. The updates have deadlines, but they will occur frequently enough that there's no pressure to finalize any specification that hasn't gotten the requisite level of fussing-over - the fussing that keeps our language (hopefully) clear of cruft and new wats - because getting it right just means waiting a year, not five.

Babel has come to fulfill a secondary role as a kind of live testing ground for tentative language changes taken from the strawman specs. Although these exist in varying degrees of maturity, and one cannot expect them to necessarily enter the language in their current forms (or at all), they're worth experimenting with. Some are little no-brainers (the exponentiation operator), some are more elaborate and iffy. There are two I want to address: the first because it may as well be in ES6, as far as Babel users are concerned; and the second because I think it frames an interesting debate well.

Holy Grail: Async / Await

The async/await spec has been around a while; it was a contender for ES6. It enables a sort of async holy grail - this is to Javascript what flexbox was to CSS. It may be an experimental feature, technically, but once you've activated it there's no going back.

Though async functions are based on generators, and the syntax mirrors that, they're more fundamentally wrapping Promise. Where generators return iterators, async functions return promises.

Promises are great - sometimes - but even now that we have the One True Promise to work with across the board, it can be a little tough to shake the sense that we've only traded one set of problems for another. We can throw in promises, but .catch() is not catch. And callback-heavy code can be nearly as awkward when rewritten with promises, which, after all, still essentially take callbacks. These are the things which async aims to address.

It's particularly fascinating in the browser. The following example isn't, say, IE8 safe, but it can be made to be pretty easily; I just want to keep the premise clear:
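
Since the original embed is gone, here's a sketch of the shape such code takes (fetchTitle is a stand-in for any promise-returning request wrapper):

```javascript
// stand-in for an XHR/fetch wrapper that returns a promise
const fetchTitle = url => Promise.resolve(`Title of ${ url }`);

async function showTitle(url) {
  try {
    // reads like synchronous code; execution pauses here without blocking
    const title = await fetchTitle(url);
    return title.toUpperCase();
  } catch (err) {
    // a rejected promise surfaces as an ordinary thrown error
    return `failed: ${ err.message }`;
  }
}

showTitle('/about').then(t => console.log(t)); // 'TITLE OF /ABOUT'
```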

When you await a value, if the value is a promise, there's a yield behind the scenes. When execution resumes, the return on that hidden yield will be the value from the promise's resolution. Or, if the promise was rejected, it actually throws.

I believe that client-side use of Babel is inevitably going to increase, and it will be bringing async/await with it. And since await accepts ‘promise-like' (thenable) objects, async/await is already compatible with any libraries that return promises for asynchronous operations, like jQuery and Angular.

The Binding Operator's Questions

One of the more interesting candidates for ES7 - also already implemented as an optional feature in Babel - is the binding operator. Like async/await, the binding operator had been under discussion for ES6 but it wasn't ready; there are still uncertain details. To its credit, it has a sweet, unambiguous symbol that doesn't reek of grawlix: ::. These are hard to come by.

I'm not sure what you'd call its action in technical terms - Googling leads me to multimethods, dynamic dispatch, or late binding. The latter two are probably not at all accurate in a JS context, where all methods are late-bound because of the nature of the prototype chain, and dynamic dispatch is an inapplicable concept because of how JS properties work. ‘Multimethod' makes a little more sense, perhaps, but also has a bunch of inapplicable classical OO baggage.

In other words, it's call(), re-arranged to allow a method-call-like syntax.
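
Since :: itself won't parse without the Babel transform, here's that equivalence spelled out with call() (identifiers mine):

```javascript
const { slice } = Array.prototype;

function rest() {
  // with the proposal enabled, this line could read: arguments::slice(1)
  return slice.call(arguments, 1);
}

console.log(rest('skip', 'a', 'b')); // [ 'a', 'b' ]
```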

It can be used another way, too. In promise.then(::object.method), the argument passed to then is equivalent to object.method.bind(object). Grammar folks may note this presents a unique case - a sort of prefixed binary operator, except its operands are a sequence that would normally resolve to a single value. I suspect this might have something to do with why the spec is still up in the air.

The utility is pretty obvious - especially when dealing with array-like objects as in the first example - but the binding operator still falls neatly in the experimental field. That isn't to say it's a bad idea to use it, but it hasn't seen anything like the ravenous attention async has garnered. There are not yet any idiomatic uses associated with it, except perhaps its use as a way to get DOM perversions to behave like they already should.

In that case, why mention it? Because it has ... philosophical implications. Jav- er, EcmaScript has always been a multi-paradigm language. I don't know if that began by accident or by design, but now it's considered a cornerstone of its identity. Recent years have seen a rise of ardent functionalism in JS (and elsewhere) that's often fascinating and alluring, dragging us a bit further away from JS's roots as a sort of bootleg object oriented ish grab bag.

The introduction of class syntax has bolstered the case for - or at least the simplicity of implementing - software that follows a model that's more or less object oriented. There was some resistance to this, and I suspect in some ways it came from the aforementioned group feeling that their work at converting dunderheads is already hard enough. (Other objections were that it could make the workings of prototypal inheritance murkier, and concern about the whole new world of things Java devs might end up doing when they touch JS: ‘ah, class - it's about time!').

The bind operator fits into this ongoing tug of war about what JS ought to move towards because it can be seen as ‘anti-functional.' It places emphasis on this and invites us to create whole libraries of plug-and-play methods for use on objects and values without modifying built-ins, while taking advantage of coercion or duck typing. Contrast this with the equally valid functional approach that would prefer to see those objects and values as arguments subservient to the almighty function.

const seconds = function() { return this * 1000; };

3::seconds();   // 3000
'3'::seconds(); // 3000

All other concerns aside, they do scan nicely. If one were dedicated enough to the premise, it's a short jump to using these free floating methods anywhere that a given function could be said to have a core argument that would make sense as this. Preexisting functions that fit the bill can be converted easily:
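
For instance (a hypothetical conversion, runnable today via call()):

```javascript
// a plain function whose 'core argument' is the string itself...
const capitalize = str => str[0].toUpperCase() + str.slice(1);

// ...recast as a free-floating this-based method
function capitalizeThis() {
  return this[0].toUpperCase() + this.slice(1);
}

// with the proposal enabled: 'cat'::capitalizeThis()
console.log(capitalize('cat'));          // 'Cat'
console.log(capitalizeThis.call('cat')); // 'Cat'
```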

So it could get out of hand, but I think it'll be alright. At this point, on the back end at least, functional techniques have become idiomatic JS themselves. Avoiding mutation and side effects, thinking in terms of higher order functions, and taking joy in writing small, abstract and single-minded components are all recognized as ‘good.' This is only a tiny portion of what ‘functional' might mean, though. Where's the rest? Perhaps it's still a matter of time, but it's just as likely that this is a case of plundering the parts we can use ... and ignoring the parts that we believe we already have superior - or at least, equally adequate - solutions for.

One of the coolest things about JS is how freely you can mix paradigms without creating discord. Even lodash/underscore, the warhorse of functional programming in JS, is really a hybrid creature - compare it to Ramda and that'll be clear. Multi-paradigm is our paradigm. It has its own flavor. Since ES6 has seen us make peace with this, the pendulum may swing back a little towards something more OO, but ultimately I expect the popular writing style will continue walking a line right down the center.

Using ES6 Now

Using Babel with node or io.js is pretty straightforward. You'll want your /src to be in .npmignore and your /lib (or whatever) to be in .gitignore. You can use package.json script hooks to make it build using the Babel CLI, or you can use a build tool or task runner. Personally, I usually use Gobble and tie it in at the "test" script, something like gobble build lib --force && node test/test.js.

There are several options for polyfilling. Babel is only directly responsible for transpiling; concerns like making sure Symbol exists fall on CoreJS, and generator / async support falls on Facebook's regenerator. You can include the "runtime" transform to get both. Depending on what you've written, you may be able to leave regenerator out.

I always transpile with sourcemaps. It's pretty critical if you want to debug or test without completely losing your mind. At your entry point, you might have something like this before any other code:

import 'babel/polyfill';
import 'source-map-support/register';

That will transform stack trace output so it shows the error position in the original code. It works, which seems like amazing spooky magic to me.

I mentioned earlier that I thought client-side use of Babeled code would increase. But that means including the polyfill (CoreJS and regenerator) which is quite large. The tradeoff between size and utility is still something that needs to be considered case-by-case. That said, I was able to get a Browserify bundle of Babel-transpiled code with CoreJS and the regenerator runtime down to 47kB after mangling and - this is important because of the incredible number of modules in CommonJS - converting all require paths to numeric identifiers using bundle-collapser. And the result? ES6 - ES7, even - works in IE8. Eight.

Here's the build script that got me there. In this case, I include the polyfill by importing it at the entry point (import 'babel/polyfill';); when building for node it will probably make more sense to polyfill with the ‘runtime' option. Using loose mode and dead code removal options help, but you should check out the extra caveats that using these options may entail before using them.

If you're looking to learn more about ES6, in addition to the links at the start of this article, I should note that 2ality's Axel Rauschmayer is about to publish the first comprehensive book dedicated to ES6. Given the quality of the material on his site, it seems like a good bet.

If you're working in ES6, you'll probably want an appropriate syntax definition in your editor so highlighting doesn't turn into a mess with the new syntax. For .tmLanguage, there's Babel-Sublime and JSNext. That format is supported by many editors, including Sublime. On the off chance that you're a Sublime Text 3 user who keeps up to date with the dev-channel releases, you can also use .sublime-syntax definitions, in which case you might want to check out my own ES6+ sublime-syntax def (available via Package Control as "Ecmascript Syntax").



In his session at @ThingsExpo, Greg Gorman is the Director, IoT Developer Ecosystem, Watson IoT, will provide a short tutorial on Node-RED, a Node.js-based programming tool for wiring together hardware devices, APIs and online services in new and interesting ways. It provides a browser-based editor that makes it easy to wire together flows using a wide range of nodes in the palette that can be deployed to its runtime in a single-click.
There is a large library of contributed nodes that help so...

SYS-CON Events announced today that Daiya Industry will exhibit at the Japan External Trade Organization (JETRO) Pavilion at SYS-CON's 21st International Cloud Expo®, which will take place on Oct 31 – Nov 2, 2017, at the Santa Clara Convention Center in Santa Clara, CA.
Daiya Industry specializes in orthotic support systems and assistive devices with pneumatic artificial muscles in order to contribute to an extended healthy life expectancy.
For more information, please visit https://www.daiyak...

What is the best strategy for selecting the right offshore company for your business?
In his session at 21st Cloud Expo, Alan Winters, U.S. Head of Business Development at MobiDev, will discuss the things to look for - positive and negative - in evaluating your options. He will also discuss how to maximize productivity with your offshore developers.
Before you start your search, clearly understand your business needs and how that impacts software choices.

SYS-CON Events announced today that SIGMA Corporation will exhibit at the Japan External Trade Organization (JETRO) Pavilion at SYS-CON's 21st International Cloud Expo®, which will take place on Oct 31 – Nov 2, 2017, at the Santa Clara Convention Center in Santa Clara, CA.
uLaser flow inspection device from the Japanese top share to Global Standard! Then, make the best use of data to flip to next page. For more information, visit http://www.sigma-k.co.jp/en/.

SYS-CON Events announced today that Yuasa System will exhibit at the Japan External Trade Organization (JETRO) Pavilion at SYS-CON's 21st International Cloud Expo®, which will take place on Oct 31 – Nov 2, 2017, at the Santa Clara Convention Center in Santa Clara, CA.
Yuasa System is introducing a multi-purpose endurance testing system for flexible displays, OLED devices, flexible substrates, flat cables, and films in smartphones, wearables, automobiles, and healthcare.

SYS-CON Events announced today that B2Cloud will exhibit at SYS-CON's 21st International Cloud Expo®, which will take place on Oct 31 – Nov 2, 2017, at the Santa Clara Convention Center in Santa Clara, CA.
B2Cloud specializes in IoT devices for preventive and predictive maintenance in any kind of equipment retrieving data like Energy consumption, working time, temperature, humidity, pressure, etc.

SYS-CON Events announced today that NetApp has been named “Bronze Sponsor” of SYS-CON's 21st International Cloud Expo®, which will take place on Oct 31 – Nov 2, 2017, at the Santa Clara Convention Center in Santa Clara, CA.
NetApp is the data authority for hybrid cloud. NetApp provides a full range of hybrid cloud data services that simplify management of applications and data across cloud and on-premises environments to accelerate digital transformation. Together with their partners, NetApp em...

SYS-CON Events announced today that Ryobi Systems will exhibit at the Japan External Trade Organization (JETRO) Pavilion at SYS-CON's 21st International Cloud Expo®, which will take place on Oct 31 – Nov 2, 2017, at the Santa Clara Convention Center in Santa Clara, CA.
Ryobi Systems Co., Ltd., as an information service company, specialized in business support for local governments and medical industry. We are challenging to achive the precision farming with AI. For more information, visit http:...

SYS-CON Events announced today that mruby Forum will exhibit at the Japan External Trade Organization (JETRO) Pavilion at SYS-CON's 21st International Cloud Expo®, which will take place on Oct 31 – Nov 2, 2017, at the Santa Clara Convention Center in Santa Clara, CA.
mruby is the lightweight implementation of the Ruby language. We introduce mruby and the mruby IoT framework that enhances development productivity. For more information, visit http://forum.mruby.org/.

SYS-CON Events announced today that Enroute Lab will exhibit at the Japan External Trade Organization (JETRO) Pavilion at SYS-CON's 21st International Cloud Expo®, which will take place on Oct 31 – Nov 2, 2017, at the Santa Clara Convention Center in Santa Clara, CA.
Enroute Lab is an industrial design, research and development company of unmanned robotic vehicle system. For more information, please visit http://elab.co.jp/.

Real IoT production deployments running at scale are collecting sensor data from hundreds / thousands / millions of devices. The goal is to take business-critical actions on the real-time data and find insights from stored datasets.
In his session at @ThingsExpo, John Walicki, Watson IoT Developer Advocate at IBM Cloud, will provide a fast-paced developer journey that follows the IoT sensor data from generation, to edge gateway, to edge analytics, to encryption, to the IBM Bluemix cloud, to Wa...

SYS-CON Events announced today that Suzuki Inc. will exhibit at the Japan External Trade Organization (JETRO) Pavilion at SYS-CON's 21st International Cloud Expo®, which will take place on Oct 31 – Nov 2, 2017, at the Santa Clara Convention Center in Santa Clara, CA.
Suzuki Inc. is a semiconductor-related business, including sales of consuming parts, parts repair, and maintenance for semiconductor manufacturing machines, etc. It is also a health care business providing experimental research for...

Many companies start their journey to the cloud in the DevOps environment, where software engineers want self-service access to the custom tools and frameworks they need. Machine learning technology can help IT departments keep up with these demands.
In his session at 21st Cloud Expo, Ajay Gulati, Co-Founder, CTO and Board Member at ZeroStack, will discuss the use of machine learning for automating provisioning of DevOps resources, taking the burden off IT teams.

DevSecOps – a trend around transformation in process, people and technology – is about breaking down silos and waste along the software development lifecycle and using agile methodologies, automation and insights to help get apps to market faster. This leads to higher quality apps, greater trust in organizations, less organizational friction, and ultimately a five-star customer experience.
These apps are the new competitive currency in this digital economy and they’re powered by data. Without ...

Containers are the future of web development, in large part thanks to Docker’s explosive growth. According to DataDog, 15 percent of hosts run Docker, which is significantly up from the 6 percent of hosts running it at this point in 2015. LinkedIn has also seen a 160 percent increase in profile references to Docker in just the past year alone, indicating Docker has become a much bigger priority for IT professionals looking for work. With this technology primed to continue its exponential growth ...

Is advanced scheduling in Kubernetes achievable?
Yes, however, how do you properly accommodate every real-life scenario that a Kubernetes user might encounter?
How do you leverage advanced scheduling techniques to shape and describe each scenario in easy-to-use rules and configurations?
In his session at @DevOpsSummit at 21st Cloud Expo, Oleg Chunikhin, CTO at Kublr, will answer these questions and demonstrate techniques for implementing advanced scheduling. For example, using spot instances ...

The nature of the technology business is forward-thinking. It focuses on the future and what’s coming next. Innovations and creativity in our world of software development strive to improve the status quo and increase customer satisfaction through speed and increased connectivity.
Yet, while it's exciting to see enterprises embrace new ways of thinking and advance their processes with cutting edge technology, it rarely happens rapidly or even simultaneously across all industries.

With the modern notion of digital transformation, enterprises are chipping away at the fundamental organizational and operational structures that have been with us since the nineteenth century or earlier.
One remarkable casualty: the business process. Business processes have become so ingrained in how we envision large organizations operating and the roles people play within them that relegating them to the scrap heap is almost unimaginable, and unquestionably transformative.
In the Digital ...

These days, APIs have become an integral part of the digital transformation journey for all enterprises. Every digital innovation story is connected to APIs . But have you ever pondered over to know what are the source of these APIs? Let me explain - APIs sources can be varied, internal or external, solving different purposes, but mostly categorized into the following two categories. Data lakes is a term used to represent disconnected but relevant data that are used by various business units wit...

Today most companies are adopting or evaluating container technology - Docker in particular - to speed up application deployment, drive down cost, ease management and make application delivery more flexible overall.
As with most new architectures, this dream takes significant work to become a reality. Even when you do get your application componentized enough and packaged properly, there are still challenges for DevOps teams to making the shift to continuous delivery and achieving that reducti...

‘Trend’ is a pretty common business term, but its definition tends to vary by industry. In performance monitoring, trend, or trend shift, is a key metric that is used to indicate change.
Change is inevitable. Today’s websites must frequently update and change to keep up with competition and attract new users, but such changes can have a negative impact on the user experience if not managed properly. The dynamic nature of the Internet makes it necessary to constantly monitor different metrics. O...

Agile has finally jumped the technology shark, expanding outside the software world. Enterprises are now increasingly adopting Agile practices across their organizations in order to successfully navigate the disruptive waters that threaten to drown them. In our quest for establishing change as a core competency in our organizations, this business-centric notion of Agile is an essential component of Agile Digital Transformation.
In the years since the publication of the Agile Manifesto, the conn...

Many organizations are now looking to DevOps maturity models to gauge their DevOps adoption and compare their maturity to their peers. However, as enterprise organizations rush to adopt DevOps, moving past experimentation to embrace it at scale, they are in danger of falling into the trap that they have fallen into time and time again.
Unfortunately, we've seen this movie before, and we know how it ends: badly.

There is a huge demand for responsive, real-time mobile and web experiences, but current architectural patterns do not easily accommodate applications that respond to events in real time. Common solutions using message queues or HTTP long-polling quickly lead to resiliency, scalability and development velocity challenges. In his session at 21st Cloud Expo, Ryland Degnan, a Senior Software Engineer on the Netflix Edge Platform team, will discuss how by leveraging a reactive stream-based protocol,...

Not very long ago, in my IT consulting career, I used to be responsible for the launch of mission-critical applications that help enterprises leap into the cutting edge of the digital business revolution. There were a lot of hard skills required for leading such a mission that involved getting the system architecture and software design right early, mentoring and managing the engineering resources, and tracking the progress to the satisfaction of the business analysts who put together the requir...

DevOps at Cloud Expo, taking place October 31 - November 2, 2017, at the Santa Clara Convention Center in Santa Clara, CA, is co-located with 21st Cloud Expo and will feature technical sessions from a rock star conference faculty and the leading industry players in the world.
The widespread success of cloud computing is driving the DevOps revolution in enterprise IT. Now as never before, development teams must communicate and collaborate in a dynamic, 24/7/365 environment. There is no time to w...

New competitors, disruptive technologies, and growing expectations are pushing every business to both adopt and deliver new digital services. This ‘Digital Transformation’ demands rapid delivery and continuous iteration of new competitive services via multiple channels, which in turn demands new service delivery techniques – including DevOps. In this power panel at @DevOpsSummit 20th Cloud Expo, moderated by DevOps Conference Co-Chair Andi Mann, panelists examined how DevOps helps to meet the de...

Cloud Expo, Inc. has announced today that Andi Mann and Aruna Ravichandran have been named Co-Chairs of @DevOpsSummit at Cloud Expo Silicon Valley which will take place Oct. 31-Nov. 2, 2017, at the Santa Clara Convention Center in Santa Clara, CA. "DevOps is at the intersection of technology and business-optimizing tools, organizations and processes to bring measurable improvements in productivity and profitability," said Aruna Ravichandran, vice president, DevOps product and solutions marketing...

The convergence of rapid feature development, automation, continuous delivery, and the shifting makeup of modern tech stacks has pushed monitoring requirements to a potentially overwhelming scale. But while the systems you need to monitor are complex, your monitoring strategy doesn’t have to be. The scale and pace of change involved in ops today dictate a carefully crafted monitoring and incident response strategy. Keeping the strategy simple will take some of the pain out of monitoring.

SYS-CON Events announced today that DXWorldExpo has been named “Global Sponsor” of SYS-CON's 21st International Cloud Expo, which will take place on Oct 31 – Nov 2, 2017, at the Santa Clara Convention Center in Santa Clara, CA. Digital Transformation is the key issue driving the global enterprise IT business. Digital Transformation is most prominent among Global 2000 enterprises and government institutions.

One of the biggest challenges with adopting a DevOps mentality is: new applications are easily adapted to cloud-native, microservice-based, or containerized architectures - they can be built for them - but old applications need complex refactoring. On the other hand, these new technologies can require relearning or adapting new, oftentimes more complex, methodologies and tools to be ready for production.
In his general session at @DevOpsSummit at 20th Cloud Expo, Chris Brown, Solutions Marketi...

Cloud computing budgets worldwide are reaching into the hundreds of billions of dollars, and no organization can survive long without some sort of cloud migration strategy. Each month brings new announcements, use cases, and success stories.