The JS super (and transpilers) problem
In JavaScript there are 3 different ways to pass through a parent constructor when extending it:
- the good ol’ ES3 and ES5 way, via
function Constructor() { Parent.apply(this, arguments); }
- the procedural ES2015 way via
Reflect.construct(Parent, arguments, Constructor);
- the native ES2015 classes way via
class extends Parent { constructor(...args) { super(...args); } }
Each of them does something slightly different, not always compatible with the others, and those differences can easily get swallowed in a transpiled limbo.
The old way is not future proof
Even if you’ve read that ES6/2015 is backward compatible, there are cases where it’s impossible to use the old syntax to implement new features.
// basic ES2015 native extend
// the constructor is superfluous
// used in here to compare
// against old syntax
class MyList extends Array {
  constructor(...args) {
    super(...args);
  }
}
const ml = new MyList(1, 2, 3);
ml instanceof MyList; // true
ml instanceof Array; // true
Array.isArray(ml); // true
ml.toString(); // "1,2,3"
// ES5 pseudo-equivalent operation that will fail
function MyOldList() {
  // this doesn't do what we expect it to do
  Array.apply(this, arguments);
}
MyOldList.prototype = Object.create(
  Array.prototype,
  {constructor: {value: MyOldList}}
);

var mol = new MyOldList(1, 2, 3);
mol instanceof MyOldList; // true
mol instanceof Array;     // true
Array.isArray(mol);       // false
mol.toString();           // "" empty string
The reason the mol variable is empty is that Array cannot be natively extended in ECMAScript 5. To actually replicate the modern behavior 1:1, we need to override the default return value of the MyOldList constructor with an Array instance that gets upgraded.
This is basically the inverse of what modern engines do when extending natives, where the super() call is mandatory to even have a this context, or nothing will work!
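To see that constraint in action, here is a minimal sketch (the Before class is purely illustrative): touching this before super() in a native derived class throws right away.
// illustrative only: `this` does not exist before super()
class Before extends Array {
  constructor() {
    this.length = 0; // ReferenceError: must call super
                     // before accessing 'this'
    super();
  }
}
new Before();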
Accordingly, we need a workaround that simulates the instance upgrade once its parent class is invoked.
// patched native Array extend
function MyOldList() {
  // use the new Array instance instead
  var self = Array.apply(this, arguments);
  // upgrade its inheritance
  Object.setPrototypeOf(self, MyOldList.prototype);
  // overload the default returned context
  return self;
}
MyOldList.prototype = Object.create(
  Array.prototype,
  {constructor: {value: MyOldList}}
);
var mol = new MyOldList(1, 2, 3);
mol instanceof MyOldList; // true
mol instanceof Array; // true
Array.isArray(mol); // true
mol.toString(); // "1,2,3"
Not so easy though …
While Array and a few other “old friends“ can still be polyfilled, transpiled, and extended in the pre-classes JS era, there are constructors that don’t allow user-land code to invoke them through .call or .apply, even if the used context is theoretically valid.
// fails
XMLHttpRequest.call({});
// TypeError: Failed to construct 'XMLHttpRequest'
// Please use the 'new' operator,
// this DOM object constructor
// cannot be called as a function.
// fails too
new HTMLElement();
// TypeError: Illegal constructor
// fails as well
ArrayBuffer.call([]);
// TypeError: Constructor ArrayBuffer requires 'new'
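Note that the restriction is about how these constructors are invoked, not about their usefulness: with new, most of them are perfectly happy (HTMLElement being the notable exception shown above).
// the very same constructors are fine with `new`
new XMLHttpRequest(); // OK
new ArrayBuffer(8);   // OK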
The main issue with the previously mentioned workaround is that, if you cannot use .apply and your constructor accepts N arguments, as Array does, there’s no way to create a new instance with an unknown number of arguments.
Or better … there is one …
function GenericExtend() {
  var
    p = GenericExtend.prototype,
    upgrade = Object.setPrototypeOf,
    a = arguments
  ;
  switch (a.length) {
    case 0: return upgrade(new GenericParent(), p);
    case 1: return upgrade(new GenericParent(a[0]), p);
    case 2: return upgrade(new GenericParent(a[0], a[1]), p);
    case 3: return upgrade(new GenericParent(a[0], a[1], a[2]), p);
    // ... and so on ... to Infinity and Beyond!
  }
}
Even .bind based hacks might fail (for instance, wherever .bind itself is a polyfill built on top of .apply), so that the following is also not a universal solution:
// some smart developer might try the following
// but it will fail in many cases
function GenericExtend() {
  var C = GenericParent.bind.apply(
    GenericParent,
    [this].concat(
      Array.prototype.slice.call(arguments)
    )
  );
  return Object.setPrototypeOf(
    new C,
    GenericExtend.prototype
  );
}
Using Reflect.construct doesn’t solve the issue
To start with, Reflect and Reflect.construct are part of the modern specification. Being something new, once polyfilled and backported to old engines, Reflect.construct inevitably falls back to a weak .bind hack, or to .call and .apply invocations that will most likely throw right away.
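The following is a minimal sketch of what such a fallback typically looks like (simplified for illustration, not the code of any specific polyfill):
// illustrative fallback for a missing Reflect.construct:
// the intended newTarget (Class) is lost along the way,
// so constructors such as HTMLElement keep throwing
if (typeof Reflect !== 'object') {
  window.Reflect = {
    construct: function (Parent, args, Class) {
      var bound = Parent.bind.apply(
        Parent,
        [null].concat([].slice.call(args))
      );
      var self = new bound();
      Object.setPrototypeOf(self, Class.prototype);
      return self;
    }
  };
}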
// any class ...
class MyElement {}
// using CustomElements V1 API
customElements.define('my-element', MyElement);
// ... but right now the following code
// cannot be polyfilled!
const me = Reflect.construct(
  HTMLElement,
  [],
  MyElement
);
Once transpiled and digested by modern browsers, there’s no way you’ll have that behavior working natively.
In a few words, we are in a time-frame where modern engines don’t recognize transpiled code, causing unprecedented compatibility issues between your intent, as a modern developer, and your production code, whenever target browsers are not just the latest evergreen ones.
In cases like these constructors, it means that both polyfills and transpilers need to overwrite the globally available constructor in order to understand what’s going on, and to simulate modern semantics on top of constructors that cannot be invoked without new.
// a possible workaround for transpiled code
function GenericExtend() {'use strict';
  var self;
  if (typeof Reflect === 'object') {
    self = Reflect.construct(
      GenericParent,
      arguments,
      this.constructor
    );
  } else {
    // in case there's no context upgrade,
    // fall back to the updated `this`
    self = GenericParent.apply(
      this, arguments
    ) || this;
  }
  // in every case, return the instance
  return self;
}
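For completeness, here is a hypothetical wiring of GenericExtend to its parent, mirroring the earlier MyOldList setup, so that this.constructor points at the extending class:
// hypothetical setup around the workaround above
GenericExtend.prototype = Object.create(
  GenericParent.prototype,
  {constructor: {value: GenericExtend}}
);
var ge = new GenericExtend(1, 2, 3);
ge instanceof GenericExtend; // true
ge instanceof GenericParent; // true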
Classes and polyfill patterns
I hope it’s clear by now that the meaning of super(), the exact point where the current context gets defined by specs, cannot be equally represented by older syntax.
However, there is a way to backward simulate what a transpiler would do, following a similar workaround needed by older code.
class Child extends Parent {
  constructor(...args) {
    const self = super(...args);
    // other eventual operations
    // using `self` instead of `this`,
    // overriding the returned instance
    return self;
  }
}
The above pattern would most likely work with any transpiler available today and, best of all, it is fully compatible with the native ECMAScript standard.
Following are just a few examples:
// expected result
class MyList extends Array {
  constructor(...args) {
    return super(...args);
  }
}
const ml = new MyList(1, 2, 3);

// expected result
class MyElement extends HTMLElement {
  constructor(...args) {
    return super(...args);
  }
}
customElements.define('my-element', MyElement);
const me = new MyElement;
You can try the Array version of the extend directly in the Babel REPL, while unfortunately, if you try the second MyElement example, you’ll notice an error saying:
Super expression must either be null or a function, not object
which is not even accurate as an error, but it’s the reason this post exists.
Are polyfills doomed?
As a polyfill author, while some browser vendor developer might feel proud about the current status (please note this developer just erased his presence on Twitter, so nobody can document his point of view anymore), the only way I can think of to keep polyfills from being ruined by the current situation is, irony included, transpilers.
The same logic that is incompatible with the “current state of the JS transpiled art“ is most likely the only solution to this problem, and the reason is simple: none of the mentioned workarounds is enough, for the obvious reason that a defined constructor usually does much more than just a super() call.
Accordingly, either we all implement an .init method that takes care of the initialization a constructor would otherwise do, a hypothetical convention sketched right below, or we need transpilers to take over the initial class definition.
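A rough idea of that hypothetical .init convention, just to show the shape it could take (AweElement and awe-element are made-up names):
// hypothetical `.init` convention: the constructor stays
// empty and explicit initialization happens afterwards
class AweElement extends HTMLElement {
  init() {
    this.setAttribute('awe', 'some');
    return this;
  }
}
customElements.define('awe-element', AweElement);
const awe = new AweElement().init();
The transpiler route, on the other hand, would mean producing a different output for the very same class definition. How?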
// how developers define their class
class MyElement extends HTMLElement {
  constructor(...args) {
    super(...args);
    this.setAttribute('awe', 'some');
  }
}

// what transpilers should produce instead
var MyElement = (function (parent) {
  var upgrade = typeof Reflect === 'object' ?
    function () {
      return Reflect.construct(
        parent,
        arguments,
        this.constructor
      );
    } :
    function () {
      return parent.apply(this, arguments) || this;
    };
  function constructor() {
    this.setAttribute('awe', 'some');
  }
  function MyElement() {
    var self = upgrade.apply(this, arguments);
    return constructor.apply(self, arguments) || self;
  }
  MyElement.prototype = Object.create(
    parent.prototype,
    {
      constructor: {
        configurable: true,
        writable: true,
        value: MyElement
      }
    }
  );
  return MyElement;
}(HTMLElement));
// element definition
customElements.define('my-element', MyElement);
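As a quick, hypothetical sanity check of the produced output:
// hypothetical check of the transpiled MyElement
const me = document.createElement('my-element');
me.getAttribute('awe');    // "some"
me instanceof MyElement;   // true
me instanceof HTMLElement; // true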
With the above code, engines compatible with the latest ES2015 standard won’t have problems with patched parent constructors, when and if needed, while engines compatible with ES3 or ES5 can use patched natives, so that moving forward keeps being promoted and backward compatibility is preserved.
Please share this post so that library and polyfill authors can be on the same page: thank you!