Requests for comment/More compact JSON API output

Problem
Sample piece of action=parse output on zhwiki:  \u5973\u6027<\/b>\u662f\u6307\u96cc\u6027<\/a>\u7684\u4eba\u985e<\/a>\uff0c\u8207\u7537\u6027<\/a>\uff0c\u4e5f\u5c31\u662f\u96c4\u6027<\/a>\u4eba\u985e\u6210\u5c0d\u6bd4\u3002\u5973\u6027\u9019\u500b\u540d\u8a5e\u662f\u7528\u4f86\u8868\u793a\u751f\u7269\u5b78<\/a> 

And a list=allpages query on en: {	"pageid": 5878274, "ns": 0, "title": "!" }, {	"pageid": 3632887, "ns": 0, "title": "!!" }, {	"pageid": 600744, "ns": 0, "title": "!!!" }. We can see that even though this is valid JSON, it is bulky compared to what JavaScript allows it to be.

Proposed solution
I propose to add a new format, jsoncompact (or, for compactness, jsonc), whose output would be much shorter for some use cases, at the price of being valid JavaScript but not necessarily valid JSON. A few things that can be done:
 * Example 1, don't escape Unicode: {"foo":"\u043f\u0440\u0435\u0432\u0435\u0434"} → {"foo":"превед"} (24 bytes less).
 * Example 2, don't quote keys where possible: {"foo":"bar"} → {foo:"bar"} (2 bytes less).
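As a rough sketch of the first saving (not MediaWiki's actual serializer), Python's standard json module shows the same effect: json.dumps escapes non-ASCII characters by default (ensure_ascii=True), and turning that off shrinks the UTF-8 output exactly as in Example 1. The byte counts below assume UTF-8 transport encoding.

```python
import json

data = {"foo": "превед"}

# Default: every non-ASCII character becomes a 6-byte \uXXXX escape.
escaped = json.dumps(data, separators=(",", ":"))

# ensure_ascii=False emits the raw characters (2 bytes each here in UTF-8).
compact = json.dumps(data, ensure_ascii=False, separators=(",", ":"))

print(len(escaped.encode("utf-8")))  # 46
print(len(compact.encode("utf-8")))  # 22 — 24 bytes less, as in Example 1
```

Note that json.dumps cannot produce the unquoted keys of Example 2; that output is no longer JSON, which is exactly the trade-off the proposal describes.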

The second example is much less widely tolerated by parsers, so I'm not proposing to use it, though even that change could save 3000 bytes.