COMMUNITY

JSON parser has issues

haxe-hl

(Shalmu) #1

While experimenting with loading JSON in HashLink and Neko, I noticed that the standard “JsonParser.parse” doesn’t work out of the box. I had to write my own “wrapper” that uses this function inside, like the one below. Should we make this built in, or leave it as it is, so that people write workarounds like I did? It’s like the JavaScript situation, where “btoa” doesn’t really work correctly and we have to write a wrapper around it…

Forgot to mention that I read the file with “File.getContent” in the neko/php/hl environments. However, this wrapper is unnecessary if the file is not UTF-encoded.

Here is the code:

import haxe.format.JsonParser;

class JaySon {
	public static function parse(cont:String, filename = ''):Dynamic {
		// `is` is an operator keyword in Haxe 4, so the helper is named `at` here:
		function at(n:Int, code:Int) return cont.charCodeAt(n) == code;

		return JsonParser.parse(cont.substr(
			if (at(0, 65279))
				/// looks like the HL target: the BOM decoded to a single
				/// U+FEFF character, so skip only the first character:
				1
			else if (at(0, 239) && at(1, 187) && at(2, 191))
				/// seems to be Neko or PHP: the raw BOM bytes EF BB BF
				/// are in the string, so start from position 3:
				3
			else
				/// all other targets, which prepare the UTF string correctly:
				0
		));
	}
}

(Valentin Lemière) #2

How do you get your string?
Does it work if you include your json directly, like JsonParser.parse('{ "i": 7 }'); ?


(Shalmu) #3

I read it with “File.getContent” in neko/php/hl environments.
Yes, the standard parser does work with strings generated on the fly, as well as with files read from the filesystem that are not UTF-encoded.


(Valentin Lemière) #4

Which UTF? UTF-8, I assume. Does it have a BOM or not?


(Shalmu) #5

Yes, UTF-8, and yes, it has a BOM, which I actually try to detect and skip. The same BOM appears differently in HL (as a single character, code 65279) and on the other sys targets, Neko and PHP (as 3 bytes: 239, 187, 191).
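Those two representations are the same three bytes, EF BB BF, seen before and after UTF-8 decoding. A minimal sketch of the difference, in Python rather than Haxe purely because it makes the byte-level behavior easy to check (the file contents here are a made-up example, not from the thread):

```python
import json

# A file starting with a UTF-8 BOM, followed by some JSON.
data = b"\xef\xbb\xbf" + b'{"i": 7}'

# Read as raw bytes (roughly what Neko/PHP see): three chars, 239 187 191.
assert list(data[:3]) == [239, 187, 191]

# Decoded as UTF-8 (roughly what HL sees): one char, U+FEFF = 65279.
decoded = data.decode("utf-8")
assert ord(decoded[0]) == 65279

# Stripping the BOM before parsing makes the JSON parser happy.
assert json.loads(decoded[1:]) == {"i": 7}
```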


(Valentin Lemière) #6

Sounds like a bug with File.getContent; the BOM shouldn’t be part of the string.

You should open an issue on github.


(Shalmu) #7

Done.