Created October 19, 2011 16:22
Gist: jamis/1298818
Compute how many bytes you'd need to represent a JavaScript string in UTF-8. This is great if you're doing naughty things like forcing UTF-8 encoded text into Latin-1 database columns, and want to avoid silently truncating text.
/* Returns the number of bytes needed to represent the given
 * JavaScript string in UTF-8. */
String.prototype.byteSize = function () {
  var bytes = 0;
  for (var i = 0; i < this.length; i++) {
    var charCode = this.charCodeAt(i);
    if (charCode <= 0x7F) {
      bytes += 1;
    } else if (charCode <= 0x7FF) {
      bytes += 2;
    } else if (charCode >= 0xD800 && charCode <= 0xDBFF &&
               i + 1 < this.length &&
               this.charCodeAt(i + 1) >= 0xDC00 &&
               this.charCodeAt(i + 1) <= 0xDFFF) {
      /* charCodeAt returns UTF-16 code units, so code points above
       * U+FFFF appear as a surrogate pair. Count the pair as one
       * 4-byte character and skip the low surrogate. */
      bytes += 4;
      i++;
    } else {
      bytes += 3;
    }
  }
  return bytes;
};
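In modern runtimes (which didn't exist when this gist was written), the same count can be cross-checked against `TextEncoder`, which produces the actual UTF-8 bytes. A minimal sketch, using a standalone helper rather than the prototype method above:

```javascript
/* Cross-check: TextEncoder encodes a string to UTF-8, so the length
 * of the resulting byte array is the exact UTF-8 byte size. */
function utf8ByteSize(s) {
  return new TextEncoder().encode(s).length;
}

console.log(utf8ByteSize("hello")); // 5 (ASCII: 1 byte each)
console.log(utf8ByteSize("héllo")); // 6 ("é", U+00E9, takes 2 bytes)
console.log(utf8ByteSize("€"));     // 3 (U+20AC takes 3 bytes)
console.log(utf8ByteSize("😀"));    // 4 (U+1F600, a surrogate pair in UTF-16)
```

The hand-rolled counter is still useful where `TextEncoder` is unavailable, but when it is available, encoding and measuring is the simpler and safer route.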