Created December 8, 2016 18:48
Swift String extension with a charAt(at: Int) method that returns a single character at a zero-based integer position
// While creating my first Swift application using the book "Cocoa Programming for OS X",
// I grumbled at what seemed like clumsy string handling, then hit a roadblock that nearly turned me off completely:
// the string-handling code that had worked with Swift 1.2 wouldn't work at all in Swift 3.0. When I figured out
// what did work in the latest version, I was even more annoyed at the verbosity of it. I could have left it alone
// and dismissed it as not worth any more of my time, but my curiosity got the best of me and I started looking at
// the language guide on Apple's Developer site. There I discovered something that totally reversed my thinking: extensions.
// I had also read why string handling has to be this way (flexibility, multiple Unicode encodings, and so on), but
// I learned that I could use extensions to add my own methods to the String type, giving me more convenient
// string handling for my own needs. So, after much long-windedness, here is my charAt method.
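// For comparison, a one-off character access with the standard API looks like
// this (my own illustration of the verbosity mentioned above, not part of the original gist):
//   let str = "hello"
//   let second = str[str.index(str.startIndex, offsetBy: 1)]   // "e"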
extension String {
    // charAt(at:) returns the character at an integer (zero-based) position.
    // Example:
    //   let str = "hello"
    //   let second = str.charAt(at: 1)   // -> "e"
    // Note: like Array subscripting, this traps at runtime if `at` is out of range.
    func charAt(at: Int) -> Character {
        let charIndex = self.index(self.startIndex, offsetBy: at)
        return self[charIndex]
    }
}
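
// Offsetting past endIndex traps at runtime, so a bounds-checked variant that
// returns an optional can be handy. A minimal sketch of my own follows; the
// name charAtSafe is hypothetical and not part of the original method.
extension String {
    // charAtSafe(at:) returns nil instead of trapping when `at` is out of range.
    func charAtSafe(at: Int) -> Character? {
        guard at >= 0,
              let charIndex = self.index(self.startIndex, offsetBy: at, limitedBy: self.endIndex),
              charIndex < self.endIndex
        else { return nil }
        return self[charIndex]
    }
}

// Usage:
//   "hello".charAt(at: 1)       // "e"
//   "hello".charAtSafe(at: 99)  // nil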