I have the following function to find and highlight hashtags or mentions (# or @) in a UILabel:
class func addLinkAttribute(pattern: String,
    toText text: String,
    withAttributeName attributeName: String,
    toAttributedString attributedString: NSMutableAttributedString,
    withLinkAttributes linkAttributes: [NSObject: AnyObject]) {
    var error: NSError?
    if let regex = NSRegularExpression(pattern: pattern, options: .CaseInsensitive, error: &error) {
        regex.enumerateMatchesInString(text, options: .allZeros, range: NSMakeRange(0, count(text))) { result, flags, stop in
            // NSRange of the match as reported by NSRegularExpression
            let range = result.range
            // Convert the NSRange into String.Index values to pull out the matched text
            let start = advance(text.startIndex, range.location)
            let end = advance(start, range.length)
            let foundText = text.substringWithRange(Range<String.Index>(start: start, end: end))
            // Store the matched text under the given attribute name, alongside the link attributes
            var linkAttributesWithName = linkAttributes
            linkAttributesWithName[attributeName] = foundText
            attributedString.addAttributes(linkAttributesWithName, range: range)
        }
    }
}
If I pass a hashtag pattern (#)(\\w+) or a mention pattern (@)(\\w+), the code works perfectly, but if the text contains an emoji the range is offset by the number of emoji preceding it.
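For example, calling it like this (made-up text and attributes; MyHelper is a stand-in for whatever class declares the method):

import UIKit

let text = "😀😀 #swift is great"
let attributedString = NSMutableAttributedString(string: text)
MyHelper.addLinkAttribute("(#)(\\w+)",
    toText: text,
    withAttributeName: "HashtagAttribute",
    toAttributedString: attributedString,
    withLinkAttributes: [NSForegroundColorAttributeName: UIColor.blueColor()])
// foundText comes back as "wift i" instead of "#swift": each 😀 is one
// Swift Character but two UTF-16 code units, so the String.Index
// conversion overshoots by one Character per preceding emoji.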

I know Swift treats strings differently from Objective-C, since count(string) and count(string.utf16) give me different results, but I am stumped as to how to account for this when using a regular expression.
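For instance, with a made-up string (each 😀 is a single Character but a UTF-16 surrogate pair):

let text = "😀😀 #swift"
println(count(text))        // 9:  Swift Characters (grapheme clusters)
println(count(text.utf16))  // 11: UTF-16 code units, two per 😀
// NSRegularExpression computes its ranges over the UTF-16 view, so it
// reports "#swift" at location 5, while the "#" is only 3 Characters in.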
I could just check the difference between the two counts and offset the range (a sketch of what I mean is below), but this seems wrong and hacky to me. There must be another way.
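What I have in mind is roughly this, inside the enumerate closure (a sketch only; offset and shifted are names I am making up):

// Hypothetical workaround: shift the match range left by the total number
// of "extra" UTF-16 units contributed by surrogate pairs in the string.
let offset = count(text.utf16) - count(text)
let shifted = NSMakeRange(range.location - offset, range.length)
// Only correct when every emoji precedes the match; an emoji after
// (or inside) the match would make this global offset wrong.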