I have the string "72101108108111", which represents the byte values of the original string "Hello".
How can I convert "72101108108111" back to the ASCII string "Hello" in Ruby?
 
    
Answering your question as clarified in the comments (which has nothing to do with the title): "I need to encode/decode a string to Base58".
EDIT: now as a class (using the base58 gem):
require 'base58'

class Base58ForStrings
  # Treat the string's bytes as one big base-256 integer and Base58-encode it.
  def self.encode(str)
    Base58.encode(str.bytes.inject { |a, b| a * 256 + b })
  end

  # Decode back to the big integer, then split it into bytes again.
  def self.decode(b58)
    b = []
    d = Base58.decode(b58)
    while d > 0
      d, m = d.divmod(256)
      b.unshift(m)
    end
    b.pack('C*').force_encoding('UTF-8')
  end
end
Base58ForStrings.encode('Hello こんにちは')
# => "5scGDXBpe3Vq7szFXzFcxHYovbD9c" 
Base58ForStrings.decode('5scGDXBpe3Vq7szFXzFcxHYovbD9c')
# => "Hello こんにちは"
Works for any UTF-8 string.
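If you prefer not to depend on the gem, here is a minimal sketch of the same big-integer idea done by hand. It assumes the Bitcoin/IPFS Base58 alphabet, so its output may differ from the gem's default alphabet, and like the class above it does not preserve leading zero bytes:
ALPHABET = '123456789ABCDEFGHJKLMNPQRSTUVWXYZabcdefghijkmnopqrstuvwxyz'.freeze

def base58_encode_string(str)
  # Pack the bytes into one big integer, then repeatedly divide by 58.
  n = str.bytes.inject(0) { |a, b| a * 256 + b }
  out = ''
  while n > 0
    n, r = n.divmod(58)
    out.prepend(ALPHABET[r])
  end
  out
end

def base58_decode_string(b58)
  # Rebuild the big integer from base-58 digits, then split it back into bytes.
  n = b58.chars.inject(0) { |a, c| a * 58 + ALPHABET.index(c) }
  bytes = []
  while n > 0
    n, m = n.divmod(256)
    bytes.unshift(m)
  end
  bytes.pack('C*').force_encoding('UTF-8')
end

base58_decode_string(base58_encode_string('Hello'))
# => "Hello"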
 
    
s = '72101108108111'
pattern = /^[a-zA-Z]/
index, res = 0, ''
while index < s.size
  len = 0
  # Grow the digit group until it decodes to a letter (A-Z or a-z).
  while (s[index..index + len].to_i.chr =~ pattern).nil?
    len += 1
  end
  res << s[index..index + len].to_i.chr
  index += len + 1
end
p res  # => "Hello"
Try this; the number of digits per character is not fixed.
For example, "72" -> 'H' and "101" -> 'e', but "23" -> '\x17'.
So whenever the decoded character is not in a-z or A-Z (e.g. '\x17'),
we extend the digit group and keep parsing until we get a character we want.
For this input the result is correctly "Hello".
It may misbehave in some cases, but it works for my tests.
Give it a try.
It only runs without exceptions when the original string contains only A-Z and a-z.
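As an alternative sketch, assuming every character code is printable ASCII (32-126): in that range three-digit codes (100-126) all start with 1 and two-digit codes (32-99) never do, so the grouping is unambiguous and a single regex scan can split the digits:
s = '72101108108111'

# Try the three-digit pattern first, then fall back to two digits.
decoded = s.scan(/1\d\d|\d\d/).map { |code| code.to_i.chr }.join
p decoded  # => "Hello"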
