A std::string is a sequence of chars; a BSTR is usually a Unicode UTF-16 wchar_t-based string, with a length prefix.
Even though a BSTR could be used as a simple way to marshal a raw byte array (being length-prefixed, it can store embedded NULs), and so it could in theory hold non-UTF-16 data as well, the usual, "natural" content of a BSTR is a Unicode UTF-16 wchar_t string.
So, the first point to clarify is what encoding the std::string uses (for example: Unicode UTF-8? Or some other code page?). Then you have to convert that string to Unicode UTF-16, and create a BSTR containing the UTF-16 string.
To convert from UTF-8 (or some other code page) to UTF-16, you can use the MultiByteToWideChar() function. If the source std::string contains a UTF-8 string, you can use the CP_UTF8 code page value with the aforementioned API.
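For example, a minimal conversion helper could look like the following sketch (it assumes the source is valid UTF-8 and omits most error handling; the Utf8ToUtf16 name is just illustrative):

#include <windows.h>   // MultiByteToWideChar
#include <string>

// Sketch: convert a UTF-8 encoded std::string to a UTF-16 std::wstring.
// Error handling is kept to a minimum for clarity.
std::wstring Utf8ToUtf16(const std::string& utf8)
{
    if (utf8.empty())
        return std::wstring();

    // First call: ask for the required length, in wchar_t units
    const int lengthUtf16 = ::MultiByteToWideChar(
        CP_UTF8,                          // source is UTF-8
        MB_ERR_INVALID_CHARS,             // fail on invalid UTF-8 sequences
        utf8.data(),
        static_cast<int>(utf8.length()),
        nullptr,
        0);

    std::wstring utf16(lengthUtf16, L'\0');

    // Second call: do the actual conversion into the destination buffer
    ::MultiByteToWideChar(
        CP_UTF8,
        MB_ERR_INVALID_CHARS,
        utf8.data(),
        static_cast<int>(utf8.length()),
        &utf16[0],
        lengthUtf16);

    return utf16;
}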
Once you have the converted UTF-16 string, you can create a BSTR from it, and pass that as the output BSTR* parameter.
The main Win32 API to create a BSTR is SysAllocString(). There are also variants, such as SysAllocStringLen(), that let you specify the string length explicitly.
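For instance, a small helper based on SysAllocStringLen() might look like this sketch (it assumes the std::wstring produced by the conversion above; the caller is responsible for checking the result against nullptr, which signals an allocation failure):

#include <windows.h>
#include <oleauto.h>   // SysAllocStringLen, BSTR
#include <string>

// Sketch: create a BSTR that copies the contents of a UTF-16 std::wstring.
// SysAllocStringLen() copies exactly 'length' wchar_ts and appends the
// terminating NUL, so embedded NULs are preserved as well.
BSTR BstrFromUtf16(const std::wstring& utf16)
{
    return ::SysAllocStringLen(utf16.data(),
                               static_cast<UINT>(utf16.length()));
}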
Or, as a more convenient alternative, you can use ATL's CComBSTR class to wrap the BSTR in safe RAII boundaries, and call its Detach() method to pass the BSTR out through the output BSTR* parameter (Detach() releases CComBSTR's ownership, so its destructor won't free the string you are returning).
// The CComBSTR constructor allocates the BSTR copy of the UTF-16 string
CComBSTR bstrResult( /* UTF-16 string from std::string */ );
*restr = bstrResult.Detach();   // hand ownership of the BSTR to the caller
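Putting the pieces together, an end-to-end helper could look like the following sketch (the StringToBstr name and the UTF-8 assumption are mine, and it relies on the Utf8ToUtf16 helper sketched above):

#include <atlbase.h>   // CComBSTR
#include <string>

// Sketch: convert a UTF-8 std::string and return it through an output BSTR*.
HRESULT StringToBstr(const std::string& utf8, BSTR* restr)
{
    if (restr == nullptr)
        return E_POINTER;

    // Convert UTF-8 --> UTF-16 (helper sketched earlier in this answer)
    const std::wstring utf16 = Utf8ToUtf16(utf8);

    // CComBSTR's (length, pointer) constructor copies the wchar_t buffer,
    // preserving any embedded NULs
    CComBSTR bstrResult(static_cast<int>(utf16.length()), utf16.data());

    // Detach() transfers ownership of the BSTR to the caller
    *restr = bstrResult.Detach();
    return S_OK;
}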
Bonus reading:
Eric's Complete Guide To BSTR Semantics