The table has two columns, id and step_data.
id | step_data
--------------
a1 | {...}
a2 | {...}
a3 | {...}
step_data is a nested structure like the following, where the actual metadata keys can appear in any of the events objects:
{
  "events": [
    {
      "timestamp": "2021-04-07T17:46:13.739Z",
      "meta": [
        {
          "key": "length",
          "value": "0.898"
        },
        {
          "key": "height",
          "value": "607023104"
        },
        {
          "key": "weight",
          "value": "33509376"
        }
      ]
    },
    {
      "timestamp": "2021-04-07T17:46:13.781Z",
      "meta": [
        {
          "key": "color",
          "value": "0.007"
        },
        {
          "key": "count",
          "value": "641511424"
        },
        {
          "key": "age",
          "value": "0"
        }
      ]
    }
  ]
}
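For reference, here is a minimal setup that reproduces this structure (the table name my_table and the single sample row are just stand-ins for the real data, so the queries below are runnable):

-- stand-in schema and sample row, only to make the examples reproducible
create table my_table (
    id        text primary key,
    step_data jsonb not null
);

insert into my_table (id, step_data) values
('a1', '{"events": [
    {"timestamp": "2021-04-07T17:46:13.739Z",
     "meta": [{"key": "length", "value": "0.898"},
              {"key": "height", "value": "607023104"},
              {"key": "weight", "value": "33509376"}]},
    {"timestamp": "2021-04-07T17:46:13.781Z",
     "meta": [{"key": "color", "value": "0.007"},
              {"key": "count", "value": "641511424"},
              {"key": "age",   "value": "0"}]}
]}');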
I can extract one field like length pretty easily.
select cast(metadata ->> 'value' as double precision) as length,
       id
from (
         select jsonb_array_elements(jsonb_array_elements(step_data #> '{events}') #> '{meta}') metadata,
                id
         from my_table
     ) as parsed_keys
where metadata @> '{
  "key": "length"
}'::jsonb
| id | length | 
|---|---|
| a1 | 0.898 | 
| a2 | 0.800 | 
But what I really need is to extract the metadata for a couple of known keys, like length and color, as separate columns. I'm not sure how to get more than one column efficiently once I have split the array with jsonb_array_elements().
Is there an efficient way to do this without having to call jsonb_array_elements() again and do a join for every single key (that per-key approach is sketched further down)? For example, the result set should look like this:
| id | length | color | weight | 
|---|---|---|---|
| a1 | 0.898 | 0.007 | 33509376 | 
| a2 | 0.800 | 1.000 | 15812391 | 
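For reference, this is the kind of per-key approach I'd like to avoid: unnesting the arrays and joining back on id once for every key (a sketch with just length and color).

-- per-key approach I'd like to avoid: one unnesting pass plus one join per key
with lengths as (
    select id,
           cast(metadata ->> 'value' as double precision) as length
    from (select id,
                 jsonb_array_elements(jsonb_array_elements(step_data #> '{events}') #> '{meta}') as metadata
          from my_table) as parsed_keys
    where metadata @> '{"key": "length"}'::jsonb
),
colors as (
    select id,
           cast(metadata ->> 'value' as double precision) as color
    from (select id,
                 jsonb_array_elements(jsonb_array_elements(step_data #> '{events}') #> '{meta}') as metadata
          from my_table) as parsed_keys
    where metadata @> '{"key": "color"}'::jsonb
)
select lengths.id, lengths.length, colors.color
from lengths
join colors on colors.id = lengths.id;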
Using Postgres 11.7.
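The closest I have come up with so far is to pivot with conditional aggregation (max(...) filter (where ...)) over a single unnesting pass, roughly as in the sketch below, but I don't know whether grouping the whole exploded set like this counts as efficient, or whether there is a better way on 11.7:

-- candidate: unnest once, then pivot the known keys with FILTERed aggregates
select id,
       cast(max(metadata ->> 'value') filter (where metadata ->> 'key' = 'length') as double precision) as length,
       cast(max(metadata ->> 'value') filter (where metadata ->> 'key' = 'color')  as double precision) as color,
       cast(max(metadata ->> 'value') filter (where metadata ->> 'key' = 'weight') as bigint)           as weight
from (select id,
             jsonb_array_elements(jsonb_array_elements(step_data #> '{events}') #> '{meta}') as metadata
      from my_table) as parsed_keys
group by id;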