This is doable. I used a bunch of sub-selects, which makes the query a bit long and tedious. It can very likely be simplified further, and I haven't tested performance. I'm not sure which version of SQL Server you're using, but newer versions have window-function support that should let you simplify as well (more on that below). You'll have to tweak it either way.
I also added a Date Dimension table to simplify working with dates. As I said above, I'm of the opinion that almost all databases can benefit from a Date Dimension and a Numbers Table. There are a bazillion articles as to why and how, but I've always been a fan of Aaron Bertrand's articles.
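In case the Fiddle link ever goes stale, here's a minimal sketch of the kind of date dimension the query below expects. The column names (d, theDay, theMonth, theYear) are just the ones I used; a real Date Dimension would carry many more columns, and the row-by-row loop is for clarity, not speed.

    /* Minimal date dimension sketch -- column names match what the query below uses. */
    CREATE TABLE datedim (
        d        date NOT NULL PRIMARY KEY,
        theDay   int  NOT NULL,   -- day of month (1-31)
        theMonth int  NOT NULL,   -- month number (1-12)
        theYear  int  NOT NULL
    );

    /* Populate one year of dates; widen the range to whatever you need. */
    DECLARE @dt date = '2019-01-01';
    WHILE @dt < '2020-01-01'
    BEGIN
        INSERT INTO datedim (d, theDay, theMonth, theYear)
        VALUES (@dt, DAY(@dt), MONTH(@dt), YEAR(@dt));
        SET @dt = DATEADD(DAY, 1, @dt);
    END;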
SQL Fiddle (see the Fiddle for setup)
Query:
SELECT s5.id, s5.d, s5.col2, s5.col4
FROM (
    SELECT s4.id, s4.d, s4.col2, s4.theDay, s4.theYear
        /* 5. Smear the past data up to the next change. */
        , MAX(s4.col2) OVER (PARTITION BY s4.c1, s4.id) AS col4
    FROM (
        SELECT s1.d, s1.theDay, s1.theYear, s1.id, s2.col2
            /* 4. Identify the records that should be grouped in the window. */
            , COUNT(s2.col2) OVER (ORDER BY s1.id, s1.d) AS c1
        FROM (
            /* 1. Build the list of days for each id. */
            SELECT dd.d, dd.theDay, dd.theYear, s1.id
            FROM datedim dd
            CROSS APPLY (SELECT DISTINCT t.id FROM t1 t) s1
        ) s1
        /* 3. JOIN the two together. */
        LEFT OUTER JOIN (
            /* 2. Remove dupes from modified records. */
            SELECT s3.id, s3.col2, s3.modified
            FROM (
                SELECT t1.id, t1.col2, t1.modified, d1.theMonth AS monthModified
                    /* 2a. Use the ROW_NUMBER() window function to number changes within a month. */
                    , ROW_NUMBER() OVER (PARTITION BY t1.id, d1.theYear, d1.theMonth ORDER BY t1.modified DESC) AS rn
                FROM t1
                INNER JOIN datedim d1 ON t1.modified = d1.d
            ) s3
            WHERE s3.rn = 1
        ) s2 ON s1.d = s2.modified
            AND s1.id = s2.id
    ) s4
) s5
/* 6. Filter for only the 1st day of the month. */
WHERE s5.theDay = 1
    AND s5.theYear = year(getDate())
    AND s5.d <= getDate()
    /* 6a. Keeping the year and date filters out here (rather than inside s1) lets a color set before 1/1 carry forward from the last time it was set. */
ORDER BY s5.id, s5.d
This Gives You:
| id | d | col2 | col4 |
|----|------------|--------|--------|
| 1 | 2019-01-01 | (null) | (null) |
| 1 | 2019-02-01 | (null) | red |
| 1 | 2019-03-01 | (null) | green |
| 1 | 2019-04-01 | (null) | blue |
| 1 | 2019-05-01 | (null) | blue |
| 1 | 2019-06-01 | (null) | blue |
| 1 | 2019-07-01 | (null) | blue |
| 1 | 2019-08-01 | (null) | blue |
| 2 | 2019-01-01 | (null) | (null) |
| 2 | 2019-02-01 | (null) | green |
| 2 | 2019-03-01 | (null) | green |
| 2 | 2019-04-01 | (null) | blue |
| 2 | 2019-05-01 | (null) | red |
| 2 | 2019-06-01 | (null) | red |
| 2 | 2019-07-01 | (null) | red |
| 2 | 2019-08-01 | (null) | red |
| 3 | 2019-01-01 | (null) | yellow |
| 3 | 2019-02-01 | (null) | yellow |
| 3 | 2019-03-01 | (null) | green |
| 3 | 2019-04-01 | (null) | green |
| 3 | 2019-05-01 | (null) | green |
| 3 | 2019-06-01 | (null) | green |
| 3 | 2019-07-01 | (null) | green |
| 3 | 2019-08-01 | (null) | green |
| 4 | 2019-01-01 | (null) | (null) |
| 4 | 2019-02-01 | (null) | (null) |
| 4 | 2019-03-01 | (null) | green |
| 4 | 2019-04-01 | (null) | green |
| 4 | 2019-05-01 | (null) | green |
| 4 | 2019-06-01 | orange | orange |
| 4 | 2019-07-01 | (null) | orange |
| 4 | 2019-08-01 | (null) | orange |
I tried to comment the query so you can follow along with my logic. I also added a test case to the changes table to demonstrate how to pick the most recent value when two or more changes happen in the same month, plus a check for a color set in the previous year. If that color shouldn't be carried forward, move the year and date checks back up into s1.
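Roughly, that alternative looks like this: the two filters move from the outer WHERE into the derived table labeled s1 (untested, same caveats as the main query).

    /* Alternative for step 1: filter the calendar itself so nothing
       from a prior year (or the future) can be smeared forward. */
    SELECT dd.d, dd.theDay, dd.theYear, s1.id
    FROM datedim dd
    CROSS APPLY (SELECT DISTINCT t.id FROM t1 t) s1
    WHERE dd.theYear = year(getDate())
        AND dd.d <= getDate()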
Essentially, I used my date table to build a running "calendar" so I could "smear" the change data across the missing days: I applied those days to each id, joined in the most recent change per month, filled in the missing colors, and then kept just the 1st day of each month for each id.
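As for the "newer versions" simplification I mentioned at the top: if your version supports LAST_VALUE ... IGNORE NULLS (SQL Server 2022 and later does, as I recall), the counting-and-MAX smear in steps 4 and 5 collapses into a single window call. This is an untested sketch of the same query, not something I've run:

    SELECT q.id, q.d, q.col2, q.col4
    FROM (
        SELECT s1.id, s1.d, s1.theDay, s1.theYear, s2.col2
            /* One window call replaces steps 4 and 5:
               take the last non-null color at or before this day, per id. */
            , LAST_VALUE(s2.col2) IGNORE NULLS OVER (
                  PARTITION BY s1.id
                  ORDER BY s1.d
                  ROWS BETWEEN UNBOUNDED PRECEDING AND CURRENT ROW
              ) AS col4
        FROM (
            SELECT dd.d, dd.theDay, dd.theYear, ids.id
            FROM datedim dd
            CROSS APPLY (SELECT DISTINCT t.id FROM t1 t) ids
        ) s1
        LEFT OUTER JOIN (
            SELECT s3.id, s3.col2, s3.modified
            FROM (
                SELECT t1.id, t1.col2, t1.modified
                    , ROW_NUMBER() OVER (PARTITION BY t1.id, d1.theYear, d1.theMonth ORDER BY t1.modified DESC) AS rn
                FROM t1
                INNER JOIN datedim d1 ON t1.modified = d1.d
            ) s3
            WHERE s3.rn = 1
        ) s2 ON s1.d = s2.modified
            AND s1.id = s2.id
    ) q
    WHERE q.theDay = 1
        AND q.theYear = year(getDate())
        AND q.d <= getDate()
    ORDER BY q.id, q.d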
Note that with the Calendar Table / Date Dimension, it's easy to find the color for, say, the third Tuesday of each month if you ever need to.
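For example, assuming your date dimension also carries a weekday name (mine usually does; theDayName here is a made-up column name, so substitute whatever yours uses), the third Tuesday is just another filter:

    /* Hypothetical: theDayName is an assumed weekday-name column in datedim.
       The third occurrence of any weekday always falls on day 15 through 21. */
    SELECT dd.d
    FROM datedim dd
    WHERE dd.theDayName = 'Tuesday'
        AND dd.theDay BETWEEN 15 AND 21
        AND dd.theYear = year(getDate())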
And again, if you have a lot of ids and you're running this report in December, this could turn into a lot of data; you may have to massage it down to a manageable size.