I'm making a PyGame game, and I've written a Button class for on-screen buttons. This is the code:
import pygame
pygame.font.init()
dfont = pygame.font.Font('mfdfont.ttf', 64)
#button
class Button():
    def __init__(self, x, y, image, scale = 1, rot = 0, text_in = '', color = 'WHITE', xoff = 0, yoff = 0):
        self.xoff = xoff
        self.yoff = yoff
        self.x = x
        self.y = y
        self.scale = scale
        width = image.get_width()
        height = image.get_height()
        self.image = pygame.transform.rotozoom(image, rot, scale)
        self.text_in = text_in
        self.text = dfont.render(self.text_in, True, color)
        #centre the label on the scaled image, plus the optional offsets
        self.text_rect = self.text.get_rect(center=(self.x + width * scale / 2 + xoff, self.y + height * scale / 2 + yoff))
        self.rect = self.image.get_rect()
        self.rect.topleft = (x, y)
        self.clicked = False
    def draw(self, surface):
        action = False
        #get mouse position
        pos = pygame.mouse.get_pos()
        #check mouseover and clicked conditions
        if self.rect.collidepoint(pos):
            if pygame.mouse.get_pressed()[0] == 1 and self.clicked == False:
                self.clicked = True
                action = True
        if pygame.mouse.get_pressed()[0] == 0:
            self.clicked = False
        #draw button on screen
        surface.blit(self.image, (self.rect.x, self.rect.y))
        surface.blit(self.text, self.text_rect)
        return action
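For reference, this is roughly how I use the class; the image path, button position and surface name are just placeholders, not my real values:

button_img = pygame.image.load('button.png').convert_alpha()
start_button = Button(100, 200, button_img, scale = 0.5, text_in = 'START')

#inside the main loop, drawing onto the dummy surface
if start_button.draw(dummy_surface):
    print('start pressed')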
The problem is that I'm using two surfaces to draw the game to the screen. One is a dummy surface at a fixed 1024x2048 resolution that I draw everything to, and the second one is the real display surface, which can be resized to any resolution. Every frame, the dummy surface gets scaled and blitted onto the real surface, and the real surface is drawn to the screen. This gives me a resizable window without messing up the screen positions of UI and game elements.
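For context, the render step looks roughly like this (GAME_W, GAME_H, window and dummy_surface are placeholder names for my actual variables):

GAME_W, GAME_H = 1024, 2048
#real, resizable window surface
window = pygame.display.set_mode((512, 1024), pygame.RESIZABLE)
#fixed-resolution dummy surface that everything is drawn to
dummy_surface = pygame.Surface((GAME_W, GAME_H))

def present():
    #scale the dummy surface to the current window size and show it
    scaled = pygame.transform.scale(dummy_surface, window.get_size())
    window.blit(scaled, (0, 0))
    pygame.display.flip()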
The actual problem is that after implementing this second surface, click and touch input on buttons doesn't work anymore, because I'm effectively clicking on the real surface and not on the dummy surface the buttons are drawn on. I wonder if there is a way to redirect a click at a certain position on the real surface to the corresponding position on the dummy surface. Or maybe have the Button class listen for input on the real surface instead of the dummy surface it's drawn on.
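What I have in mind is something like this untested sketch, where window and dummy_surface are the same placeholder names as above and window_to_dummy is a helper I would add:

#untested idea: map a click position on the real window back to dummy-surface coordinates
def window_to_dummy(pos):
    win_w, win_h = window.get_size()
    dum_w, dum_h = dummy_surface.get_size()
    #the dummy surface is stretched to fill the whole window, so rescale both axes
    return (pos[0] * dum_w / win_w, pos[1] * dum_h / win_h)

Button.draw would then use the translated position instead of calling pygame.mouse.get_pos() directly, but I'm not sure this is the right approach or whether there's a cleaner way to do it.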
I added a "mobile mode": when the game is running on a mobile device, the two-surface rendering process is skipped and the game falls back to the classic single-surface rendering, which makes the buttons work on mobile devices (or devices that don't allow window resizing). This is a temporary fix for the mobile version, but it still leaves the desktop application unusable because the on-screen buttons don't work there. I must mention that the on-screen buttons (OSBs) are required; I won't replace them with keyboard controls. They must be on screen.