I have these two related Django models:
from django.db import models

class Item(models.Model):
    item_nbr = models.IntegerField(primary_key=True)
    created = models.DateTimeField(auto_now_add=True)
    item_nbr_desc = models.CharField(max_length=155)

class SetItem(models.Model):
    set_id = models.CharField(primary_key=True, default='', max_length=12)
    set_nbr = models.IntegerField()
    items = models.ForeignKey(Item, on_delete=models.CASCADE)
I'm running a script (periodically) that reads the Item table from another database and uses that dataframe to update the Item table in the Django database. I'm using the Django ORM to interface with the Django database.
script.py
import pandas as pd

from app.models import Item

item_table = pd.read_sql(__)
item_table = some_transformations(item_table.copy())

# Remove all the Items that will be updated
Item.objects.filter(item_nbr__in=item_table.item_nbr.unique()).delete()

item_table_records = item_table.to_dict('records')
item_instances = []
fields = item_table.keys()
for record in item_table_records:
    kwargs = {field: record[field] for field in fields}
    item_instances.append(Item(**kwargs))

Item.objects.bulk_create(item_instances)  # recreate the rows with the new data
The problem is that the SetItem rows are deleted every time I delete the related Items (because of the on_delete=models.CASCADE behavior). I want to update the Items without erasing the related SetItem rows, but I don't want to change the on_delete default, because only in this script do I need to upload a whole table; it's possible that I'll want to delete an Item in some other context, and there I expect the cascade behavior to keep working. What can I do? Is there a bulk update function that can perform a non-destructive update of the table?
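To make the intent concrete, the per-row behavior I'm after is roughly what update_or_create does (a minimal sketch, assuming the dataframe columns match the Item model fields), but I'd like it done in bulk and without the delete step:

from app.models import Item

for record in item_table_records:
    # Match on the primary key, update all the other fields in place
    Item.objects.update_or_create(
        item_nbr=record['item_nbr'],
        defaults={k: v for k, v in record.items() if k != 'item_nbr'},
    )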