Here is a version using glue, as @ha-pu suggested.
dbDeleteCustom <- function(x, con, schema_name, table_name) {
  # x: 1-row dataframe whose columns are the key columns used to match the row to delete.
  if (nrow(x) != 1) stop("Input dataframe must be exactly 1 row")

  # Build the WHERE clause, quoting identifiers and literals via the connection
  where_str <- purrr::map_chr(colnames(x), ~glue::glue_sql("{`.x`} = {x[[.x]]}", .con = con))
  where_str <- paste(where_str, collapse = " AND ")
  delete_str <- glue::glue("DELETE FROM {schema_name}.{table_name} WHERE {where_str}")

  # Execute ------------------------------------------------------------------
  query_res <- DBI::dbSendQuery(con, delete_str)
  rowsAffected <- DBI::dbGetRowsAffected(query_res)
  DBI::dbClearResult(query_res)
  return(rowsAffected)
}
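To see what the WHERE-clause construction produces without a real database, you can use DBI::ANSI(), a dummy connection that glue_sql accepts for quoting. This is a sketch; the user_id/role_id columns and their values are made up for illustration:

```r
library(DBI)
library(glue)
library(purrr)

# Hypothetical 1-row dataframe of key columns
x <- data.frame(user_id = 42L, role_id = 7L)

# DBI::ANSI() quotes identifiers/literals without connecting to anything
con <- DBI::ANSI()

where_parts <- purrr::map_chr(colnames(x), ~glue::glue_sql("{`.x`} = {x[[.x]]}", .con = con))
where_str <- paste(where_parts, collapse = " AND ")
where_str
# e.g. "user_id" = 42 AND "role_id" = 7
```

Note that glue_sql quotes each column name as an identifier (the backticks inside `{`.x`}`) and each value as a literal, which is what protects the generated DELETE from injection.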
This code is based on code from this post: How to pass data.frame for UPDATE with R DBI
Side thoughts:
This code would need to be adapted to permanently delete records by id. Best practice is to avoid permanently deleting a record; instead, use a soft-delete process, i.e. a column that records whether the record is deleted/invalid. The code above was written to delete a row from a bridging table, which could instead have been soft deleted.
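A soft-delete variant of the function above might look like the following sketch. The is_deleted flag column is an assumption for illustration, not part of the original code, and DBI::dbExecute() is used since an UPDATE returns no result set:

```r
library(DBI)
library(glue)
library(purrr)

# Sketch: flag the row instead of removing it (assumes a hypothetical
# is_deleted column on the target table).
dbSoftDeleteCustom <- function(x, con, schema_name, table_name) {
  if (nrow(x) != 1) stop("Input dataframe must be exactly 1 row")
  where_str <- purrr::map_chr(colnames(x), ~glue::glue_sql("{`.x`} = {x[[.x]]}", .con = con))
  where_str <- paste(where_str, collapse = " AND ")
  update_str <- glue::glue("UPDATE {schema_name}.{table_name} SET is_deleted = 1 WHERE {where_str}")
  DBI::dbExecute(con, update_str)  # returns the number of rows affected
}
```

The caller can then filter on is_deleted in later queries, so "deleted" rows remain recoverable.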
You could use stringr::str_c or glue::glue, and you could put the "# create list" step directly into the map call. Still, smart approach. – Beckford