Hello devs! 👋 Hopefully, I'm allowed to simply open a feature request here. Please tell me if I'm missing something.
With SQL model blueprints, it's possible to use a blueprint variable in the column names.
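For reference, the SQL-model pattern I have in mind looks roughly like this (a sketch only; the `blueprints` property syntax is taken from the SQLMesh docs, and the table/column names are made up to mirror the Python example below):

```sql
MODEL (
  name schema.@{name},
  kind FULL,
  blueprints (
    (name := FirstTable, column_name := FIRSTCOLUMN),
    (name := SecondTable, column_name := SECONDCOLUMN)
  )
);

-- The blueprint variable can be used directly in the column list:
SELECT
  @{column_name} AS @{column_name}
FROM @{name}Source;
```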
With a Python model that uses the decorated `def execute(...) -> pd.DataFrame: ...` function, the docs say:

> Because SQLMesh creates tables before evaluating models, the schema of the output DataFrame is a required argument. The `@model` argument `columns` contains a dictionary of column names to types.

So I need to declare the columns in the `@model` decorator, but they can't be blueprinted.
Here's a minimal non-working example:
```python
import typing as t
from datetime import datetime

import pandas as pd
from sqlmesh import ExecutionContext, model


@model(
    "schema.@{name}",  # blueprinting works here!
    kind="full",
    columns={
        "@{column_name}": "text",  # ... but not here
        "TRANSFORMED_@{column_name}": "text",
    },
    blueprints=[
        {"name": "FirstTable", "column_name": "FIRSTCOLUMN"},
        {"name": "SecondTable", "column_name": "SECONDCOLUMN"},
    ],
)
def execute(
    context: ExecutionContext,
    start: datetime,
    end: datetime,
    execution_time: datetime,
    **kwargs: t.Any,
) -> pd.DataFrame:
    """Execute the model."""
    # Get hold of the blueprint variables for the column name
    name = context.blueprint_var("name")
    column_name = context.blueprint_var("column_name")
    source_table = f"{name}Source"
    select_statement = f"SELECT {column_name} FROM {source_table};"
    output_column_name = f"TRANSFORMED_{column_name}"
    df = context.fetchdf(select_statement)
    df[output_column_name] = df[column_name].apply(some_fancy_transformation)
    return df
```
Is it possible/easy to support this pattern?