
Spark does not escape nested function types #105

Open
briankariuki opened this issue Jul 31, 2024 · 4 comments
Labels
bug Something isn't working

Comments

@briankariuki

Describe the bug
I have an entity with a schema option (news) that is a keyword list whose key types are defined under keys. One of those keys (publication_date) may be a function that returns a string for the field. When I compile this code, I get the following error:

Error: cannot inject attribute @urls into function/macro because cannot escape #Function<>

The code compiles when I remove {:fun, [:map], :string} from the list of type options for publication_date. The same function type is escaped correctly for the top-level :path option.

@urls %Spark.Dsl.Entity{
    name: :url,
    target: AshSitemap.Url,
    examples: [
      "url 'index.html', priority: 0.5"
    ],
    schema: [
      path: [
        type: {
          :or,
          [:string, {:fun, [:map], :string}]
        },
        required: true
      ],
      priority: [type: :float, required: false],
      news: [
        type: :non_empty_keyword_list,
        keys: [
          publication: [
            type: :string,
            required: true
          ],
          publication_date: [
            type: {
              :or,
              [:string, :atom, {:fun, [:map], :string}]
            },
            required: true
          ],
          title: [
            type: {
              :or,
              [:string, :atom]
            },
            required: true
          ],
          keywords: [
            type: {:list, :string},
            required: true
          ]
        ],
        required: false
      ]
    ],
    args: [:path]
  }

To Reproduce
Use the above urls entity as an example
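For illustration, here is a sketch of the kind of DSL call the entity is meant to support (hypothetical AshSitemap syntax, extrapolated from the entity's examples above; the field values are made up). Note that the compile error is raised while defining the extension module that holds @urls, before any such call is made:

url "/news/article.html",
  priority: 0.5,
  news: [
    publication: "Example Daily",
    # passing a one-argument function here is the use case that motivates
    # the {:fun, [:map], :string} type in the schema
    publication_date: fn article -> Date.to_iso8601(article.published_at) end,
    title: :title,
    keywords: ["example", "news"]
  ]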

Expected behavior
I expected Spark to parse the types of publication_date correctly.

Runtime

  • Elixir version 1.17
  • Erlang version 26.2.5
  • OS macOS 14.3.1
  • Spark version 2.2.10
@briankariuki briankariuki added the bug Something isn't working label Jul 31, 2024
@zachdaniel
Contributor

So Spark can't choose to escape or not escape types (that happens after Spark comes into play), but given how these specifications are generally used, what we should do is not have any three-tuple types.
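For context, a minimal sketch of the underlying Elixir limitation (not specific to Spark, using only the standard library): anonymous functions are runtime-only values that cannot be converted back into source AST, which is why attribute injection fails with "cannot escape #Function<>".

# Anonymous functions cannot be escaped into AST, so they cannot be
# injected through module attributes into a quoted context.
Macro.escape(fn article -> article[:publication_date] end)
# => ** (ArgumentError) cannot escape #Function<...> (message abbreviated)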

@zachdaniel
Copy link
Contributor

The function type needs some overhauling anyway, as we need to be able to name each argument type. For now, the type just won't be usable in that context.

@briankariuki
Author

So Spark can't choose to escape or not escape types (that happens after Spark comes into play), but given how these specifications are generally used, what we should do is not have any three-tuple types.

Thanks for the explanation. So this means it's impossible to allow a function to customize :publication_date?

@zachdaniel
Contributor

You would just need to use either a custom type or a less specific function type, e.g. {:fun, 1}, which means "any one-argument function" and can be successfully unquoted.
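A minimal sketch of that workaround, assuming the same urls entity as above; only the publication_date spec changes to an arity-only function type, so the option still accepts a one-argument function but the argument and return types are no longer described by the schema:

# in the news keys, replace the three-tuple function type with an
# arity-only spec, which can be escaped when @urls is injected
publication_date: [
  type: {:or, [:string, :atom, {:fun, 1}]},
  required: true
]

If stricter validation is needed, the other suggestion, a custom type, would validate the value with a named function instead; the exact shape of a custom type depends on the Spark.Options version in use.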
