ak.to_parquet_dataset
---------------------

.. py:module:: ak.to_parquet_dataset

Defined in `awkward.operations.ak_to_parquet_dataset <https://github.com/scikit-hep/awkward/blob/36da52cfa8846355c390beb6555eac1d31c27c26/src/awkward/operations/ak_to_parquet_dataset.py>`__ on `line 11 <https://github.com/scikit-hep/awkward/blob/36da52cfa8846355c390beb6555eac1d31c27c26/src/awkward/operations/ak_to_parquet_dataset.py#L11>`__.

.. py:function:: ak.to_parquet_dataset(directory, filenames=None, filename_extension=".parquet", storage_options=None)


    :param directory: A directory in which to write ``_common_metadata``
                  and ``_metadata``, making the directory of Parquet files into a dataset.
    :type directory: str or Path
    :param filenames: If None, the ``directory`` will be
                  recursively searched for files ending in ``filename_extension`` and
                  sorted lexicographically. Otherwise, this explicit list of files is
                  taken and row-groups are concatenated in its given order. If any
                  filenames are relative, they are interpreted relative to ``directory``.
    :type filenames: None or list of str or Path
    :param filename_extension: Filename extension (including ``.``) to use to
                           search for files recursively. Ignored if ``filenames`` is not None.
    :type filename_extension: str

Creates a ``_common_metadata`` and a ``_metadata`` in a directory of Parquet files.

.. code-block:: python

    >>> ak.to_parquet(array1, "/directory/arr1.parquet", parquet_compliant_nested=True)
    >>> ak.to_parquet(array2, "/directory/arr2.parquet", parquet_compliant_nested=True)
    >>> ak.to_parquet_dataset("/directory")

The ``_common_metadata`` contains the schema that all files share. (If the files
have different schemas, this function raises an exception.)
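
Because the schema is shared, the whole directory can then be read back as a
single array; a sketch, assuming the dataset written above:

.. code-block:: python

    >>> combined = ak.from_parquet("/directory")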

The ``_metadata`` contains row-group metadata used to seek to specific row-groups
within the multi-file dataset.
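
For example, individual row-groups can be selected when reading the dataset
back; a sketch using the ``row_groups`` argument of ``ak.from_parquet``,
assuming the dataset written above:

.. code-block:: python

    >>> first = ak.from_parquet("/directory", row_groups=[0])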