From 410d18d8925426d603322294e0deac5f19eedd7f Mon Sep 17 00:00:00 2001 From: Mike Woofter <108414937+mongoKart@users.noreply.github.com> Date: Wed, 9 Apr 2025 20:28:26 -0500 Subject: [PATCH 01/17] wip --- source/aggregation.txt | 6 + source/aggregation/builders.txt | 944 ++++++++++++++++++++++++++++++++ 2 files changed, 950 insertions(+) create mode 100644 source/aggregation/builders.txt diff --git a/source/aggregation.txt b/source/aggregation.txt index 404bf1ea..8e794b8a 100644 --- a/source/aggregation.txt +++ b/source/aggregation.txt @@ -17,6 +17,12 @@ Aggregation :depth: 2 :class: singlecol +.. toctree:: + :titlesonly: + :maxdepth: 1 + + Builders Syntax + Overview -------- diff --git a/source/aggregation/builders.txt b/source/aggregation/builders.txt new file mode 100644 index 00000000..09f8da9c --- /dev/null +++ b/source/aggregation/builders.txt @@ -0,0 +1,944 @@ +.. _csharp-aggregation-builders: + +========================================= +Builders Syntax for Aggregation Pipelines +========================================= + +.. facet:: + :name: genre + :values: reference + +.. meta:: + :keywords: code examples, dotnet + +.. contents:: On this page + :local: + :backlinks: none + :depth: 2 + :class: singlecol + +Overview +-------- + +This page describes the aggregation stages available in the {+driver-short+}. + +Match +----- + +Use the ``match()`` method to create a :manual:`$match ` +pipeline stage that matches incoming documents against the specified +query filter, filtering out documents that do not match. + +.. tip:: + + The filter can be an instance of any class that implements ``Bson``, but it's + convenient to combine with use of the :ref:`Filters ` class. + +The following example creates a pipeline stage that matches all documents where the +``title`` field is equal to "The Shawshank Redemption": + +.. literalinclude:: /includes/fundamentals/code-snippets/builders/AggBuilders.java + :start-after: begin match + :end-before: end match + :language: java + :dedent: + + +Project +------- + +Use the ``project()`` method to create a :manual:`$project ` +pipeline stage that project specified document fields. Field projection +in aggregation follows the same rules as :ref:`field projection in queries `. + +.. tip:: + + Though the projection can be an instance of any class that implements ``Bson``, + it's convenient to combine with use of :ref:`Projections `. + +The following example creates a pipeline stage that excludes the ``_id`` field but +includes the ``title`` and ``plot`` fields: + +.. literalinclude:: /includes/fundamentals/code-snippets/builders/AggBuilders.java + :start-after: begin project + :end-before: end project + :language: java + :dedent: + +Projecting Computed Fields +~~~~~~~~~~~~~~~~~~~~~~~~~~ + +The ``$project`` stage can project computed fields as well. + +The following example creates a pipeline stage that projects the ``rated`` field +into a new field called ``rating``, effectively renaming the field. + +.. literalinclude:: /includes/fundamentals/code-snippets/builders/AggBuilders.java + :start-after: begin computed + :end-before: end computed + :language: java + :dedent: + + +Documents +--------- + +Use the ``documents()`` method to create a +:manual:`$documents ` +pipeline stage that returns literal documents from input values. + +.. important:: + + If you use a ``$documents`` stage in an aggregation pipeline, it must be the first + stage in the pipeline. 
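+
+As a point of reference, the kind of call that builds and runs this stage with the
+driver's builders API might resemble the following sketch. This example is
+illustrative only: the ``mongoClient`` variable, the database name, and the title
+values are placeholders, and it assumes a driver version that provides the
+``Aggregates.documents()`` helper described above.
+
+.. code-block:: java
+
+   import java.util.Arrays;
+
+   import org.bson.Document;
+   import org.bson.conversions.Bson;
+
+   import com.mongodb.client.MongoDatabase;
+   import com.mongodb.client.model.Aggregates;
+
+   // Build a $documents stage from literal input documents
+   Bson documentsStage = Aggregates.documents(Arrays.asList(
+           new Document("title", "Example Title A"),
+           new Document("title", "Example Title B")));
+
+   // $documents supplies the pipeline input, so run the pipeline on a database
+   MongoDatabase database = mongoClient.getDatabase("sample_mflix");
+   database.aggregate(Arrays.asList(documentsStage))
+           .forEach(doc -> System.out.println(doc.toJson()));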
+ +The following example creates a pipeline stage that creates +sample documents with a ``title`` field: + +.. literalinclude:: /includes/fundamentals/code-snippets/builders/AggBuilders.java + :start-after: // begin documents + :end-before: // end documents + :language: java + :dedent: + +.. important:: + + If you use the ``documents()`` method to provide the input to an aggregation pipeline, + you must call the ``aggregate()`` method on a database instead of on a + collection. + + +Sample +------ + +Use the ``sample()`` method to create a :manual:`$sample ` +pipeline stage to randomly select documents from input. + +The following example creates a pipeline stage that randomly selects 5 documents: + +.. literalinclude:: /includes/fundamentals/code-snippets/builders/AggBuilders.java + :start-after: // begin sample + :end-before: // end sample + :language: java + :dedent: + + + +Sort +---- + +Use the ``sort()`` method to create a :manual:`$sort ` +pipeline stage to sort by the specified criteria. + +.. tip:: + + Though the sort criteria can be an instance of any class that + implements ``Bson``, it's convenient to combine with use of :ref:`Sorts `. + +The following example creates a pipeline stage that sorts in descending order according +to the value of the ``year`` field and then in ascending order according to the +value of the ``title`` field: + +.. literalinclude:: /includes/fundamentals/code-snippets/builders/AggBuilders.java + :start-after: // begin sortStage + :end-before: // end sortStage + :language: java + :dedent: + +Skip +---- + +Use the ``skip()`` method to create a :manual:`$skip ` +pipeline stage to skip over the specified number of documents before +passing documents into the next stage. + +The following example creates a pipeline stage that skips the first ``5`` documents: + +.. literalinclude:: /includes/fundamentals/code-snippets/builders/AggBuilders.java + :start-after: // begin skip + :end-before: // end skip + :language: java + :dedent: + +Limit +----- + +Use the :manual:`$limit ` pipeline stage +to limit the number of documents passed to the next stage. + +The following example creates a pipeline stage that limits the number of documents to ``10``: + +.. literalinclude:: /includes/fundamentals/code-snippets/builders/AggBuilders.java + :start-after: // begin limit + :end-before: // end limit + :language: java + :dedent: + +Lookup +------ + +Use the ``lookup()`` method to create a :manual:`$lookup ` +pipeline stage to perform joins and uncorrelated subqueries between two collections. + +Left Outer Join +~~~~~~~~~~~~~~~ + +The following example creates a pipeline stage that performs a left outer +join between the ``movies`` and ``comments`` collections: + +- It joins the ``_id`` field from ``movies`` to the ``movie_id`` field in ``comments`` +- It outputs the results in the ``joined_comments`` field: + +.. literalinclude:: /includes/fundamentals/code-snippets/builders/AggBuilders.java + :start-after: // begin basic lookup + :end-before: // end basic lookup + :language: java + :dedent: + +Full Join and Uncorrelated Subqueries +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ + +The following example creates a pipeline stage that joins two collections, ``orders`` +and ``warehouses``, by the item and whether the available quantity is enough +to fulfill the ordered quantity: + +.. 
literalinclude:: /includes/fundamentals/code-snippets/builders/AggBuilders.java + :start-after: // begin advanced lookup + :end-before: // end advanced lookup + :language: java + :dedent: + +Group +----- + +Use the ``group()`` method to create a :manual:`$group ` +pipeline stage to group documents by a specified expression and output a document +for each distinct grouping. + +.. tip:: + + The driver includes the `Accumulators <{+core-api+}/client/model/Accumulators.html>`__ + class with static factory methods for each of the supported accumulators. + +The following example creates a pipeline stage that groups documents by the value +of the ``customerId`` field. Each group accumulates the sum and average +of the values of the ``quantity`` field into the ``totalQuantity`` and +``averageQuantity`` fields. + +.. literalinclude:: /includes/fundamentals/code-snippets/builders/AggBuilders.java + :start-after: // begin group + :end-before: // end group + :language: java + :dedent: + +Learn more about accumulator operators from the Server manual section +on :manual:`Accumulators `. + +.. _java_aggregates_pick_n: + +Pick-N Accumulators +------------------- + +The pick-n accumulators are aggregation accumulation operators that return +the top and bottom elements given a specific ordering. Use one of the +following builders to create an aggregation accumulation operator: + +- :ref:`minN() ` +- :ref:`maxN() ` +- :ref:`firstN() ` +- :ref:`lastN() ` +- :ref:`top() ` +- :ref:`topN() ` +- :ref:`bottom() ` +- :ref:`bottomN() ` + +.. tip:: + + You can only perform aggregation operations with these pick-n accumulators + when running MongoDB v5.2 or later. + +Learn which aggregation pipeline stages you can use accumulator operators with +from the Server manual section on +:manual:`Accumulators `. + +.. _java_aggregates_min_n: + +MinN +~~~~ + +The ``minN()`` builder creates the :manual:`$minN ` +accumulator which returns data from documents that contain the ``n`` lowest +values of a grouping. + +.. tip:: + + The ``$minN`` and ``$bottomN`` accumulators can perform similar tasks. + See + :manual:`Comparison of $minN and $bottomN Accumulators ` + for recommended usage of each. + +The following example demonstrates how to use the ``minN()`` method to return +the lowest three ``imdb.rating`` values for movies, grouped by ``year``: + +.. literalinclude:: /includes/fundamentals/code-snippets/builders/AccumulatorsPickN.java + :start-after: // begin minN accumulator + :end-before: // end minN accumulator + :language: java + :dedent: + +See the `minN() API documentation <{+core-api+}/client/model/Accumulators.html#minN(java.lang.String,InExpression,NExpression)>`__ +for more information. + +.. _java_aggregates_max_n: + +MaxN +~~~~ + +The ``maxN()`` accumulator returns data from documents that contain the ``n`` +highest values of a grouping. + +The following example demonstrates how to use the ``maxN()`` method to +return the highest two ``imdb.rating`` values for movies, grouped by ``year``: + +.. literalinclude:: /includes/fundamentals/code-snippets/builders/AccumulatorsPickN.java + :start-after: // begin maxN accumulator + :end-before: // end maxN accumulator + :language: java + :dedent: + +See the `maxN() API documentation <{+core-api+}/client/model/Accumulators.html#maxN(java.lang.String,InExpression,NExpression)>`__ +for more information. + +.. _java_aggregates_first_n: + +FirstN +~~~~~~ + +The ``firstN()`` accumulator returns data from the first ``n`` documents in +each grouping for the specified sort order. + +.. 
tip::
+
+   The ``$firstN`` and ``$topN`` accumulators can perform similar tasks.
+   See
+   :manual:`Comparison of $firstN and $topN Accumulators `
+   for recommended usage of each.
+
+The following example demonstrates how to use the ``firstN()`` method to
+return the first four movie ``title`` values, based on the order they came
+into the stage, grouped by ``year``:
+
+.. literalinclude:: /includes/fundamentals/code-snippets/builders/AccumulatorsPickN.java
+   :start-after: // begin firstN accumulator
+   :end-before: // end firstN accumulator
+   :language: java
+   :dedent:
+
+See the `firstN() API documentation <{+core-api+}/client/model/Accumulators.html#firstN(java.lang.String,InExpression,NExpression)>`__
+for more information.
+
+.. _java_aggregates_last_n:
+
+LastN
+~~~~~
+
+The ``lastN()`` accumulator returns data from the last ``n`` documents in
+each grouping for the specified sort order.
+
+The following example demonstrates how to use the ``lastN()`` method to show
+the last three movie ``title`` values, based on the order they came into
+the stage, grouped by ``year``:
+
+.. literalinclude:: /includes/fundamentals/code-snippets/builders/AccumulatorsPickN.java
+   :start-after: // begin lastN accumulator
+   :end-before: // end lastN accumulator
+   :language: java
+   :dedent:
+
+See the `lastN() API documentation <{+core-api+}/client/model/Accumulators.html#lastN(java.lang.String,InExpression,NExpression)>`__
+for more information.
+
+.. _java_aggregates_top:
+
+Top
+~~~
+
+The ``top()`` accumulator returns data from the first document in a group
+based on the specified sort order.
+
+The following example demonstrates how to use the ``top()`` method to return
+the ``title`` and ``imdb.rating`` values for the top rated movies based on the
+``imdb.rating``, grouped by ``year``.
+
+.. literalinclude:: /includes/fundamentals/code-snippets/builders/AccumulatorsPickN.java
+   :start-after: // begin top accumulator
+   :end-before: // end top accumulator
+   :language: java
+   :dedent:
+
+See the `top() API documentation <{+core-api+}/client/model/Accumulators.html#top(java.lang.String,org.bson.conversions.Bson,OutExpression)>`__
+for more information.
+
+.. _java_aggregates_top_n:
+
+TopN
+~~~~
+
+The ``topN()`` accumulator returns data from documents that contain the
+highest ``n`` values for the specified field.
+
+.. tip::
+
+   The ``$firstN`` and ``$topN`` accumulators can perform similar tasks.
+   See
+   :manual:`Comparison of $firstN and $topN Accumulators `
+   for recommended usage of each.
+
+The following example demonstrates how to use the ``topN()`` method to return
+the ``title`` and ``runtime`` values of the three longest movies based on the
+``runtime`` values, grouped by ``year``.
+
+.. literalinclude:: /includes/fundamentals/code-snippets/builders/AccumulatorsPickN.java
+   :start-after: // begin topN accumulator
+   :end-before: // end topN accumulator
+   :language: java
+   :dedent:
+
+See the `topN() API documentation <{+core-api+}/client/model/Accumulators.html#topN(java.lang.String,org.bson.conversions.Bson,OutExpression,NExpression)>`__
+for more information.
+
+.. _java_aggregates_bottom:
+
+Bottom
+~~~~~~
+
+The ``bottom()`` accumulator returns data from the last document in a group
+based on the specified sort order.
+
+The following example demonstrates how to use the ``bottom()`` method to
+return the ``title`` and ``runtime`` values of the shortest movie based on the
+``runtime`` value, grouped by ``year``.
+
+.. 
literalinclude:: /includes/fundamentals/code-snippets/builders/AccumulatorsPickN.java + :start-after: // begin bottom accumulator + :end-before: // end bottom accumulator + :language: java + :dedent: + +See the `bottom() API documentation <{+core-api+}/client/model/Accumulators.html#bottom(java.lang.String,org.bson.conversions.Bson,OutExpression)>`__ +for more information. + +.. _java_aggregates_bottom_n: + +BottomN +~~~~~~~ + +The ``bottomN()`` accumulator returns data from documents that contain the +lowest ``n`` values for the specified field. + +.. tip:: + + The ``$minN`` and ``$bottomN`` accumulators can perform similar tasks. + See :manual:`Comparison of $minN and $bottomN Accumulators ` + for recommended usage of each. + +The following example demonstrates how to use the ``bottomN()`` method to +return the ``title`` and ``imdb.rating`` values of the two lowest rated movies +based on the ``imdb.rating`` value, grouped by ``year``: + +.. literalinclude:: /includes/fundamentals/code-snippets/builders/AccumulatorsPickN.java + :start-after: // begin bottomN accumulator + :end-before: // end bottomN accumulator + :language: java + :dedent: + +See the `bottomN() API documentation <{+core-api+}/client/model/Accumulators.html#bottomN(java.lang.String,org.bson.conversions.Bson,OutExpression,NExpression)>`__ +for more information. + +Unwind +------ + +Use the ``unwind()`` method to create an :manual:`$unwind ` +pipeline stage to deconstruct an array field from input documents, creating +an output document for each array element. + +The following example creates a document for each element in the ``sizes`` array: + +.. literalinclude:: /includes/fundamentals/code-snippets/builders/AggBuilders.java + :start-after: // begin unwindStage + :end-before: // end unwindStage + :language: java + :dedent: + +To preserve documents that have missing or ``null`` +values for the array field, or where array is empty: + + +.. literalinclude:: /includes/fundamentals/code-snippets/builders/AggBuilders.java + :start-after: // begin unwindPreserve + :end-before: // end unwindPreserve + :language: java + :dedent: + +To include the array index, in this example in a field called ``"position"``: + +.. literalinclude:: /includes/fundamentals/code-snippets/builders/AggBuilders.java + :start-after: // begin unwindIndex + :end-before: // end unwindIndex + :language: java + :dedent: + +Out +--- + +Use the ``out()`` method to create an :manual:`$out ` +pipeline stage that writes all documents to the specified collection in +the same database. + +.. important:: + + The ``$out`` stage must be the last stage in any aggregation pipeline. + +The following example writes the results of the pipeline to the ``authors`` +collection: + +.. literalinclude:: /includes/fundamentals/code-snippets/builders/AggBuilders.java + :start-after: // begin out + :end-before: // end out + :language: java + :dedent: + +Merge +----- + +Use the ``merge()`` method to create a :manual:`$merge ` +pipeline stage that merges all documents into the specified collection. + +.. important:: + + The ``$merge`` stage must be the last stage in any aggregation pipeline. + +The following example merges the pipeline into the ``authors`` collection using the default +options: + +.. 
literalinclude:: /includes/fundamentals/code-snippets/builders/AggBuilders.java + :start-after: // begin mergeStage + :end-before: // end mergeStage + :language: java + :dedent: + +The following example merges the pipeline into the ``customers`` collection in the +``reporting`` database using some options that specify to replace +the document if both ``date`` and ``customerId`` match, otherwise insert the +document: + +.. literalinclude:: /includes/fundamentals/code-snippets/builders/AggBuilders.java + :start-after: // begin mergeOptions + :end-before: // end mergeOptions + :language: java + :dedent: + +GraphLookup +----------- + +Use the ``graphLookup()`` method to create a :manual:`$graphLookup ` +pipeline stage that performs a recursive search on a specified collection to match +a specified field in one document to a specified field of another document. + +The following example computes the social network graph for users in the +``contacts`` collection, recursively matching the value in the ``friends`` field +to the ``name`` field: + +.. literalinclude:: /includes/fundamentals/code-snippets/builders/AggBuilders.java + :start-after: // begin graphLookupBasic + :end-before: // end graphLookupBasic + :language: java + :dedent: + +Using ``GraphLookupOptions``, you can specify the depth to recurse as well as +the name of the depth field, if desired. In this example, ``$graphLookup`` will +recurse up to two times, and create a field called ``degrees`` with the +recursion depth information for every document. + +.. literalinclude:: /includes/fundamentals/code-snippets/builders/AggBuilders.java + :start-after: // begin graphLookupDepth + :end-before: // end graphLookupDepth + :language: java + :dedent: + +Using ``GraphLookupOptions``, you can specify a filter that documents must match +in order for MongoDB to include them in your search. In this +example, only links with "golf" in their ``hobbies`` field will be included. + +.. literalinclude:: /includes/fundamentals/code-snippets/builders/AggBuilders.java + :start-after: // begin graphLookupMatch + :end-before: // end graphLookupMatch + :language: java + :dedent: + +SortByCount +----------- + +Use the ``sortByCount()`` method to create a :manual:`$sortByCount ` +pipeline stage that groups documents by a given expression and then sorts +these groups by count in descending order. + +.. tip:: + + The ``$sortByCount`` stage is identical to a ``$group`` stage with a + ``$sum`` accumulator followed by a ``$sort`` stage. + + .. code-block:: json + + [ + { "$group": { "_id": , "count": { "$sum": 1 } } }, + { "$sort": { "count": -1 } } + ] + +The following example groups documents by the truncated value of the field ``x`` +and computes the count for each distinct value: + +.. literalinclude:: /includes/fundamentals/code-snippets/builders/AggBuilders.java + :start-after: // begin sortByCount + :end-before: // end sortByCount + :language: java + :dedent: + + +ReplaceRoot +----------- + +Use the ``replaceRoot()`` method to create a :manual:`$replaceRoot ` +pipeline stage that replaces each input document with the specified document. + +The following example replaces each input document with the nested document +in the ``spanish_translation`` field: + +.. 
literalinclude:: /includes/fundamentals/code-snippets/builders/AggBuilders.java + :start-after: // begin replaceRoot + :end-before: // end replaceRoot + :language: java + :dedent: + +AddFields +--------- + +Use the ``addFields()`` method to create an :manual:`$addFields ` +pipeline stage that adds new fields to documents. + +.. tip:: + + Use ``$addFields`` when you do not want to project field inclusion + or exclusion. + +The following example adds two new fields, ``a`` and ``b`` to the input documents: + +.. literalinclude:: /includes/fundamentals/code-snippets/builders/AggBuilders.java + :start-after: // begin addFields + :end-before: // end addFields + :language: java + :dedent: + +Count +----- + +Use the ``count()`` method to create a :manual:`$count ` +pipeline stage that counts the number of documents that enter the stage, and assigns +that value to a specified field name. If you do not specify a field, +``count()`` defaults the field name to "count". + +.. tip:: + + The ``$count`` stage is syntactic sugar for: + + .. code-block:: json + + { "$group":{ "_id": 0, "count": { "$sum" : 1 } } } + +The following example creates a pipeline stage that outputs the count of incoming +documents in a field called "total": + +.. literalinclude:: /includes/fundamentals/code-snippets/builders/AggBuilders.java + :start-after: // begin count + :end-before: // end count + :language: java + :dedent: + + +Bucket +------ + +Use the ``bucket()`` method to create a :manual:`$bucket ` +pipeline stage that automates the bucketing of data around predefined boundary +values. + +The following example creates a pipeline stage that groups incoming documents based +on the value of their ``screenSize`` field, inclusive of the lower boundary +and exclusive of the upper boundary. + +.. literalinclude:: /includes/fundamentals/code-snippets/builders/AggBuilders.java + :start-after: // begin basicBucket + :end-before: // end basicBucket + :language: java + :dedent: + +Use the ``BucketOptions`` class to specify a default bucket for values +outside of the specified boundaries, and to specify additional accumulators. + +The following example creates a pipeline stage that groups incoming documents based +on the value of their ``screenSize`` field, counting the number of documents +that fall within each bucket, pushing the value of ``screenSize`` into a +field called ``matches``, and capturing any screen sizes greater than "70" +into a bucket called "monster" for monstrously large screen sizes: + +.. tip:: + + The driver includes the `Accumulators <{+core-api+}/client/model/Accumulators.html>`__ + class with static factory methods for each of the supported accumulators. + +.. literalinclude:: /includes/fundamentals/code-snippets/builders/AggBuilders.java + :start-after: // begin bucketOptions + :end-before: // end bucketOptions + :language: java + :dedent: + +BucketAuto +---------- + +Use the ``bucketAuto()`` method to create a :manual:`$bucketAuto ` +pipeline stage that automatically determines the boundaries of each bucket +in its attempt to distribute the documents evenly into a specified number of buckets. + +The following example creates a pipeline stage that will attempt to create and evenly +distribute documents into *10* buckets using the value of their ``price`` field: + +.. 
literalinclude:: /includes/fundamentals/code-snippets/builders/AggBuilders.java + :start-after: // begin bucketAutoBasic + :end-before: // end bucketAutoBasic + :language: java + :dedent: + + +Use the ``BucketAutoOptions`` class to specify a :wikipedia:`preferred number ` +based scheme to set boundary values, and specify additional accumulators. + +The following example creates a pipeline stage that will attempt to create and evenly +distribute documents into *10* buckets using the value of their ``price`` field, +setting the bucket boundaries at powers of 2 (2, 4, 8, 16, ...). It also counts +the number of documents in each bucket, and calculates their average ``price`` +in a new field called ``avgPrice``: + +.. tip:: + + The driver includes the `Accumulators <{+core-api+}/client/model/Accumulators.html>`__ + class with static factory methods for each of the supported accumulators. + +.. literalinclude:: /includes/fundamentals/code-snippets/builders/AggBuilders.java + :start-after: // begin bucketAutoOptions + :end-before: // end bucketAutoOptions + :language: java + :dedent: + +Facet +----- + +Use the ``facet()`` method to create a :manual:`$facet ` +pipeline stage that allows for the definition of parallel pipelines. + +The following example creates a pipeline stage that executes two parallel aggregations: + +- The first aggregation distributes incoming documents into 5 groups according to + their ``attributes.screen_size`` field. + +- The second aggregation counts all *manufacturers* and returns their count, limited + to the top **5**. + +.. literalinclude:: /includes/fundamentals/code-snippets/builders/AggBuilders.java + :start-after: // begin facet + :end-before: // end facet + :language: java + :dedent: + +.. _builders-aggregates-setWindowFields: + +SetWindowFields +--------------- + +Use the ``setWindowFields()`` method to create a :manual:`$setWindowFields ` +pipeline stage that allows using window operators to perform operations +on a specified span of documents in a collection. + +.. tip:: Window Functions + + The driver includes the `Windows <{+core-api+}/client/model/Windows.html>`__ + class with static factory methods for building windowed computations. + +The following example creates a pipeline stage that computes the +accumulated rainfall and the average temperature over the past month for +each locality from more fine-grained measurements presented in the ``rainfall`` +and ``temperature`` fields: + +.. literalinclude:: /includes/fundamentals/code-snippets/builders/AggBuilders.java + :start-after: // begin setWindowFields + :end-before: // end setWindowFields + :language: java + :dedent: + +Densify +------- + +Use the ``densify()`` method to create a +:manual:`$densify ` pipeline +stage that generates a sequence of documents to span a specified interval. + +.. tip:: + + You can use the ``$densify()`` aggregation stage only when running + MongoDB v5.1 or later. + +Consider the following documents retrieved from the :atlas:`Atlas sample weather dataset ` +that contain measurements for a similar ``position`` field, spaced one hour +apart: + +.. code-block:: none + :copyable: false + + Document{{ _id=5553a..., position=Document{{type=Point, coordinates=[-47.9, 47.6]}}, ts=Mon Mar 05 08:00:00 EST 1984, ... }} + Document{{ _id=5553b..., position=Document{{type=Point, coordinates=[-47.9, 47.6]}}, ts=Mon Mar 05 09:00:00 EST 1984, ... 
}} + +Suppose you needed to create a pipeline stage that performs the following +actions on these documents: + +- Add a document at every 15-minute interval for which a ``ts`` value does not + already exist. +- Group the documents by the ``position`` field. + +The call to the ``densify()`` aggregation stage builder that accomplishes +these actions resembles the following: + +.. literalinclude:: /includes/fundamentals/code-snippets/builders/AggregateDensify.java + :start-after: // begin densify aggregate + :end-before: // end densify aggregate + :language: java + :dedent: + +The following output highlights the documents generated by the aggregate stage +which contain ``ts`` values every 15 minutes between the existing documents: + +.. code-block:: none + :emphasize-lines: 2-4 + :copyable: false + + Document{{ _id=5553a..., position=Document{{type=Point, coordinates=[-47.9, 47.6]}}, ts=Mon Mar 05 08:00:00 EST 1984, ... }} + Document{{ position=Document{{coordinates=[-47.9, 47.6]}}, ts=Mon Mar 05 08:15:00 EST 1984 }} + Document{{ position=Document{{coordinates=[-47.9, 47.6]}}, ts=Mon Mar 05 08:30:00 EST 1984 }} + Document{{ position=Document{{coordinates=[-47.9, 47.6]}}, ts=Mon Mar 05 08:45:00 EST 1984 }} + Document{{ _id=5553b..., position=Document{{type=Point, coordinates=[-47.9, 47.6]}}, ts=Mon Mar 05 09:00:00 EST 1984, ... }} + +See the `densify package API documentation <{+core-api+}/client/model/densify/package-summary.html>`__ +for more information. + +Fill +---- + +Use the ``fill()`` method to create a +`$fill `__ +pipeline stage that populates ``null`` and missing field values. + +.. tip:: + + You can use the ``$fill()`` aggregation stage only when running + MongoDB v5.3 or later. + +Consider the following documents that contain temperature and air pressure +measurements at an hourly interval: + +.. code-block:: none + :copyable: false + + Document{{_id=6308a..., hour=1, temperature=23C, air_pressure=29.74}} + Document{{_id=6308b..., hour=2, temperature=23.5C}} + Document{{_id=6308c..., hour=3, temperature=null, air_pressure=29.76}} + +Suppose you needed to populate missing temperature and air pressure +data points in the documents as follows: + +- Populate the ``air_pressure`` field for hour "2" using linear interpolation + to calculate the value. +- Set the missing ``temperature`` value to "23.6C" for hour "3". + +The call to the ``fill()`` aggregation stage builder that accomplishes +these actions resembles the following: + +.. literalinclude:: /includes/fundamentals/code-snippets/builders/AggregateFill.java + :start-after: // begin fill aggregate + :end-before: // end fill aggregate + :language: java + :dedent: + +The following output highlights the documents that contain fields populated +by the aggregate stage: + +.. code-block:: none + :emphasize-lines: 2,3 + :copyable: false + + Document{{_id=6308a..., hour=1, temperature=23C, air_pressure=29.74}} + Document{{_id=6308b..., hour=2, temperature=23.5C, air_pressure=29.75}} + Document{{_id=6308c..., hour=3, temperature=23.6C, air_pressure=29.76}} + +See the `fill package API documentation <{+core-api+}/client/model/fill/package-summary.html>`__ +for more information. + +Atlas Full-Text Search +---------------------- + +Use the ``search()`` method to create a :manual:`$search ` +pipeline stage that specifies a full-text search of one or more fields. + +.. 
tip:: Only Available on Atlas for MongoDB v4.2 and later + + This aggregation pipeline operator is only available for collections hosted + on :atlas:`MongoDB Atlas ` clusters running v4.2 or later that are + covered by an :atlas:`Atlas search index `. + Learn more about the required setup and the functionality of this operator + from the :ref:`Atlas Search ` documentation. + +The following example creates a pipeline stage that searches the ``title`` +field for text that contains the word "Future": + +.. literalinclude:: /includes/fundamentals/code-snippets/builders/AggregateSearchBuilderExample.java + :start-after: // begin atlasTextSearch + :end-before: // end atlasTextSearch + :language: java + :dedent: + +Learn more about the builders from the +`search package API documentation <{+core-api+}/client/model/search/package-summary.html>`__. + +Atlas Search Metadata +--------------------- + +Use the ``searchMeta()`` method to create a +:manual:`$searchMeta ` +pipeline stage which returns only the metadata part of the results from +Atlas full-text search queries. + +.. tip:: Only Available on Atlas for MongoDB v4.4.11 and later + + This aggregation pipeline operator is only available + on :atlas:`MongoDB Atlas ` clusters running v4.4.11 and later. For a + detailed list of version availability, see the MongoDB Atlas documentation + on :atlas:`$searchMeta `. + +The following example shows the ``count`` metadata for an Atlas search +aggregation stage: + +.. literalinclude:: /includes/fundamentals/code-snippets/builders/AggregateSearchBuilderExample.java + :start-after: // begin atlasSearchMeta + :end-before: // end atlasSearchMeta + :language: java + :dedent: + +Learn more about this helper from the +`searchMeta() API documentation <{+core-api+}/client/model/Aggregates.html#searchMeta(com.mongodb.client.model.search.SearchCollector)>`__. From a06c1942c63ab8848a460bdd45fb429ee347c19f Mon Sep 17 00:00:00 2001 From: Mike Woofter <108414937+mongoKart@users.noreply.github.com> Date: Thu, 10 Apr 2025 09:46:45 -0500 Subject: [PATCH 02/17] wip --- source/aggregation.txt | 220 +++- .../aggregation/{builders.txt => stages.txt} | 1160 ++++++++--------- 2 files changed, 760 insertions(+), 620 deletions(-) rename source/aggregation/{builders.txt => stages.txt} (82%) diff --git a/source/aggregation.txt b/source/aggregation.txt index 8e794b8a..5517af16 100644 --- a/source/aggregation.txt +++ b/source/aggregation.txt @@ -7,7 +7,7 @@ Aggregation .. facet:: :name: genre :values: reference - + .. meta:: :keywords: code example, transform, pipeline @@ -21,7 +21,8 @@ Aggregation :titlesonly: :maxdepth: 1 - Builders Syntax + Aggregation Stages + Aggregation Operators Overview -------- @@ -96,6 +97,221 @@ performing aggregation operations: - The :manual:`$graphLookup ` stage has a strict memory limit of 100 megabytes and ignores the ``AllowDiskUse`` property. +Stages +------ + +.. list-table:: + :header-rows: 1 + :widths: 20 80 + + * - Stage + - Description + + * - :pipeline:`$bucket` + + - Categorizes incoming documents into groups, called buckets, + based on a specified expression and bucket boundaries. + + * - :pipeline:`$bucketAuto` + + - Categorizes incoming documents into a specific number of + groups, called buckets, based on a specified expression. + Bucket boundaries are automatically determined in an attempt + to evenly distribute the documents into the specified number + of buckets. + + * - :pipeline:`$changeStream` + + - Returns a :ref:`Change Stream ` cursor for the + collection.
This stage can only occur once in an aggregation + pipeline and it must occur as the first stage. + + * - :pipeline:`$changeStreamSplitLargeEvent` + + - .. include:: /includes/changeStreamSplitLargeEvent-introduction.rst + + * - :pipeline:`$count` + + - Returns a count of the number of documents at this stage of + the aggregation pipeline. + + Distinct from the :group:`$count` aggregation accumulator. + + * - :pipeline:`$densify` + + - .. include:: /includes/fact-densify-description.rst + + * - :pipeline:`$documents` + - Returns literal documents from input expressions. + + * - :pipeline:`$facet` + + - Processes multiple :ref:`aggregation pipelines + ` within a single stage on the same set + of input documents. Enables the creation of multi-faceted + aggregations capable of characterizing data across multiple + dimensions, or facets, in a single stage. + + * - :pipeline:`$graphLookup` + + - Performs a recursive search on a collection. To each output + document, adds a new array field that contains the traversal + results of the recursive search for that document. + + * - :pipeline:`$group` + + - Groups input documents by a specified identifier expression + and applies the accumulator expression(s), if specified, to + each group. Consumes all input documents and outputs one + document per each distinct group. The output documents only + contain the identifier field and, if specified, accumulated + fields. + + * - :pipeline:`$limit` + + - Passes the first *n* documents unmodified to the pipeline + where *n* is the specified limit. For each input document, + outputs either one document (for the first *n* documents) or + zero documents (after the first *n* documents). + + * - :pipeline:`$lookup` + + - Performs a left outer join to another collection in the + *same* database to filter in documents from the "joined" + collection for processing. + + * - :pipeline:`$match` + + - Filters the document stream to allow only matching documents + to pass unmodified into the next pipeline stage. + :pipeline:`$match` uses standard MongoDB queries. For each + input document, outputs either one document (a match) or zero + documents (no match). + + * - :pipeline:`$merge` + + - Writes the resulting documents of the aggregation pipeline to + a collection. The stage can incorporate (insert new + documents, merge documents, replace documents, keep existing + documents, fail the operation, process documents with a + custom update pipeline) the results into an output + collection. To use the :pipeline:`$merge` stage, it must be + the last stage in the pipeline. + + * - :pipeline:`$out` + + - Writes the resulting documents of the aggregation pipeline to + a collection. To use the :pipeline:`$out` stage, it must be + the last stage in the pipeline. + + * - :pipeline:`$project` + + - Reshapes each document in the stream, such as by adding new + fields or removing existing fields. For each input document, + outputs one document. + + See also :pipeline:`$unset` for removing existing fields. + + * - :pipeline:`$replaceRoot` + + - Replaces a document with the specified embedded document. The + operation replaces all existing fields in the input document, + including the ``_id`` field. Specify a document embedded in + the input document to promote the embedded document to the + top level. + + * - :pipeline:`$sample` + + - Randomly selects the specified number of documents from its + input. + + * - :pipeline:`$search` + + - Performs a full-text search of the field or fields in an + :atlas:`Atlas ` + collection. 
+ + ``$search`` is only available for MongoDB Atlas clusters, and is not + available for self-managed deployments. To learn more, see + :atlas:`Atlas Search Aggregation Pipeline Stages + `. + + * - :pipeline:`$searchMeta` + + - Returns different types of metadata result documents for the + :atlas:`Atlas Search ` query against an + :atlas:`Atlas ` + collection. + + ``$searchMeta`` is only available for MongoDB Atlas clusters, + and is not available for self-managed deployments. To learn + more, see :atlas:`Atlas Search Aggregation Pipeline Stages + `. + + * - :pipeline:`$set` + + - Adds new fields to documents. Similar to + :pipeline:`$project`, :pipeline:`$set` reshapes each + document in the stream; specifically, by adding new fields to + output documents that contain both the existing fields + from the input documents and the newly added fields. + + :pipeline:`$set` is an alias for :pipeline:`$addFields` stage. + + * - :pipeline:`$setWindowFields` + + - Groups documents into windows and applies one or more + operators to the documents in each window. + + .. versionadded:: 5.0 + + * - :pipeline:`$skip` + + - Skips the first *n* documents where *n* is the specified skip + number and passes the remaining documents unmodified to the + pipeline. For each input document, outputs either zero + documents (for the first *n* documents) or one document (if + after the first *n* documents). + + * - :pipeline:`$sort` + + - Reorders the document stream by a specified sort key. Only + the order changes; the documents remain unmodified. For each + input document, outputs one document. + + * - :pipeline:`$sortByCount` + + - Groups incoming documents based on the value of a specified + expression, then computes the count of documents in each + distinct group. + + * - :pipeline:`$unionWith` + + - Performs a union of two collections; i.e. combines + pipeline results from two collections into a single + result set. + + * - :pipeline:`$unwind` + + - Deconstructs an array field from the input documents to + output a document for *each* element. Each output document + replaces the array with an element value. For each input + document, outputs *n* documents where *n* is the number of + array elements and can be zero for an empty array. + + * - :pipeline:`$vectorSearch` + + - Performs an :abbr:`ANN (Approximate Nearest Neighbor)` or + :abbr:`ENN (Exact Nearest Neighbor)` search on a + vector in the specified field of an + :atlas:`Atlas ` collection. + + ``$vectorSearch`` is only available for MongoDB Atlas clusters + running MongoDB v6.0.11 or higher, and is not available for + self-managed deployments. + + .. versionadded:: 7.0.2 + Aggregation Example ------------------- diff --git a/source/aggregation/builders.txt b/source/aggregation/stages.txt similarity index 82% rename from source/aggregation/builders.txt rename to source/aggregation/stages.txt index 09f8da9c..2c2795df 100644 --- a/source/aggregation/builders.txt +++ b/source/aggregation/stages.txt @@ -1,8 +1,8 @@ -.. _csharp-aggregation-builders: +.. _csharp-aggregation-stages: -========================================= -Builders Syntax for Aggregation Pipelines -========================================= +================== +Aggregation Stages +================== .. facet:: :name: genre @@ -20,65 +20,153 @@ Builders Syntax for Aggregation Pipelines Overview -------- -This page describes the aggregation stages available in the {+driver-short+}. 
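+This page describes the builder methods that you can use to create the
+aggregation pipeline stages available in the {+driver-short+}.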
+Bucket +------ -Match ------ +Use the ``bucket()`` method to create a :manual:`$bucket ` +pipeline stage that automates the bucketing of data around predefined boundary +values. -Use the ``match()`` method to create a :manual:`$match ` -pipeline stage that matches incoming documents against the specified -query filter, filtering out documents that do not match. +The following example creates a pipeline stage that groups incoming documents based +on the value of their ``screenSize`` field, inclusive of the lower boundary +and exclusive of the upper boundary. + +.. literalinclude:: /includes/aggregation/Builders.cs + :start-after: // begin basicBucket + :end-before: // end basicBucket + :language: csharp + :dedent: + +Use the ``BucketOptions`` class to specify a default bucket for values +outside of the specified boundaries, and to specify additional accumulators. + +The following example creates a pipeline stage that groups incoming documents based +on the value of their ``screenSize`` field, counting the number of documents +that fall within each bucket, pushing the value of ``screenSize`` into a +field called ``matches``, and capturing any screen sizes greater than "70" +into a bucket called "monster" for monstrously large screen sizes: .. tip:: - The filter can be an instance of any class that implements ``Bson``, but it's - convenient to combine with use of the :ref:`Filters ` class. + The driver includes the `Accumulators <{+core-api+}/client/model/Accumulators.html>`__ + class with static factory methods for each of the supported accumulators. -The following example creates a pipeline stage that matches all documents where the -``title`` field is equal to "The Shawshank Redemption": +.. literalinclude:: /includes/aggregation/Builders.cs + :start-after: // begin bucketOptions + :end-before: // end bucketOptions + :language: csharp + :dedent: -.. literalinclude:: /includes/fundamentals/code-snippets/builders/AggBuilders.java - :start-after: begin match - :end-before: end match - :language: java +BucketAuto +---------- + +Use the ``bucketAuto()`` method to create a :manual:`$bucketAuto ` +pipeline stage that automatically determines the boundaries of each bucket +in its attempt to distribute the documents evenly into a specified number of buckets. + +The following example creates a pipeline stage that will attempt to create and evenly +distribute documents into *10* buckets using the value of their ``price`` field: + +.. literalinclude:: /includes/aggregation/Builders.cs + :start-after: // begin bucketAutoBasic + :end-before: // end bucketAutoBasic + :language: csharp :dedent: -Project -------- +Use the ``BucketAutoOptions`` class to specify a :wikipedia:`preferred number ` +based scheme to set boundary values, and specify additional accumulators. -Use the ``project()`` method to create a :manual:`$project ` -pipeline stage that project specified document fields. Field projection -in aggregation follows the same rules as :ref:`field projection in queries `. +The following example creates a pipeline stage that will attempt to create and evenly +distribute documents into *10* buckets using the value of their ``price`` field, +setting the bucket boundaries at powers of 2 (2, 4, 8, 16, ...). It also counts +the number of documents in each bucket, and calculates their average ``price`` +in a new field called ``avgPrice``: + +.. 
literalinclude:: /includes/aggregation/Builders.cs + :start-after: // begin bucketAutoOptions + :end-before: // end bucketAutoOptions + :language: csharp + :dedent: + +Count +----- + +Use the ``count()`` method to create a :manual:`$count ` +pipeline stage that counts the number of documents that enter the stage, and assigns +that value to a specified field name. If you do not specify a field, +``count()`` defaults the field name to "count". .. tip:: - Though the projection can be an instance of any class that implements ``Bson``, - it's convenient to combine with use of :ref:`Projections `. + The ``$count`` stage is syntactic sugar for: -The following example creates a pipeline stage that excludes the ``_id`` field but -includes the ``title`` and ``plot`` fields: + .. code-block:: json -.. literalinclude:: /includes/fundamentals/code-snippets/builders/AggBuilders.java - :start-after: begin project - :end-before: end project - :language: java + { "$group":{ "_id": 0, "count": { "$sum" : 1 } } } + +The following example creates a pipeline stage that outputs the count of incoming +documents in a field called "total": + +.. literalinclude:: /includes/aggregation/Builders.cs + :start-after: // begin count + :end-before: // end count + :language: csharp :dedent: -Projecting Computed Fields -~~~~~~~~~~~~~~~~~~~~~~~~~~ +Densify +------- -The ``$project`` stage can project computed fields as well. +Use the ``densify()`` method to create a +:manual:`$densify ` pipeline +stage that generates a sequence of documents to span a specified interval. -The following example creates a pipeline stage that projects the ``rated`` field -into a new field called ``rating``, effectively renaming the field. +.. tip:: -.. literalinclude:: /includes/fundamentals/code-snippets/builders/AggBuilders.java - :start-after: begin computed - :end-before: end computed - :language: java + You can use the ``$densify()`` aggregation stage only when running + MongoDB v5.1 or later. + +Consider the following documents retrieved from the :atlas:`Atlas sample weather dataset ` +that contain measurements for a similar ``position`` field, spaced one hour +apart: + +.. code-block:: none + :copyable: false + + Document{{ _id=5553a..., position=Document{{type=Point, coordinates=[-47.9, 47.6]}}, ts=Mon Mar 05 08:00:00 EST 1984, ... }} + Document{{ _id=5553b..., position=Document{{type=Point, coordinates=[-47.9, 47.6]}}, ts=Mon Mar 05 09:00:00 EST 1984, ... }} + +Suppose you needed to create a pipeline stage that performs the following +actions on these documents: + +- Add a document at every 15-minute interval for which a ``ts`` value does not + already exist. +- Group the documents by the ``position`` field. + +The call to the ``densify()`` aggregation stage builder that accomplishes +these actions resembles the following: + +.. literalinclude:: /includes/fundamentals/code-snippets/builders/AggregateDensify.java + :start-after: // begin densify aggregate + :end-before: // end densify aggregate + :language: csharp :dedent: +The following output highlights the documents generated by the aggregate stage +which contain ``ts`` values every 15 minutes between the existing documents: + +.. code-block:: none + :emphasize-lines: 2-4 + :copyable: false + + Document{{ _id=5553a..., position=Document{{type=Point, coordinates=[-47.9, 47.6]}}, ts=Mon Mar 05 08:00:00 EST 1984, ... 
}} + Document{{ position=Document{{coordinates=[-47.9, 47.6]}}, ts=Mon Mar 05 08:15:00 EST 1984 }} + Document{{ position=Document{{coordinates=[-47.9, 47.6]}}, ts=Mon Mar 05 08:30:00 EST 1984 }} + Document{{ position=Document{{coordinates=[-47.9, 47.6]}}, ts=Mon Mar 05 08:45:00 EST 1984 }} + Document{{ _id=5553b..., position=Document{{type=Point, coordinates=[-47.9, 47.6]}}, ts=Mon Mar 05 09:00:00 EST 1984, ... }} + +See the `densify package API documentation <{+core-api+}/client/model/densify/package-summary.html>`__ +for more information. Documents --------- @@ -95,10 +183,10 @@ pipeline stage that returns literal documents from input values. The following example creates a pipeline stage that creates sample documents with a ``title`` field: -.. literalinclude:: /includes/fundamentals/code-snippets/builders/AggBuilders.java +.. literalinclude:: /includes/aggregation/Builders.cs :start-after: // begin documents :end-before: // end documents - :language: java + :language: csharp :dedent: .. important:: @@ -107,59 +195,90 @@ sample documents with a ``title`` field: you must call the ``aggregate()`` method on a database instead of on a collection. +Facet +----- + +Use the ``facet()`` method to create a :manual:`$facet ` +pipeline stage that allows for the definition of parallel pipelines. -Sample ------- +The following example creates a pipeline stage that executes two parallel aggregations: -Use the ``sample()`` method to create a :manual:`$sample ` -pipeline stage to randomly select documents from input. +- The first aggregation distributes incoming documents into 5 groups according to + their ``attributes.screen_size`` field. -The following example creates a pipeline stage that randomly selects 5 documents: +- The second aggregation counts all *manufacturers* and returns their count, limited + to the top **5**. -.. literalinclude:: /includes/fundamentals/code-snippets/builders/AggBuilders.java - :start-after: // begin sample - :end-before: // end sample - :language: java +.. literalinclude:: /includes/aggregation/Builders.cs + :start-after: // begin facet + :end-before: // end facet + :language: csharp :dedent: +GraphLookup +----------- +Use the ``graphLookup()`` method to create a :manual:`$graphLookup ` +pipeline stage that performs a recursive search on a specified collection to match +a specified field in one document to a specified field of another document. -Sort ----- +The following example computes the social network graph for users in the +``contacts`` collection, recursively matching the value in the ``friends`` field +to the ``name`` field: -Use the ``sort()`` method to create a :manual:`$sort ` -pipeline stage to sort by the specified criteria. +.. literalinclude:: /includes/aggregation/Builders.cs + :start-after: // begin graphLookupBasic + :end-before: // end graphLookupBasic + :language: csharp + :dedent: -.. tip:: +Using ``GraphLookupOptions``, you can specify the depth to recurse as well as +the name of the depth field, if desired. In this example, ``$graphLookup`` will +recurse up to two times, and create a field called ``degrees`` with the +recursion depth information for every document. - Though the sort criteria can be an instance of any class that - implements ``Bson``, it's convenient to combine with use of :ref:`Sorts `. +.. 
literalinclude:: /includes/aggregation/Builders.cs + :start-after: // begin graphLookupDepth + :end-before: // end graphLookupDepth + :language: csharp + :dedent: -The following example creates a pipeline stage that sorts in descending order according -to the value of the ``year`` field and then in ascending order according to the -value of the ``title`` field: +Using ``GraphLookupOptions``, you can specify a filter that documents must match +in order for MongoDB to include them in your search. In this +example, only links with "golf" in their ``hobbies`` field will be included. -.. literalinclude:: /includes/fundamentals/code-snippets/builders/AggBuilders.java - :start-after: // begin sortStage - :end-before: // end sortStage - :language: java +.. literalinclude:: /includes/aggregation/Builders.cs + :start-after: // begin graphLookupMatch + :end-before: // end graphLookupMatch + :language: csharp :dedent: -Skip ----- +Group +----- -Use the ``skip()`` method to create a :manual:`$skip ` -pipeline stage to skip over the specified number of documents before -passing documents into the next stage. +Use the ``group()`` method to create a :manual:`$group ` +pipeline stage to group documents by a specified expression and output a document +for each distinct grouping. -The following example creates a pipeline stage that skips the first ``5`` documents: +.. tip:: -.. literalinclude:: /includes/fundamentals/code-snippets/builders/AggBuilders.java - :start-after: // begin skip - :end-before: // end skip - :language: java + The driver includes the `Accumulators <{+core-api+}/client/model/Accumulators.html>`__ + class with static factory methods for each of the supported accumulators. + +The following example creates a pipeline stage that groups documents by the value +of the ``customerId`` field. Each group accumulates the sum and average +of the values of the ``quantity`` field into the ``totalQuantity`` and +``averageQuantity`` fields. + +.. literalinclude:: /includes/aggregation/Builders.cs + :start-after: // begin group + :end-before: // end group + :language: csharp :dedent: +Learn more about accumulator operators from the Server manual section +on :manual:`Accumulators `. + Limit ----- @@ -168,10 +287,10 @@ to limit the number of documents passed to the next stage. The following example creates a pipeline stage that limits the number of documents to ``10``: -.. literalinclude:: /includes/fundamentals/code-snippets/builders/AggBuilders.java +.. literalinclude:: /includes/aggregation/Builders.cs :start-after: // begin limit :end-before: // end limit - :language: java + :language: csharp :dedent: Lookup @@ -189,10 +308,10 @@ join between the ``movies`` and ``comments`` collections: - It joins the ``_id`` field from ``movies`` to the ``movie_id`` field in ``comments`` - It outputs the results in the ``joined_comments`` field: -.. literalinclude:: /includes/fundamentals/code-snippets/builders/AggBuilders.java +.. literalinclude:: /includes/aggregation/Builders.cs :start-after: // begin basic lookup :end-before: // end basic lookup - :language: java + :language: csharp :dedent: Full Join and Uncorrelated Subqueries @@ -202,378 +321,258 @@ The following example creates a pipeline stage that joins two collections, ``ord and ``warehouses``, by the item and whether the available quantity is enough to fulfill the ordered quantity: -.. literalinclude:: /includes/fundamentals/code-snippets/builders/AggBuilders.java +.. 
literalinclude:: /includes/aggregation/Builders.cs :start-after: // begin advanced lookup :end-before: // end advanced lookup - :language: java + :language: csharp :dedent: -Group +Match ----- -Use the ``group()`` method to create a :manual:`$group ` -pipeline stage to group documents by a specified expression and output a document -for each distinct grouping. +Use the ``match()`` method to create a :manual:`$match ` +pipeline stage that matches incoming documents against the specified +query filter, filtering out documents that do not match. .. tip:: - The driver includes the `Accumulators <{+core-api+}/client/model/Accumulators.html>`__ - class with static factory methods for each of the supported accumulators. + The filter can be an instance of any class that implements ``Bson``, but it's + convenient to combine with use of the :ref:`Filters ` class. -The following example creates a pipeline stage that groups documents by the value -of the ``customerId`` field. Each group accumulates the sum and average -of the values of the ``quantity`` field into the ``totalQuantity`` and -``averageQuantity`` fields. +The following example creates a pipeline stage that matches all documents where the +``title`` field is equal to "The Shawshank Redemption": -.. literalinclude:: /includes/fundamentals/code-snippets/builders/AggBuilders.java - :start-after: // begin group - :end-before: // end group - :language: java +.. literalinclude:: /includes/aggregation/Builders.cs + :start-after: begin match + :end-before: end match + :language: csharp :dedent: -Learn more about accumulator operators from the Server manual section -on :manual:`Accumulators `. +Merge +----- -.. _java_aggregates_pick_n: +Use the ``merge()`` method to create a :manual:`$merge ` +pipeline stage that merges all documents into the specified collection. -Pick-N Accumulators -------------------- +.. important:: -The pick-n accumulators are aggregation accumulation operators that return -the top and bottom elements given a specific ordering. Use one of the -following builders to create an aggregation accumulation operator: - -- :ref:`minN() ` -- :ref:`maxN() ` -- :ref:`firstN() ` -- :ref:`lastN() ` -- :ref:`top() ` -- :ref:`topN() ` -- :ref:`bottom() ` -- :ref:`bottomN() ` - -.. tip:: - - You can only perform aggregation operations with these pick-n accumulators - when running MongoDB v5.2 or later. - -Learn which aggregation pipeline stages you can use accumulator operators with -from the Server manual section on -:manual:`Accumulators `. - -.. _java_aggregates_min_n: - -MinN -~~~~ - -The ``minN()`` builder creates the :manual:`$minN ` -accumulator which returns data from documents that contain the ``n`` lowest -values of a grouping. - -.. tip:: - - The ``$minN`` and ``$bottomN`` accumulators can perform similar tasks. - See - :manual:`Comparison of $minN and $bottomN Accumulators ` - for recommended usage of each. + The ``$merge`` stage must be the last stage in any aggregation pipeline. -The following example demonstrates how to use the ``minN()`` method to return -the lowest three ``imdb.rating`` values for movies, grouped by ``year``: +The following example merges the pipeline into the ``authors`` collection using the default +options: -.. literalinclude:: /includes/fundamentals/code-snippets/builders/AccumulatorsPickN.java - :start-after: // begin minN accumulator - :end-before: // end minN accumulator - :language: java +.. 
literalinclude:: /includes/aggregation/Builders.cs + :start-after: // begin mergeStage + :end-before: // end mergeStage + :language: csharp :dedent: -See the `minN() API documentation <{+core-api+}/client/model/Accumulators.html#minN(java.lang.String,InExpression,NExpression)>`__ -for more information. - -.. _java_aggregates_max_n: - -MaxN -~~~~ - -The ``maxN()`` accumulator returns data from documents that contain the ``n`` -highest values of a grouping. - -The following example demonstrates how to use the ``maxN()`` method to -return the highest two ``imdb.rating`` values for movies, grouped by ``year``: +The following example merges the pipeline into the ``customers`` collection in the +``reporting`` database using some options that specify to replace +the document if both ``date`` and ``customerId`` match, otherwise insert the +document: -.. literalinclude:: /includes/fundamentals/code-snippets/builders/AccumulatorsPickN.java - :start-after: // begin maxN accumulator - :end-before: // end maxN accumulator - :language: java +.. literalinclude:: /includes/aggregation/Builders.cs + :start-after: // begin mergeOptions + :end-before: // end mergeOptions + :language: csharp :dedent: -See the `maxN() API documentation <{+core-api+}/client/model/Accumulators.html#maxN(java.lang.String,InExpression,NExpression)>`__ -for more information. - -.. _java_aggregates_first_n: - -FirstN -~~~~~~ +Out +--- -The ``firstN()`` accumulator returns data from the first ``n`` documents in -each grouping for the specified sort order. +Use the ``out()`` method to create an :manual:`$out ` +pipeline stage that writes all documents to the specified collection in +the same database. -.. tip:: +.. important:: - The ``$firstN`` and ``$topN`` accumulators can perform similar tasks. - See - :manual:`Comparison of $firstN and $topN Accumulators ` - for recommended usage of each. + The ``$out`` stage must be the last stage in any aggregation pipeline. -The following example demonstrates how to use the ``firstN()`` method to -return the first four movie ``title`` values, based on the order they came -into the stage, grouped by ``year``: +The following example writes the results of the pipeline to the ``authors`` +collection: -.. literalinclude:: /includes/fundamentals/code-snippets/builders/AccumulatorsPickN.java - :start-after: // begin firstN accumulator - :end-before: // end firstN accumulator - :language: java +.. literalinclude:: /includes/aggregation/Builders.cs + :start-after: // begin out + :end-before: // end out + :language: csharp :dedent: -See the `firstN() API documentation <{+core-api+}/client/model/Accumulators.html#firstN(java.lang.String,InExpression,NExpression)>`__ -for more information. +Project +------- -.. _java_aggregates_last_n: +Use the ``project()`` method to create a :manual:`$project ` +pipeline stage that project specified document fields. Field projection +in aggregation follows the same rules as :ref:`field projection in queries `. -LastN -~~~~~ +.. tip:: -The ``lastN()`` accumulator returns data from the last ``n`` documents in -each grouping for the specified sort order. + Though the projection can be an instance of any class that implements ``Bson``, + it's convenient to combine with use of :ref:`Projections `. 
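+
+As an illustration, a projection definition built with the ``Builders`` class can
+be passed straight to an aggregation stage. The following sketch assumes a
+``collection`` variable of type ``IMongoCollection<BsonDocument>``; the field names
+come from the sample movies data:
+
+.. code-block:: csharp
+
+   // Requires: using MongoDB.Bson; using MongoDB.Driver;
+   // Build a projection that excludes _id and includes title and plot
+   var projection = Builders<BsonDocument>.Projection
+       .Exclude("_id")
+       .Include("title")
+       .Include("plot");
+
+   // Apply the projection as a $project stage in an aggregation pipeline
+   var results = collection.Aggregate()
+       .Project<BsonDocument>(projection)
+       .ToList();
+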
-The following example demonstrates how to use the ``lastN()`` method to show -the last three movie ``title`` values, based on the the order they came into -the stage, grouped by ``year``: +The following example creates a pipeline stage that excludes the ``_id`` field but +includes the ``title`` and ``plot`` fields: -.. literalinclude:: /includes/fundamentals/code-snippets/builders/AccumulatorsPickN.java - :start-after: // begin lastN accumulator - :end-before: // end lastN accumulator - :language: java +.. literalinclude:: /includes/aggregation/Builders.cs + :start-after: begin project + :end-before: end project + :language: csharp :dedent: -See the `lastN() API documentation <{+core-api+}/client/model/Accumulators.html#lastN(java.lang.String,InExpression,NExpression)>`__ -for more information. - -.. _java_aggregates_top: - -Top -~~~ +Projecting Computed Fields +~~~~~~~~~~~~~~~~~~~~~~~~~~ -The ``top()`` accumulator returns data from the first document in a group -based on the specified sort order. +The ``$project`` stage can project computed fields as well. -The following example demonstrates how to use the ``top()`` method to return -the ``title`` and ``imdb.rating`` values for the top rated movies based on the -``imdb.rating``, grouped by ``year``. +The following example creates a pipeline stage that projects the ``rated`` field +into a new field called ``rating``, effectively renaming the field. -.. literalinclude:: /includes/fundamentals/code-snippets/builders/AccumulatorsPickN.java - :start-after: // begin top accumulator - :end-before: // end top accumulator - :language: java +.. literalinclude:: /includes/aggregation/Builders.cs + :start-after: begin computed + :end-before: end computed + :language: csharp :dedent: -See the `top() API documentation <{+core-api+}/client/model/Accumulators.html#top(java.lang.String,org.bson.conversions.Bson,OutExpression)>`__ -for more information. - -.. _java_aggregates_top_n: - -TopN -~~~~ - -The ``topN()`` accumulator returns data from documents that contain the -highest ``n`` values for the specified field. - -.. tip:: +ReplaceRoot +----------- - The ``$firstN`` and ``$topN`` accumulators can perform similar tasks. - See - :manual:`Comparison of $firstN and $topN Accumulators ` - for recommended usage of each. +Use the ``replaceRoot()`` method to create a :manual:`$replaceRoot ` +pipeline stage that replaces each input document with the specified document. -The following example demonstrates how to use the ``topN()`` method to return -the ``title`` and ``runtime`` values of the three longest movies based on the -``runtime`` values, grouped by ``year``. +The following example replaces each input document with the nested document +in the ``spanish_translation`` field: -.. literalinclude:: /includes/fundamentals/code-snippets/builders/AccumulatorsPickN.java - :start-after: // begin topN accumulator - :end-before: // end topN accumulator - :language: java +.. literalinclude:: /includes/aggregation/Builders.cs + :start-after: // begin replaceRoot + :end-before: // end replaceRoot + :language: csharp :dedent: -See the `topN() API documentation <{+core-api+}/client/model/Accumulators.html#topN(java.lang.String,org.bson.conversions.Bson,OutExpression,NExpression)>`__ -for more information. - -.. _java_aggregates_bottom: - -Bottom -~~~~~~ +Sample +------ -The ``bottom()`` accumulator returns data from the last document in a group -based on the specified sort order. 
+Use the ``sample()`` method to create a :manual:`$sample ` +pipeline stage to randomly select documents from input. -The following example demonstrates how to use the ``bottom()`` method to -return the ``title`` and ``runtime`` values of the shortest movie based on the -``runtime`` value, grouped by ``year``. +The following example creates a pipeline stage that randomly selects 5 documents: -.. literalinclude:: /includes/fundamentals/code-snippets/builders/AccumulatorsPickN.java - :start-after: // begin bottom accumulator - :end-before: // end bottom accumulator - :language: java +.. literalinclude:: /includes/aggregation/Builders.cs + :start-after: // begin sample + :end-before: // end sample + :language: csharp :dedent: -See the `bottom() API documentation <{+core-api+}/client/model/Accumulators.html#bottom(java.lang.String,org.bson.conversions.Bson,OutExpression)>`__ -for more information. - -.. _java_aggregates_bottom_n: - -BottomN -~~~~~~~ +Atlas Full-Text Search +---------------------- -The ``bottomN()`` accumulator returns data from documents that contain the -lowest ``n`` values for the specified field. +Use the ``search()`` method to create a :manual:`$search ` +pipeline stage that specifies a full-text search of one or more fields. -.. tip:: +.. tip:: Only Available on Atlas for MongoDB v4.2 and later - The ``$minN`` and ``$bottomN`` accumulators can perform similar tasks. - See :manual:`Comparison of $minN and $bottomN Accumulators ` - for recommended usage of each. + This aggregation pipeline operator is only available for collections hosted + on :atlas:`MongoDB Atlas ` clusters running v4.2 or later that are + covered by an :atlas:`Atlas search index `. + Learn more about the required setup and the functionality of this operator + from the :ref:`Atlas Search ` documentation. -The following example demonstrates how to use the ``bottomN()`` method to -return the ``title`` and ``imdb.rating`` values of the two lowest rated movies -based on the ``imdb.rating`` value, grouped by ``year``: +The following example creates a pipeline stage that searches the ``title`` +field for text that contains the word "Future": -.. literalinclude:: /includes/fundamentals/code-snippets/builders/AccumulatorsPickN.java - :start-after: // begin bottomN accumulator - :end-before: // end bottomN accumulator - :language: java +.. literalinclude:: /includes/fundamentals/code-snippets/builders/AggregateSearchBuilderExample.java + :start-after: // begin atlasTextSearch + :end-before: // end atlasTextSearch + :language: csharp :dedent: -See the `bottomN() API documentation <{+core-api+}/client/model/Accumulators.html#bottomN(java.lang.String,org.bson.conversions.Bson,OutExpression,NExpression)>`__ -for more information. - -Unwind ------- - -Use the ``unwind()`` method to create an :manual:`$unwind ` -pipeline stage to deconstruct an array field from input documents, creating -an output document for each array element. +Learn more about the builders from the +`search package API documentation <{+core-api+}/client/model/search/package-summary.html>`__. +Atlas Search Metadata +--------------------- -The following example creates a document for each element in the ``sizes`` array: +Use the ``searchMeta()`` method to create a +:manual:`$searchMeta ` +pipeline stage which returns only the metadata part of the results from +Atlas full-text search queries. -.. 
literalinclude:: /includes/fundamentals/code-snippets/builders/AggBuilders.java - :start-after: // begin unwindStage - :end-before: // end unwindStage - :language: java - :dedent: +.. tip:: Only Available on Atlas for MongoDB v4.4.11 and later -To preserve documents that have missing or ``null`` -values for the array field, or where array is empty: + This aggregation pipeline operator is only available + on :atlas:`MongoDB Atlas ` clusters running v4.4.11 and later. For a + detailed list of version availability, see the MongoDB Atlas documentation + on :atlas:`$searchMeta `. +The following example shows the ``count`` metadata for an Atlas search +aggregation stage: -.. literalinclude:: /includes/fundamentals/code-snippets/builders/AggBuilders.java - :start-after: // begin unwindPreserve - :end-before: // end unwindPreserve - :language: java +.. literalinclude:: /includes/fundamentals/code-snippets/builders/AggregateSearchBuilderExample.java + :start-after: // begin atlasSearchMeta + :end-before: // end atlasSearchMeta + :language: csharp :dedent: -To include the array index, in this example in a field called ``"position"``: +Learn more about this helper from the +`searchMeta() API documentation <{+core-api+}/client/model/Aggregates.html#searchMeta(com.mongodb.client.model.search.SearchCollector)>`__. -.. literalinclude:: /includes/fundamentals/code-snippets/builders/AggBuilders.java - :start-after: // begin unwindIndex - :end-before: // end unwindIndex - :language: java - :dedent: +.. _builders-aggregates-setWindowFields: -Out ---- +SetWindowFields +--------------- -Use the ``out()`` method to create an :manual:`$out ` -pipeline stage that writes all documents to the specified collection in -the same database. +Use the ``setWindowFields()`` method to create a :manual:`$setWindowFields ` +pipeline stage that allows using window operators to perform operations +on a specified span of documents in a collection. -.. important:: +.. tip:: Window Functions - The ``$out`` stage must be the last stage in any aggregation pipeline. + The driver includes the `Windows <{+core-api+}/client/model/Windows.html>`__ + class with static factory methods for building windowed computations. -The following example writes the results of the pipeline to the ``authors`` -collection: +The following example creates a pipeline stage that computes the +accumulated rainfall and the average temperature over the past month for +each locality from more fine-grained measurements presented in the ``rainfall`` +and ``temperature`` fields: -.. literalinclude:: /includes/fundamentals/code-snippets/builders/AggBuilders.java - :start-after: // begin out - :end-before: // end out - :language: java +.. literalinclude:: /includes/aggregation/Builders.cs + :start-after: // begin setWindowFields + :end-before: // end setWindowFields + :language: csharp :dedent: -Merge ------ - -Use the ``merge()`` method to create a :manual:`$merge ` -pipeline stage that merges all documents into the specified collection. - -.. important:: - - The ``$merge`` stage must be the last stage in any aggregation pipeline. - -The following example merges the pipeline into the ``authors`` collection using the default -options: +Skip +---- -.. 
literalinclude:: /includes/fundamentals/code-snippets/builders/AggBuilders.java - :start-after: // begin mergeStage - :end-before: // end mergeStage - :language: java - :dedent: +Use the ``skip()`` method to create a :manual:`$skip ` +pipeline stage to skip over the specified number of documents before +passing documents into the next stage. -The following example merges the pipeline into the ``customers`` collection in the -``reporting`` database using some options that specify to replace -the document if both ``date`` and ``customerId`` match, otherwise insert the -document: +The following example creates a pipeline stage that skips the first ``5`` documents: -.. literalinclude:: /includes/fundamentals/code-snippets/builders/AggBuilders.java - :start-after: // begin mergeOptions - :end-before: // end mergeOptions - :language: java +.. literalinclude:: /includes/aggregation/Builders.cs + :start-after: // begin skip + :end-before: // end skip + :language: csharp :dedent: -GraphLookup ------------ - -Use the ``graphLookup()`` method to create a :manual:`$graphLookup ` -pipeline stage that performs a recursive search on a specified collection to match -a specified field in one document to a specified field of another document. - -The following example computes the social network graph for users in the -``contacts`` collection, recursively matching the value in the ``friends`` field -to the ``name`` field: +Sort +---- -.. literalinclude:: /includes/fundamentals/code-snippets/builders/AggBuilders.java - :start-after: // begin graphLookupBasic - :end-before: // end graphLookupBasic - :language: java - :dedent: +Use the ``sort()`` method to create a :manual:`$sort ` +pipeline stage to sort by the specified criteria. -Using ``GraphLookupOptions``, you can specify the depth to recurse as well as -the name of the depth field, if desired. In this example, ``$graphLookup`` will -recurse up to two times, and create a field called ``degrees`` with the -recursion depth information for every document. +.. tip:: -.. literalinclude:: /includes/fundamentals/code-snippets/builders/AggBuilders.java - :start-after: // begin graphLookupDepth - :end-before: // end graphLookupDepth - :language: java - :dedent: + Though the sort criteria can be an instance of any class that + implements ``Bson``, it's convenient to combine with use of :ref:`Sorts `. -Using ``GraphLookupOptions``, you can specify a filter that documents must match -in order for MongoDB to include them in your search. In this -example, only links with "golf" in their ``hobbies`` field will be included. +The following example creates a pipeline stage that sorts in descending order according +to the value of the ``year`` field and then in ascending order according to the +value of the ``title`` field: -.. literalinclude:: /includes/fundamentals/code-snippets/builders/AggBuilders.java - :start-after: // begin graphLookupMatch - :end-before: // end graphLookupMatch - :language: java +.. literalinclude:: /includes/aggregation/Builders.cs + :start-after: // begin sortStage + :end-before: // end sortStage + :language: csharp :dedent: SortByCount @@ -598,347 +597,272 @@ these groups by count in descending order. The following example groups documents by the truncated value of the field ``x`` and computes the count for each distinct value: -.. literalinclude:: /includes/fundamentals/code-snippets/builders/AggBuilders.java +.. 
literalinclude:: /includes/aggregation/Builders.cs :start-after: // begin sortByCount :end-before: // end sortByCount - :language: java + :language: csharp :dedent: -ReplaceRoot ------------ - -Use the ``replaceRoot()`` method to create a :manual:`$replaceRoot ` -pipeline stage that replaces each input document with the specified document. - -The following example replaces each input document with the nested document -in the ``spanish_translation`` field: - -.. literalinclude:: /includes/fundamentals/code-snippets/builders/AggBuilders.java - :start-after: // begin replaceRoot - :end-before: // end replaceRoot - :language: java - :dedent: -AddFields ---------- -Use the ``addFields()`` method to create an :manual:`$addFields ` -pipeline stage that adds new fields to documents. -.. tip:: +Unwind +------ - Use ``$addFields`` when you do not want to project field inclusion - or exclusion. +Use the ``unwind()`` method to create an :manual:`$unwind ` +pipeline stage to deconstruct an array field from input documents, creating +an output document for each array element. -The following example adds two new fields, ``a`` and ``b`` to the input documents: +The following example creates a document for each element in the ``sizes`` array: -.. literalinclude:: /includes/fundamentals/code-snippets/builders/AggBuilders.java - :start-after: // begin addFields - :end-before: // end addFields - :language: java +.. literalinclude:: /includes/aggregation/Builders.cs + :start-after: // begin unwindStage + :end-before: // end unwindStage + :language: csharp :dedent: -Count ------ - -Use the ``count()`` method to create a :manual:`$count ` -pipeline stage that counts the number of documents that enter the stage, and assigns -that value to a specified field name. If you do not specify a field, -``count()`` defaults the field name to "count". - -.. tip:: - - The ``$count`` stage is syntactic sugar for: +To preserve documents that have missing or ``null`` +values for the array field, or where array is empty: - .. code-block:: json - { "$group":{ "_id": 0, "count": { "$sum" : 1 } } } - -The following example creates a pipeline stage that outputs the count of incoming -documents in a field called "total": - -.. literalinclude:: /includes/fundamentals/code-snippets/builders/AggBuilders.java - :start-after: // begin count - :end-before: // end count - :language: java +.. literalinclude:: /includes/aggregation/Builders.cs + :start-after: // begin unwindPreserve + :end-before: // end unwindPreserve + :language: csharp :dedent: +To include the array index, in this example in a field called ``"position"``: -Bucket ------- - -Use the ``bucket()`` method to create a :manual:`$bucket ` -pipeline stage that automates the bucketing of data around predefined boundary -values. - -The following example creates a pipeline stage that groups incoming documents based -on the value of their ``screenSize`` field, inclusive of the lower boundary -and exclusive of the upper boundary. - -.. literalinclude:: /includes/fundamentals/code-snippets/builders/AggBuilders.java - :start-after: // begin basicBucket - :end-before: // end basicBucket - :language: java +.. literalinclude:: /includes/aggregation/Builders.cs + :start-after: // begin unwindIndex + :end-before: // end unwindIndex + :language: csharp :dedent: -Use the ``BucketOptions`` class to specify a default bucket for values -outside of the specified boundaries, and to specify additional accumulators. 
-The following example creates a pipeline stage that groups incoming documents based -on the value of their ``screenSize`` field, counting the number of documents -that fall within each bucket, pushing the value of ``screenSize`` into a -field called ``matches``, and capturing any screen sizes greater than "70" -into a bucket called "monster" for monstrously large screen sizes: -.. tip:: - The driver includes the `Accumulators <{+core-api+}/client/model/Accumulators.html>`__ - class with static factory methods for each of the supported accumulators. - -.. literalinclude:: /includes/fundamentals/code-snippets/builders/AggBuilders.java - :start-after: // begin bucketOptions - :end-before: // end bucketOptions - :language: java - :dedent: - -BucketAuto ----------- -Use the ``bucketAuto()`` method to create a :manual:`$bucketAuto ` -pipeline stage that automatically determines the boundaries of each bucket -in its attempt to distribute the documents evenly into a specified number of buckets. - -The following example creates a pipeline stage that will attempt to create and evenly -distribute documents into *10* buckets using the value of their ``price`` field: -.. literalinclude:: /includes/fundamentals/code-snippets/builders/AggBuilders.java - :start-after: // begin bucketAutoBasic - :end-before: // end bucketAutoBasic - :language: java - :dedent: +.. _java_aggregates_pick_n: -Use the ``BucketAutoOptions`` class to specify a :wikipedia:`preferred number ` -based scheme to set boundary values, and specify additional accumulators. +Pick-N Accumulators +------------------- -The following example creates a pipeline stage that will attempt to create and evenly -distribute documents into *10* buckets using the value of their ``price`` field, -setting the bucket boundaries at powers of 2 (2, 4, 8, 16, ...). It also counts -the number of documents in each bucket, and calculates their average ``price`` -in a new field called ``avgPrice``: +The pick-n accumulators are aggregation accumulation operators that return +the top and bottom elements given a specific ordering. Use one of the +following builders to create an aggregation accumulation operator: + +- :ref:`minN() ` +- :ref:`maxN() ` +- :ref:`firstN() ` +- :ref:`lastN() ` +- :ref:`top() ` +- :ref:`topN() ` +- :ref:`bottom() ` +- :ref:`bottomN() ` .. tip:: - The driver includes the `Accumulators <{+core-api+}/client/model/Accumulators.html>`__ - class with static factory methods for each of the supported accumulators. + You can only perform aggregation operations with these pick-n accumulators + when running MongoDB v5.2 or later. -.. literalinclude:: /includes/fundamentals/code-snippets/builders/AggBuilders.java - :start-after: // begin bucketAutoOptions - :end-before: // end bucketAutoOptions - :language: java - :dedent: +Learn which aggregation pipeline stages you can use accumulator operators with +from the Server manual section on +:manual:`Accumulators `. -Facet ------ +.. _java_aggregates_min_n: -Use the ``facet()`` method to create a :manual:`$facet ` -pipeline stage that allows for the definition of parallel pipelines. +MinN +~~~~ -The following example creates a pipeline stage that executes two parallel aggregations: +The ``minN()`` builder creates the :manual:`$minN ` +accumulator which returns data from documents that contain the ``n`` lowest +values of a grouping. -- The first aggregation distributes incoming documents into 5 groups according to - their ``attributes.screen_size`` field. +.. 
tip:: -- The second aggregation counts all *manufacturers* and returns their count, limited - to the top **5**. + The ``$minN`` and ``$bottomN`` accumulators can perform similar tasks. + See + :manual:`Comparison of $minN and $bottomN Accumulators ` + for recommended usage of each. -.. literalinclude:: /includes/fundamentals/code-snippets/builders/AggBuilders.java - :start-after: // begin facet - :end-before: // end facet - :language: java - :dedent: +The following example demonstrates how to use the ``minN()`` method to return +the lowest three ``imdb.rating`` values for movies, grouped by ``year``: -.. _builders-aggregates-setWindowFields: +.. literalinclude:: /includes/fundamentals/code-snippets/builders/AccumulatorsPickN.java + :start-after: // begin minN accumulator + :end-before: // end minN accumulator + :language: csharp + :dedent: -SetWindowFields ---------------- +See the `minN() API documentation <{+core-api+}/client/model/Accumulators.html#minN(java.lang.String,InExpression,NExpression)>`__ +for more information. -Use the ``setWindowFields()`` method to create a :manual:`$setWindowFields ` -pipeline stage that allows using window operators to perform operations -on a specified span of documents in a collection. +.. _java_aggregates_max_n: -.. tip:: Window Functions +MaxN +~~~~ - The driver includes the `Windows <{+core-api+}/client/model/Windows.html>`__ - class with static factory methods for building windowed computations. +The ``maxN()`` accumulator returns data from documents that contain the ``n`` +highest values of a grouping. -The following example creates a pipeline stage that computes the -accumulated rainfall and the average temperature over the past month for -each locality from more fine-grained measurements presented in the ``rainfall`` -and ``temperature`` fields: +The following example demonstrates how to use the ``maxN()`` method to +return the highest two ``imdb.rating`` values for movies, grouped by ``year``: -.. literalinclude:: /includes/fundamentals/code-snippets/builders/AggBuilders.java - :start-after: // begin setWindowFields - :end-before: // end setWindowFields - :language: java +.. literalinclude:: /includes/fundamentals/code-snippets/builders/AccumulatorsPickN.java + :start-after: // begin maxN accumulator + :end-before: // end maxN accumulator + :language: csharp :dedent: -Densify -------- +See the `maxN() API documentation <{+core-api+}/client/model/Accumulators.html#maxN(java.lang.String,InExpression,NExpression)>`__ +for more information. -Use the ``densify()`` method to create a -:manual:`$densify ` pipeline -stage that generates a sequence of documents to span a specified interval. +.. _java_aggregates_first_n: -.. tip:: +FirstN +~~~~~~ - You can use the ``$densify()`` aggregation stage only when running - MongoDB v5.1 or later. +The ``firstN()`` accumulator returns data from the first ``n`` documents in +each grouping for the specified sort order. -Consider the following documents retrieved from the :atlas:`Atlas sample weather dataset ` -that contain measurements for a similar ``position`` field, spaced one hour -apart: +.. tip:: -.. code-block:: none - :copyable: false + The ``$firstN`` and ``$topN`` accumulators can perform similar tasks. + See + :manual:`Comparison of $firstN and $topN Accumulators ` + for recommended usage of each. - Document{{ _id=5553a..., position=Document{{type=Point, coordinates=[-47.9, 47.6]}}, ts=Mon Mar 05 08:00:00 EST 1984, ... 
}} - Document{{ _id=5553b..., position=Document{{type=Point, coordinates=[-47.9, 47.6]}}, ts=Mon Mar 05 09:00:00 EST 1984, ... }} +The following example demonstrates how to use the ``firstN()`` method to +return the first four movie ``title`` values, based on the order they came +into the stage, grouped by ``year``: -Suppose you needed to create a pipeline stage that performs the following -actions on these documents: +.. literalinclude:: /includes/fundamentals/code-snippets/builders/AccumulatorsPickN.java + :start-after: // begin firstN accumulator + :end-before: // end firstN accumulator + :language: csharp + :dedent: -- Add a document at every 15-minute interval for which a ``ts`` value does not - already exist. -- Group the documents by the ``position`` field. +See the `firstN() API documentation <{+core-api+}/client/model/Accumulators.html#firstN(java.lang.String,InExpression,NExpression)>`__ +for more information. -The call to the ``densify()`` aggregation stage builder that accomplishes -these actions resembles the following: +.. _java_aggregates_last_n: -.. literalinclude:: /includes/fundamentals/code-snippets/builders/AggregateDensify.java - :start-after: // begin densify aggregate - :end-before: // end densify aggregate - :language: java - :dedent: +LastN +~~~~~ -The following output highlights the documents generated by the aggregate stage -which contain ``ts`` values every 15 minutes between the existing documents: +The ``lastN()`` accumulator returns data from the last ``n`` documents in +each grouping for the specified sort order. -.. code-block:: none - :emphasize-lines: 2-4 - :copyable: false +The following example demonstrates how to use the ``lastN()`` method to show +the last three movie ``title`` values, based on the the order they came into +the stage, grouped by ``year``: - Document{{ _id=5553a..., position=Document{{type=Point, coordinates=[-47.9, 47.6]}}, ts=Mon Mar 05 08:00:00 EST 1984, ... }} - Document{{ position=Document{{coordinates=[-47.9, 47.6]}}, ts=Mon Mar 05 08:15:00 EST 1984 }} - Document{{ position=Document{{coordinates=[-47.9, 47.6]}}, ts=Mon Mar 05 08:30:00 EST 1984 }} - Document{{ position=Document{{coordinates=[-47.9, 47.6]}}, ts=Mon Mar 05 08:45:00 EST 1984 }} - Document{{ _id=5553b..., position=Document{{type=Point, coordinates=[-47.9, 47.6]}}, ts=Mon Mar 05 09:00:00 EST 1984, ... }} +.. literalinclude:: /includes/fundamentals/code-snippets/builders/AccumulatorsPickN.java + :start-after: // begin lastN accumulator + :end-before: // end lastN accumulator + :language: csharp + :dedent: -See the `densify package API documentation <{+core-api+}/client/model/densify/package-summary.html>`__ +See the `lastN() API documentation <{+core-api+}/client/model/Accumulators.html#lastN(java.lang.String,InExpression,NExpression)>`__ for more information. -Fill ----- - -Use the ``fill()`` method to create a -`$fill `__ -pipeline stage that populates ``null`` and missing field values. +.. _java_aggregates_top: -.. tip:: +Top +~~~ - You can use the ``$fill()`` aggregation stage only when running - MongoDB v5.3 or later. +The ``top()`` accumulator returns data from the first document in a group +based on the specified sort order. -Consider the following documents that contain temperature and air pressure -measurements at an hourly interval: +The following example demonstrates how to use the ``top()`` method to return +the ``title`` and ``imdb.rating`` values for the top rated movies based on the +``imdb.rating``, grouped by ``year``. -.. 
code-block:: none - :copyable: false +.. literalinclude:: /includes/fundamentals/code-snippets/builders/AccumulatorsPickN.java + :start-after: // begin top accumulator + :end-before: // end top accumulator + :language: csharp + :dedent: - Document{{_id=6308a..., hour=1, temperature=23C, air_pressure=29.74}} - Document{{_id=6308b..., hour=2, temperature=23.5C}} - Document{{_id=6308c..., hour=3, temperature=null, air_pressure=29.76}} +See the `top() API documentation <{+core-api+}/client/model/Accumulators.html#top(java.lang.String,org.bson.conversions.Bson,OutExpression)>`__ +for more information. -Suppose you needed to populate missing temperature and air pressure -data points in the documents as follows: +.. _java_aggregates_top_n: -- Populate the ``air_pressure`` field for hour "2" using linear interpolation - to calculate the value. -- Set the missing ``temperature`` value to "23.6C" for hour "3". +TopN +~~~~ -The call to the ``fill()`` aggregation stage builder that accomplishes -these actions resembles the following: +The ``topN()`` accumulator returns data from documents that contain the +highest ``n`` values for the specified field. -.. literalinclude:: /includes/fundamentals/code-snippets/builders/AggregateFill.java - :start-after: // begin fill aggregate - :end-before: // end fill aggregate - :language: java - :dedent: +.. tip:: -The following output highlights the documents that contain fields populated -by the aggregate stage: + The ``$firstN`` and ``$topN`` accumulators can perform similar tasks. + See + :manual:`Comparison of $firstN and $topN Accumulators ` + for recommended usage of each. -.. code-block:: none - :emphasize-lines: 2,3 - :copyable: false +The following example demonstrates how to use the ``topN()`` method to return +the ``title`` and ``runtime`` values of the three longest movies based on the +``runtime`` values, grouped by ``year``. - Document{{_id=6308a..., hour=1, temperature=23C, air_pressure=29.74}} - Document{{_id=6308b..., hour=2, temperature=23.5C, air_pressure=29.75}} - Document{{_id=6308c..., hour=3, temperature=23.6C, air_pressure=29.76}} +.. literalinclude:: /includes/fundamentals/code-snippets/builders/AccumulatorsPickN.java + :start-after: // begin topN accumulator + :end-before: // end topN accumulator + :language: csharp + :dedent: -See the `fill package API documentation <{+core-api+}/client/model/fill/package-summary.html>`__ +See the `topN() API documentation <{+core-api+}/client/model/Accumulators.html#topN(java.lang.String,org.bson.conversions.Bson,OutExpression,NExpression)>`__ for more information. -Atlas Full-Text Search ----------------------- - -Use the ``search()`` method to create a :manual:`$search ` -pipeline stage that specifies a full-text search of one or more fields. +.. _java_aggregates_bottom: -.. tip:: Only Available on Atlas for MongoDB v4.2 and later +Bottom +~~~~~~ - This aggregation pipeline operator is only available for collections hosted - on :atlas:`MongoDB Atlas ` clusters running v4.2 or later that are - covered by an :atlas:`Atlas search index `. - Learn more about the required setup and the functionality of this operator - from the :ref:`Atlas Search ` documentation. +The ``bottom()`` accumulator returns data from the last document in a group +based on the specified sort order. 
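+
+Because the accumulator builders referenced in this section come from the Java
+driver API, the following {+driver-short+} sketch shows one way to build an
+equivalent stage as a raw ``BsonDocument``. The ``shortestMovie`` output field
+name is illustrative only:
+
+.. code-block:: csharp
+
+   // Requires: using MongoDB.Bson;
+   // A $group stage that uses the $bottom accumulator to capture the title and
+   // runtime of the shortest movie in each year (the descending sort puts the
+   // shortest runtime at the bottom)
+   var groupStage = new BsonDocument("$group", new BsonDocument
+   {
+       { "_id", "$year" },
+       { "shortestMovie", new BsonDocument("$bottom", new BsonDocument
+           {
+               { "sortBy", new BsonDocument("runtime", -1) },
+               { "output", new BsonArray { "$title", "$runtime" } }
+           })
+       }
+   });
+
+You can append a stage like this to a pipeline by calling ``AppendStage()`` on the
+fluent object returned by ``IMongoCollection.Aggregate()``.
+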
-The following example creates a pipeline stage that searches the ``title`` -field for text that contains the word "Future": +The following example demonstrates how to use the ``bottom()`` method to +return the ``title`` and ``runtime`` values of the shortest movie based on the +``runtime`` value, grouped by ``year``. -.. literalinclude:: /includes/fundamentals/code-snippets/builders/AggregateSearchBuilderExample.java - :start-after: // begin atlasTextSearch - :end-before: // end atlasTextSearch - :language: java +.. literalinclude:: /includes/fundamentals/code-snippets/builders/AccumulatorsPickN.java + :start-after: // begin bottom accumulator + :end-before: // end bottom accumulator + :language: csharp :dedent: -Learn more about the builders from the -`search package API documentation <{+core-api+}/client/model/search/package-summary.html>`__. +See the `bottom() API documentation <{+core-api+}/client/model/Accumulators.html#bottom(java.lang.String,org.bson.conversions.Bson,OutExpression)>`__ +for more information. -Atlas Search Metadata ---------------------- +.. _java_aggregates_bottom_n: -Use the ``searchMeta()`` method to create a -:manual:`$searchMeta ` -pipeline stage which returns only the metadata part of the results from -Atlas full-text search queries. +BottomN +~~~~~~~ -.. tip:: Only Available on Atlas for MongoDB v4.4.11 and later +The ``bottomN()`` accumulator returns data from documents that contain the +lowest ``n`` values for the specified field. - This aggregation pipeline operator is only available - on :atlas:`MongoDB Atlas ` clusters running v4.4.11 and later. For a - detailed list of version availability, see the MongoDB Atlas documentation - on :atlas:`$searchMeta `. +.. tip:: -The following example shows the ``count`` metadata for an Atlas search -aggregation stage: + The ``$minN`` and ``$bottomN`` accumulators can perform similar tasks. + See :manual:`Comparison of $minN and $bottomN Accumulators ` + for recommended usage of each. -.. literalinclude:: /includes/fundamentals/code-snippets/builders/AggregateSearchBuilderExample.java - :start-after: // begin atlasSearchMeta - :end-before: // end atlasSearchMeta - :language: java +The following example demonstrates how to use the ``bottomN()`` method to +return the ``title`` and ``imdb.rating`` values of the two lowest rated movies +based on the ``imdb.rating`` value, grouped by ``year``: + +.. literalinclude:: /includes/fundamentals/code-snippets/builders/AccumulatorsPickN.java + :start-after: // begin bottomN accumulator + :end-before: // end bottomN accumulator + :language: csharp :dedent: -Learn more about this helper from the -`searchMeta() API documentation <{+core-api+}/client/model/Aggregates.html#searchMeta(com.mongodb.client.model.search.SearchCollector)>`__. +See the `bottomN() API documentation <{+core-api+}/client/model/Accumulators.html#bottomN(java.lang.String,org.bson.conversions.Bson,OutExpression,NExpression)>`__ +for more information. 
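+
+As with ``bottom()``, you can sketch an equivalent stage in the {+driver-short+} as
+a raw ``BsonDocument``. The ``lowestRated`` field name and the ``moviesCollection``
+variable in the following sketch are assumptions for illustration:
+
+.. code-block:: csharp
+
+   // Requires: using MongoDB.Bson; using MongoDB.Driver;
+   // A $group stage that uses $bottomN to keep the two lowest-rated movies
+   // (by imdb.rating) for each year
+   var groupStage = new BsonDocument("$group", new BsonDocument
+   {
+       { "_id", "$year" },
+       { "lowestRated", new BsonDocument("$bottomN", new BsonDocument
+           {
+               { "n", 2 },
+               { "sortBy", new BsonDocument("imdb.rating", -1) },
+               { "output", new BsonArray { "$title", "$imdb.rating" } }
+           })
+       }
+   });
+
+   // Append the stage to a pipeline and run it
+   var results = moviesCollection.Aggregate()
+       .AppendStage<BsonDocument>(groupStage)
+       .ToList();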
\ No newline at end of file From e410530855fb6939d0f2414c822d5cb5cfceb9e2 Mon Sep 17 00:00:00 2001 From: Mike Woofter <108414937+mongoKart@users.noreply.github.com> Date: Thu, 10 Apr 2025 12:12:14 -0500 Subject: [PATCH 03/17] wip --- source/aggregation.txt | 356 +++---- source/aggregation/bucket.txt | 60 ++ source/aggregation/bucketAuto.txt | 51 + source/aggregation/changeStream.txt | 18 + .../changeStreamSplitLargeEvent copy.txt | 18 + source/aggregation/count.txt | 44 + source/aggregation/densify.txt | 73 ++ source/aggregation/documents.txt | 46 + source/aggregation/facet.txt | 39 + source/aggregation/graphLookup.txt | 57 ++ source/aggregation/group.txt | 45 + source/aggregation/limit.txt | 33 + source/aggregation/lookup.txt | 53 ++ source/aggregation/match.txt | 39 + source/aggregation/merge.txt | 49 + source/aggregation/out.txt | 38 + source/aggregation/project.txt | 55 ++ source/aggregation/rankFusion.txt | 18 + source/aggregation/replaceRoot.txt | 34 + source/aggregation/replaceWith.txt | 18 + source/aggregation/sample.txt | 33 + source/aggregation/search.txt | 45 + source/aggregation/searchMeta.txt | 46 + source/aggregation/set.txt | 18 + source/aggregation/setWindowFields.txt | 44 + source/aggregation/skip.txt | 34 + source/aggregation/sort.txt | 40 + source/aggregation/sortByCount.txt | 47 + source/aggregation/stages.txt | 868 ------------------ source/aggregation/unionWith.txt | 18 + source/aggregation/unwind.txt | 52 ++ source/aggregation/vectorSearch.txt | 18 + 32 files changed, 1371 insertions(+), 1036 deletions(-) create mode 100644 source/aggregation/bucket.txt create mode 100644 source/aggregation/bucketAuto.txt create mode 100644 source/aggregation/changeStream.txt create mode 100644 source/aggregation/changeStreamSplitLargeEvent copy.txt create mode 100644 source/aggregation/count.txt create mode 100644 source/aggregation/densify.txt create mode 100644 source/aggregation/documents.txt create mode 100644 source/aggregation/facet.txt create mode 100644 source/aggregation/graphLookup.txt create mode 100644 source/aggregation/group.txt create mode 100644 source/aggregation/limit.txt create mode 100644 source/aggregation/lookup.txt create mode 100644 source/aggregation/match.txt create mode 100644 source/aggregation/merge.txt create mode 100644 source/aggregation/out.txt create mode 100644 source/aggregation/project.txt create mode 100644 source/aggregation/rankFusion.txt create mode 100644 source/aggregation/replaceRoot.txt create mode 100644 source/aggregation/replaceWith.txt create mode 100644 source/aggregation/sample.txt create mode 100644 source/aggregation/search.txt create mode 100644 source/aggregation/searchMeta.txt create mode 100644 source/aggregation/set.txt create mode 100644 source/aggregation/setWindowFields.txt create mode 100644 source/aggregation/skip.txt create mode 100644 source/aggregation/sort.txt create mode 100644 source/aggregation/sortByCount.txt delete mode 100644 source/aggregation/stages.txt create mode 100644 source/aggregation/unionWith.txt create mode 100644 source/aggregation/unwind.txt create mode 100644 source/aggregation/vectorSearch.txt diff --git a/source/aggregation.txt b/source/aggregation.txt index 5517af16..a27a8ac1 100644 --- a/source/aggregation.txt +++ b/source/aggregation.txt @@ -1,15 +1,15 @@ .. _csharp-aggregation: -=========== -Aggregation -=========== +====================== +Aggregation Operations +====================== .. facet:: :name: genre :values: reference .. 
meta:: - :keywords: code example, transform, pipeline + :keywords: dotnet, code example, transform, pipeline .. contents:: On this page :local: @@ -97,8 +97,118 @@ performing aggregation operations: - The :manual:`$graphLookup ` stage has a strict memory limit of 100 megabytes and ignores the ``AllowDiskUse`` property. -Stages ------- +Syntax Options +-------------- + +This section describes the approaches that you can use to create an aggregation pipeline. + +Builders +~~~~~~~~ + + + + +To perform an aggregation, pass a list of aggregation stages to the +``IMongoCollection.Aggregate()`` method. + +.. note:: + + This example uses the ``sample_restaurants.restaurants`` collection + from the :atlas:`Atlas sample datasets `. To learn how to create a + free MongoDB Atlas cluster and load the sample datasets, see :ref:`csharp-get-started`. + +The following code example produces a count of the number of bakeries in each borough +of New York City. To do so, it uses an aggregation pipeline that contains the following stages: + +- A :manual:`$match ` stage to filter for documents whose + ``cuisine`` field contains the value ``"Bakery"``. + +- A :manual:`$group ` stage to group the matching + documents by the ``borough`` field, accumulating a count of documents for each distinct value + of that field. + +The following sections implement this example by using LINQ, Builders, and BsonDocument +approaches to create and combine the aggregation stages used in the example pipeline. + +LINQ Approach +~~~~~~~~~~~~~ + +.. io-code-block:: + + .. input:: /includes/fundamentals/code-examples/LinqAggregation.cs + :language: csharp + :dedent: + :start-after: begin-aggregation + :end-before: end-aggregation + + .. output:: + :language: console + :visible: false + + { _id = Bronx, Count = 71 } + { _id = Brooklyn, Count = 173 } + { _id = Staten Island, Count = 20 } + { _id = Missing, Count = 2 } + { _id = Manhattan, Count = 221 } + { _id = Queens, Count = 204 } + +To learn more about using LINQ to construct aggregation pipelines, see the +:ref:`csharp-linq` guide. + +Builders Approach +~~~~~~~~~~~~~~~~~ + +.. io-code-block:: + + .. input:: /includes/fundamentals/code-examples/BuilderAggregation.cs + :language: csharp + :dedent: + :start-after: begin-aggregation + :end-before: end-aggregation + + .. output:: + :language: console + :visible: false + + { _id = Bronx, Count = 71 } + { _id = Brooklyn, Count = 173 } + { _id = Staten Island, Count = 20 } + { _id = Missing, Count = 2 } + { _id = Manhattan, Count = 221 } + { _id = Queens, Count = 204 } + +To learn more about using builders to construct aggregation pipelines, +see the :ref:`csharp-builders-aggregation` section of the Operations with Builders guide. + +BsonDocument Approach +~~~~~~~~~~~~~~~~~~~~~ + +.. io-code-block:: + + .. input:: /includes/fundamentals/code-examples/Aggregation.cs + :language: csharp + :dedent: + :start-after: begin-aggregation + :end-before: end-aggregation + + .. output:: + :language: console + :visible: false + + { "_id" : "Brooklyn", "count" : 173 } + { "_id" : "Manhattan", "count" : 221 } + { "_id" : "Bronx", "count" : 71 } + { "_id" : "Missing", "count" : 2 } + { "_id" : "Staten Island", "count" : 20 } + { "_id" : "Queens", "count" : 204 } + + +Aggregation Stage Methods +------------------------- + +The following table lists the builder methods in the {+driver-short+} that correspond +to stages in the aggregation pipeline. For more information about a method, click the +method name. .. 
list-table:: :header-rows: 1 @@ -107,12 +217,12 @@ Stages * - Stage - Description - * - :pipeline:`$bucket` + * - :ref:`Bucket() ` - Categorizes incoming documents into groups, called buckets, based on a specified expression and bucket boundaries. - * - :pipeline:`$bucketAuto` + * - :ref:`BucketAuto() ` - Categorizes incoming documents into a specific number of groups, called buckets, based on a specified expression. @@ -120,99 +230,99 @@ Stages to evenly distribute the documents into the specified number of buckets. - * - :pipeline:`$changeStream` + * - :ref:`ChangeStream() ` - - Returns a :ref:`Change Stream ` cursor for the - collection. This stage can only occur once in an aggregation + - Returns a change stream cursor for the + collection. This stage can occur only once in an aggregation pipeline and it must occur as the first stage. - * - :pipeline:`$changeStreamSplitLargeEvent` + * - :ref:`ChangeStreamSplitLargeEvent() ` + + - Splits large change stream events that exceed 16 MB into smaller fragments returned + in a change stream cursor. - - .. include:: /includes/changeStreamSplitLargeEvent-introduction.rst + You can use $changeStreamSplitLargeEvent only in a $changeStream pipeline, and + it must be the final stage in the pipeline. - * - :pipeline:`$count` + * - :ref:`Count() ` - Returns a count of the number of documents at this stage of the aggregation pipeline. - Distinct from the :group:`$count` aggregation accumulator. + * - :ref:`Densify() ` - * - :pipeline:`$densify` + - Creates new documents in a sequence of documents where certain values in a field are missing. - - .. include:: /includes/fact-densify-description.rst - - * - :pipeline:`$documents` + * - :ref:`Documents() ` + - Returns literal documents from input expressions. - * - :pipeline:`$facet` + * - :ref:`Facet() ` - - Processes multiple :ref:`aggregation pipelines - ` within a single stage on the same set + - Processes multiple aggregation pipelines + within a single stage on the same set of input documents. Enables the creation of multi-faceted aggregations capable of characterizing data across multiple dimensions, or facets, in a single stage. - * - :pipeline:`$graphLookup` + * - :ref:`GraphLookup() ` - Performs a recursive search on a collection. To each output document, adds a new array field that contains the traversal results of the recursive search for that document. - * - :pipeline:`$group` + * - :ref:`Group() ` - Groups input documents by a specified identifier expression - and applies the accumulator expression(s), if specified, to + and applies the accumulator expressions, if specified, to each group. Consumes all input documents and outputs one - document per each distinct group. The output documents only - contain the identifier field and, if specified, accumulated + document per each distinct group. The output documents + contain only the identifier field and, if specified, accumulated fields. - * - :pipeline:`$limit` + * - :ref:`Limit() ` - - Passes the first *n* documents unmodified to the pipeline + - Passes the first *n* documents unmodified to the pipeline, where *n* is the specified limit. For each input document, outputs either one document (for the first *n* documents) or zero documents (after the first *n* documents). - * - :pipeline:`$lookup` + * - :ref:`Lookup() ` - Performs a left outer join to another collection in the *same* database to filter in documents from the "joined" collection for processing. 
- * - :pipeline:`$match` + * - :ref:`Match() ` - Filters the document stream to allow only matching documents to pass unmodified into the next pipeline stage. - :pipeline:`$match` uses standard MongoDB queries. For each - input document, outputs either one document (a match) or zero + For each input document, outputs either one document (a match) or zero documents (no match). - * - :pipeline:`$merge` + * - :ref:`Merge() ` - Writes the resulting documents of the aggregation pipeline to a collection. The stage can incorporate (insert new documents, merge documents, replace documents, keep existing documents, fail the operation, process documents with a custom update pipeline) the results into an output - collection. To use the :pipeline:`$merge` stage, it must be + collection. To use this stage, it must be the last stage in the pipeline. - * - :pipeline:`$out` + * - :ref:`Out() ` - Writes the resulting documents of the aggregation pipeline to - a collection. To use the :pipeline:`$out` stage, it must be + a collection. To use this stage, it must be the last stage in the pipeline. - * - :pipeline:`$project` + * - :ref:`Project() ` - Reshapes each document in the stream, such as by adding new fields or removing existing fields. For each input document, outputs one document. - See also :pipeline:`$unset` for removing existing fields. - - * - :pipeline:`$replaceRoot` + * - :ref:`ReplaceRoot() ` - Replaces a document with the specified embedded document. The operation replaces all existing fields in the input document, @@ -220,195 +330,105 @@ Stages the input document to promote the embedded document to the top level. - * - :pipeline:`$sample` + The ``$replaceWith`` stage is an alias for the ``$replaceRoot`` stage. + + * - :ref:`ReplaceWith() ` + + - Replaces a document with the specified embedded document. + The operation replaces all existing fields in the input document, including + the ``_id`` field. Specify a document embedded in the input document to promote + the embedded document to the top level. + + The ``$replaceWith`` stage is an alias for the ``$replaceRoot`` stage. + + * - :ref:`Sample() ` - Randomly selects the specified number of documents from its input. - * - :pipeline:`$search` + * - :ref:`Search() ` - Performs a full-text search of the field or fields in an :atlas:`Atlas ` collection. - ``$search`` is only available for MongoDB Atlas clusters, and is not + This stage is available only for MongoDB Atlas clusters, and is not available for self-managed deployments. To learn more, see :atlas:`Atlas Search Aggregation Pipeline Stages - `. + ` in the Atlas documentation. - * - :pipeline:`$searchMeta` + * - :ref:`SearchMeta() ` - Returns different types of metadata result documents for the :atlas:`Atlas Search ` query against an :atlas:`Atlas ` collection. - ``$searchMeta`` is only available for MongoDB Atlas clusters, + This stage is available only for MongoDB Atlas clusters, and is not available for self-managed deployments. To learn more, see :atlas:`Atlas Search Aggregation Pipeline Stages - `. + ` in the Atlas documentation. - * - :pipeline:`$set` + * - :ref:`Set() ` - - Adds new fields to documents. Similar to - :pipeline:`$project`, :pipeline:`$set` reshapes each - document in the stream; specifically, by adding new fields to + - Adds new fields to documents. Like the ``Project()`` method, + this method reshapes each + document in the stream by adding new fields to output documents that contain both the existing fields from the input documents and the newly added fields. 
- :pipeline:`$set` is an alias for :pipeline:`$addFields` stage. - - * - :pipeline:`$setWindowFields` + * - :ref:`SetWindowFields() ` - Groups documents into windows and applies one or more operators to the documents in each window. .. versionadded:: 5.0 - * - :pipeline:`$skip` + * - :ref:`Skip() ` - - Skips the first *n* documents where *n* is the specified skip - number and passes the remaining documents unmodified to the + - Skips the first *n* documents, where *n* is the specified skip + number, and passes the remaining documents unmodified to the pipeline. For each input document, outputs either zero documents (for the first *n* documents) or one document (if after the first *n* documents). - * - :pipeline:`$sort` + * - :ref:`Sort() ` - - Reorders the document stream by a specified sort key. Only - the order changes; the documents remain unmodified. For each - input document, outputs one document. + - Reorders the document stream by a specified sort key. The documents remain unmodified. + For each input document, outputs one document. - * - :pipeline:`$sortByCount` + * - :ref:`SortByCount() ` - Groups incoming documents based on the value of a specified expression, then computes the count of documents in each distinct group. - * - :pipeline:`$unionWith` + * - :ref:`UnionWith() ` - - Performs a union of two collections; i.e. combines - pipeline results from two collections into a single + - Combines pipeline results from two collections into a single result set. - * - :pipeline:`$unwind` + * - :ref:`Unwind() ` - Deconstructs an array field from the input documents to output a document for *each* element. Each output document replaces the array with an element value. For each input - document, outputs *n* documents where *n* is the number of - array elements and can be zero for an empty array. + document, outputs *n* Documents, where *n* is the number of + array elements. *n* can be zero for an empty array. - * - :pipeline:`$vectorSearch` + * - :ref:`VectorSearch() ` - Performs an :abbr:`ANN (Approximate Nearest Neighbor)` or :abbr:`ENN (Exact Nearest Neighbor)` search on a vector in the specified field of an :atlas:`Atlas ` collection. - ``$vectorSearch`` is only available for MongoDB Atlas clusters + ``$vectorSearch`` is available only for MongoDB Atlas clusters running MongoDB v6.0.11 or higher, and is not available for self-managed deployments. .. versionadded:: 7.0.2 -Aggregation Example -------------------- - -To perform an aggregation, pass a list of aggregation stages to the -``IMongoCollection.Aggregate()`` method. - -.. note:: - - This example uses the ``sample_restaurants.restaurants`` collection - from the :atlas:`Atlas sample datasets `. To learn how to create a - free MongoDB Atlas cluster and load the sample datasets, see :ref:`csharp-get-started`. - -The following code example produces a count of the number of bakeries in each borough -of New York City. To do so, it uses an aggregation pipeline that contains the following stages: - -- A :manual:`$match ` stage to filter for documents whose - ``cuisine`` field contains the value ``"Bakery"``. - -- A :manual:`$group ` stage to group the matching - documents by the ``borough`` field, accumulating a count of documents for each distinct value - of that field. - -The following sections implement this example by using LINQ, Builders, and BsonDocument -approaches to create and combine the aggregation stages used in the example pipeline. - -LINQ Approach -~~~~~~~~~~~~~ - -.. io-code-block:: - - .. 
input:: /includes/fundamentals/code-examples/LinqAggregation.cs - :language: csharp - :dedent: - :start-after: begin-aggregation - :end-before: end-aggregation - - .. output:: - :language: console - :visible: false - - { _id = Bronx, Count = 71 } - { _id = Brooklyn, Count = 173 } - { _id = Staten Island, Count = 20 } - { _id = Missing, Count = 2 } - { _id = Manhattan, Count = 221 } - { _id = Queens, Count = 204 } - -To learn more about using LINQ to construct aggregation pipelines, see the -:ref:`csharp-linq` guide. - -Builders Approach -~~~~~~~~~~~~~~~~~ - -.. io-code-block:: - - .. input:: /includes/fundamentals/code-examples/BuilderAggregation.cs - :language: csharp - :dedent: - :start-after: begin-aggregation - :end-before: end-aggregation - - .. output:: - :language: console - :visible: false - - { _id = Bronx, Count = 71 } - { _id = Brooklyn, Count = 173 } - { _id = Staten Island, Count = 20 } - { _id = Missing, Count = 2 } - { _id = Manhattan, Count = 221 } - { _id = Queens, Count = 204 } - -To learn more about using builders to construct aggregation pipelines, -see the :ref:`csharp-builders-aggregation` section of the Operations with Builders guide. - -BsonDocument Approach -~~~~~~~~~~~~~~~~~~~~~ - -.. io-code-block:: - - .. input:: /includes/fundamentals/code-examples/Aggregation.cs - :language: csharp - :dedent: - :start-after: begin-aggregation - :end-before: end-aggregation - - .. output:: - :language: console - :visible: false - - { "_id" : "Brooklyn", "count" : 173 } - { "_id" : "Manhattan", "count" : 221 } - { "_id" : "Bronx", "count" : 71 } - { "_id" : "Missing", "count" : 2 } - { "_id" : "Staten Island", "count" : 20 } - { "_id" : "Queens", "count" : 204 } - Additional Information ---------------------- diff --git a/source/aggregation/bucket.txt b/source/aggregation/bucket.txt new file mode 100644 index 00000000..69bc0e2d --- /dev/null +++ b/source/aggregation/bucket.txt @@ -0,0 +1,60 @@ +.. _csharp-aggregation-bucket: + +====== +Bucket +====== + +.. facet:: + :name: genre + :values: reference + +.. meta:: + :keywords: dotnet, code example, transform, pipeline, group + +.. contents:: On this page + :local: + :backlinks: none + :depth: 2 + :class: singlecol + +Overview +-------- + +Use the ``bucket()`` method to create a :manual:`$bucket ` +pipeline stage that automates the bucketing of data around predefined boundary +values. + +Example +------- + +The following example creates a pipeline stage that groups incoming documents based +on the value of their ``screenSize`` field, inclusive of the lower boundary +and exclusive of the upper boundary. + +.. literalinclude:: /includes/aggregation/Builders.cs + :start-after: // begin basicBucket + :end-before: // end basicBucket + :language: csharp + :dedent: + +Use the ``BucketOptions`` class to specify a default bucket for values +outside of the specified boundaries, and to specify additional accumulators. + +The following example creates a pipeline stage that groups incoming documents based +on the value of their ``screenSize`` field, counting the number of documents +that fall within each bucket, pushing the value of ``screenSize`` into a +field called ``matches``, and capturing any screen sizes greater than "70" +into a bucket called "monster" for monstrously large screen sizes: + +.. 
literalinclude:: /includes/aggregation/Builders.cs + :start-after: // begin bucketOptions + :end-before: // end bucketOptions + :language: csharp + :dedent: + +API Documentation +----------------- + +To learn more about the methods and types used on this page, see the following +API documentation: + diff --git a/source/aggregation/bucketAuto.txt b/source/aggregation/bucketAuto.txt new file mode 100644 index 00000000..ac5fe169 --- /dev/null +++ b/source/aggregation/bucketAuto.txt @@ -0,0 +1,51 @@ +.. _csharp-aggregation-bucketauto: + +========== +BucketAuto +========== + +.. facet:: + :name: genre + :values: reference + +.. meta:: + :keywords: code example, transform, pipeline + +.. contents:: On this page + :local: + :backlinks: none + :depth: 2 + :class: singlecol + + +BucketAuto +---------- + +Use the ``bucketAuto()`` method to create a :manual:`$bucketAuto ` +pipeline stage that automatically determines the boundaries of each bucket +in its attempt to distribute the documents evenly into a specified number of buckets. + +The following example creates a pipeline stage that will attempt to create and evenly +distribute documents into *10* buckets using the value of their ``price`` field: + +.. literalinclude:: /includes/aggregation/Builders.cs + :start-after: // begin bucketAutoBasic + :end-before: // end bucketAutoBasic + :language: csharp + :dedent: + + +Use the ``BucketAutoOptions`` class to specify a :wikipedia:`preferred number ` +based scheme to set boundary values, and specify additional accumulators. + +The following example creates a pipeline stage that will attempt to create and evenly +distribute documents into *10* buckets using the value of their ``price`` field, +setting the bucket boundaries at powers of 2 (2, 4, 8, 16, ...). It also counts +the number of documents in each bucket, and calculates their average ``price`` +in a new field called ``avgPrice``: + +.. literalinclude:: /includes/aggregation/Builders.cs + :start-after: // begin bucketAutoOptions + :end-before: // end bucketAutoOptions + :language: csharp + :dedent: \ No newline at end of file diff --git a/source/aggregation/changeStream.txt b/source/aggregation/changeStream.txt new file mode 100644 index 00000000..85105f7d --- /dev/null +++ b/source/aggregation/changeStream.txt @@ -0,0 +1,18 @@ +.. _csharp-aggregation-changestream: + +============ +ChangeStream +============ + +.. facet:: + :name: genre + :values: reference + +.. meta:: + :keywords: code example, transform, pipeline + +.. contents:: On this page + :local: + :backlinks: none + :depth: 2 + :class: singlecol \ No newline at end of file diff --git a/source/aggregation/changeStreamSplitLargeEvent copy.txt b/source/aggregation/changeStreamSplitLargeEvent copy.txt new file mode 100644 index 00000000..528bca02 --- /dev/null +++ b/source/aggregation/changeStreamSplitLargeEvent copy.txt @@ -0,0 +1,18 @@ +.. _csharp-aggregation-changestreamsplitlargeevent: + +=========================== +changeStreamSplitLargeEvent +=========================== + +.. facet:: + :name: genre + :values: reference + +.. meta:: + :keywords: code example, transform, pipeline + +.. contents:: On this page + :local: + :backlinks: none + :depth: 2 + :class: singlecol \ No newline at end of file diff --git a/source/aggregation/count.txt b/source/aggregation/count.txt new file mode 100644 index 00000000..1fbf6113 --- /dev/null +++ b/source/aggregation/count.txt @@ -0,0 +1,44 @@ +.. _csharp-aggregation-count: + +===== +Count +===== + +.. facet:: + :name: genre + :values: reference + +.. 
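As a minimal, illustrative sketch of the ``bucketAuto()`` usage that the BucketAuto examples above reference, the fluent form can look like the following; the ``Product`` class, its ``Price`` property, and the ``collection`` variable are hypothetical:

.. code-block:: csharp

   // Assumes: using MongoDB.Driver; and an IMongoCollection<Product> named collection,
   // where Product is a hypothetical POCO with a decimal Price property.
   var results = collection.Aggregate()
       .BucketAuto(p => p.Price, 10)   // distribute documents into 10 evenly filled buckets
       .ToList();

   // Each output document contains an _id holding the computed minimum and maximum
   // boundary values for the bucket, plus a count of the documents that fell into it.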
meta:: + :keywords: code example, transform, pipeline + +.. contents:: On this page + :local: + :backlinks: none + :depth: 2 + :class: singlecol + + +Count +----- + +Use the ``count()`` method to create a :manual:`$count ` +pipeline stage that counts the number of documents that enter the stage, and assigns +that value to a specified field name. If you do not specify a field, +``count()`` defaults the field name to "count". + +.. tip:: + + The ``$count`` stage is syntactic sugar for: + + .. code-block:: json + + { "$group":{ "_id": 0, "count": { "$sum" : 1 } } } + +The following example creates a pipeline stage that outputs the count of incoming +documents in a field called "total": + +.. literalinclude:: /includes/aggregation/Builders.cs + :start-after: // begin count + :end-before: // end count + :language: csharp + :dedent: \ No newline at end of file diff --git a/source/aggregation/densify.txt b/source/aggregation/densify.txt new file mode 100644 index 00000000..6fc31b2f --- /dev/null +++ b/source/aggregation/densify.txt @@ -0,0 +1,73 @@ +.. _csharp-aggregation-densify: + +======= +Densify +======= + +.. facet:: + :name: genre + :values: reference + +.. meta:: + :keywords: code example, transform, pipeline + +.. contents:: On this page + :local: + :backlinks: none + :depth: 2 + :class: singlecol + + +Densify +------- + +Use the ``densify()`` method to create a +:manual:`$densify ` pipeline +stage that generates a sequence of documents to span a specified interval. + +.. tip:: + + You can use the ``$densify()`` aggregation stage only when running + MongoDB v5.1 or later. + +Consider the following documents retrieved from the :atlas:`Atlas sample weather dataset ` +that contain measurements for a similar ``position`` field, spaced one hour +apart: + +.. code-block:: none + :copyable: false + + Document{{ _id=5553a..., position=Document{{type=Point, coordinates=[-47.9, 47.6]}}, ts=Mon Mar 05 08:00:00 EST 1984, ... }} + Document{{ _id=5553b..., position=Document{{type=Point, coordinates=[-47.9, 47.6]}}, ts=Mon Mar 05 09:00:00 EST 1984, ... }} + +Suppose you needed to create a pipeline stage that performs the following +actions on these documents: + +- Add a document at every 15-minute interval for which a ``ts`` value does not + already exist. +- Group the documents by the ``position`` field. + +The call to the ``densify()`` aggregation stage builder that accomplishes +these actions resembles the following: + +.. literalinclude:: /includes/fundamentals/code-snippets/builders/AggregateDensify.java + :start-after: // begin densify aggregate + :end-before: // end densify aggregate + :language: csharp + :dedent: + +The following output highlights the documents generated by the aggregate stage +which contain ``ts`` values every 15 minutes between the existing documents: + +.. code-block:: none + :emphasize-lines: 2-4 + :copyable: false + + Document{{ _id=5553a..., position=Document{{type=Point, coordinates=[-47.9, 47.6]}}, ts=Mon Mar 05 08:00:00 EST 1984, ... }} + Document{{ position=Document{{coordinates=[-47.9, 47.6]}}, ts=Mon Mar 05 08:15:00 EST 1984 }} + Document{{ position=Document{{coordinates=[-47.9, 47.6]}}, ts=Mon Mar 05 08:30:00 EST 1984 }} + Document{{ position=Document{{coordinates=[-47.9, 47.6]}}, ts=Mon Mar 05 08:45:00 EST 1984 }} + Document{{ _id=5553b..., position=Document{{type=Point, coordinates=[-47.9, 47.6]}}, ts=Mon Mar 05 09:00:00 EST 1984, ... 
}} + +See the `densify package API documentation <{+core-api+}/client/model/densify/package-summary.html>`__ +for more information. diff --git a/source/aggregation/documents.txt b/source/aggregation/documents.txt new file mode 100644 index 00000000..6c491ff4 --- /dev/null +++ b/source/aggregation/documents.txt @@ -0,0 +1,46 @@ +.. _csharp-aggregation-documents: + +========= +Documents +========= + +.. facet:: + :name: genre + :values: reference + +.. meta:: + :keywords: code example, transform, pipeline + +.. contents:: On this page + :local: + :backlinks: none + :depth: 2 + :class: singlecol + + +Documents +--------- + +Use the ``documents()`` method to create a +:manual:`$documents ` +pipeline stage that returns literal documents from input values. + +.. important:: + + If you use a ``$documents`` stage in an aggregation pipeline, it must be the first + stage in the pipeline. + +The following example creates a pipeline stage that creates +sample documents with a ``title`` field: + +.. literalinclude:: /includes/aggregation/Builders.cs + :start-after: // begin documents + :end-before: // end documents + :language: csharp + :dedent: + +.. important:: + + If you use the ``documents()`` method to provide the input to an aggregation pipeline, + you must call the ``aggregate()`` method on a database instead of on a + collection. \ No newline at end of file diff --git a/source/aggregation/facet.txt b/source/aggregation/facet.txt new file mode 100644 index 00000000..3e711a28 --- /dev/null +++ b/source/aggregation/facet.txt @@ -0,0 +1,39 @@ +.. _csharp-aggregation-facet: + +===== +Facet +===== + +.. facet:: + :name: genre + :values: reference + +.. meta:: + :keywords: code example, transform, pipeline + +.. contents:: On this page + :local: + :backlinks: none + :depth: 2 + :class: singlecol + + +Facet +----- + +Use the ``facet()`` method to create a :manual:`$facet ` +pipeline stage that allows for the definition of parallel pipelines. + +The following example creates a pipeline stage that executes two parallel aggregations: + +- The first aggregation distributes incoming documents into 5 groups according to + their ``attributes.screen_size`` field. + +- The second aggregation counts all *manufacturers* and returns their count, limited + to the top **5**. + +.. literalinclude:: /includes/aggregation/Builders.cs + :start-after: // begin facet + :end-before: // end facet + :language: csharp + :dedent: \ No newline at end of file diff --git a/source/aggregation/graphLookup.txt b/source/aggregation/graphLookup.txt new file mode 100644 index 00000000..d69e1780 --- /dev/null +++ b/source/aggregation/graphLookup.txt @@ -0,0 +1,57 @@ +.. _csharp-aggregation-graphlookup: + +=========== +GraphLookup +=========== + +.. facet:: + :name: genre + :values: reference + +.. meta:: + :keywords: code example, transform, pipeline + +.. contents:: On this page + :local: + :backlinks: none + :depth: 2 + :class: singlecol + + +GraphLookup +----------- + +Use the ``graphLookup()`` method to create a :manual:`$graphLookup ` +pipeline stage that performs a recursive search on a specified collection to match +a specified field in one document to a specified field of another document. + +The following example computes the social network graph for users in the +``contacts`` collection, recursively matching the value in the ``friends`` field +to the ``name`` field: + +.. 
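The actual snippet lives in the include file referenced below. As an illustrative sketch of the same idea, expressed with a raw ``BsonDocument`` stage rather than a typed builder, the recursive lookup can look roughly like this; the ``contacts`` variable and the ``socialNetwork`` output field name are assumptions:

.. code-block:: csharp

   // Assumes: using MongoDB.Bson; using MongoDB.Driver;
   // and an IMongoCollection<BsonDocument> named contacts for the contacts collection.
   var graphLookupStage = new BsonDocument("$graphLookup", new BsonDocument
   {
       { "from", "contacts" },             // collection to search recursively
       { "startWith", "$friends" },        // expression that seeds the search
       { "connectFromField", "friends" },  // field to follow on each recursion
       { "connectToField", "name" },       // field to match against
       { "as", "socialNetwork" }           // output array field
   });

   var results = contacts.Aggregate()
       .AppendStage<BsonDocument>(graphLookupStage)
       .ToList();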
literalinclude:: /includes/aggregation/Builders.cs + :start-after: // begin graphLookupBasic + :end-before: // end graphLookupBasic + :language: csharp + :dedent: + +Using ``GraphLookupOptions``, you can specify the depth to recurse as well as +the name of the depth field, if desired. In this example, ``$graphLookup`` will +recurse up to two times, and create a field called ``degrees`` with the +recursion depth information for every document. + +.. literalinclude:: /includes/aggregation/Builders.cs + :start-after: // begin graphLookupDepth + :end-before: // end graphLookupDepth + :language: csharp + :dedent: + +Using ``GraphLookupOptions``, you can specify a filter that documents must match +in order for MongoDB to include them in your search. In this +example, only links with "golf" in their ``hobbies`` field will be included. + +.. literalinclude:: /includes/aggregation/Builders.cs + :start-after: // begin graphLookupMatch + :end-before: // end graphLookupMatch + :language: csharp + :dedent: diff --git a/source/aggregation/group.txt b/source/aggregation/group.txt new file mode 100644 index 00000000..399803ee --- /dev/null +++ b/source/aggregation/group.txt @@ -0,0 +1,45 @@ +.. _csharp-aggregation-group: + +===== +Group +===== + +.. facet:: + :name: genre + :values: reference + +.. meta:: + :keywords: code example, transform, pipeline + +.. contents:: On this page + :local: + :backlinks: none + :depth: 2 + :class: singlecol + + +Group +----- + +Use the ``group()`` method to create a :manual:`$group ` +pipeline stage to group documents by a specified expression and output a document +for each distinct grouping. + +.. tip:: + + The driver includes the `Accumulators <{+core-api+}/client/model/Accumulators.html>`__ + class with static factory methods for each of the supported accumulators. + +The following example creates a pipeline stage that groups documents by the value +of the ``customerId`` field. Each group accumulates the sum and average +of the values of the ``quantity`` field into the ``totalQuantity`` and +``averageQuantity`` fields. + +.. literalinclude:: /includes/aggregation/Builders.cs + :start-after: // begin group + :end-before: // end group + :language: csharp + :dedent: + +Learn more about accumulator operators from the Server manual section +on :manual:`Accumulators `. \ No newline at end of file diff --git a/source/aggregation/limit.txt b/source/aggregation/limit.txt new file mode 100644 index 00000000..ecc86ecd --- /dev/null +++ b/source/aggregation/limit.txt @@ -0,0 +1,33 @@ +.. _csharp-aggregation-limit: + +===== +Limit +===== + +.. facet:: + :name: genre + :values: reference + +.. meta:: + :keywords: code example, transform, pipeline + +.. contents:: On this page + :local: + :backlinks: none + :depth: 2 + :class: singlecol + + +Limit +----- + +Use the :manual:`$limit ` pipeline stage +to limit the number of documents passed to the next stage. + +The following example creates a pipeline stage that limits the number of documents to ``10``: + +.. literalinclude:: /includes/aggregation/Builders.cs + :start-after: // begin limit + :end-before: // end limit + :language: csharp + :dedent: \ No newline at end of file diff --git a/source/aggregation/lookup.txt b/source/aggregation/lookup.txt new file mode 100644 index 00000000..6db7b6e4 --- /dev/null +++ b/source/aggregation/lookup.txt @@ -0,0 +1,53 @@ +.. _csharp-aggregation-lookup: + +====== +Lookup +====== + +.. facet:: + :name: genre + :values: reference + +.. meta:: + :keywords: code example, transform, pipeline + +.. 
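For a free-standing sketch of the ``$group`` stage that the Group example above describes, one possible form uses a raw ``BsonDocument`` group projection; the ``orders`` collection variable is an assumption, while the ``customerId`` and ``quantity`` field names follow the description above:

.. code-block:: csharp

   // Assumes: using MongoDB.Bson; using MongoDB.Driver;
   // and an IMongoCollection<BsonDocument> named orders.
   var results = orders.Aggregate()
       .Group<BsonDocument>(new BsonDocument
       {
           { "_id", "$customerId" },                                    // group key
           { "totalQuantity", new BsonDocument("$sum", "$quantity") },  // accumulated sum
           { "averageQuantity", new BsonDocument("$avg", "$quantity") } // accumulated average
       })
       .ToList();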
contents:: On this page + :local: + :backlinks: none + :depth: 2 + :class: singlecol + + +Lookup +------ + +Use the ``lookup()`` method to create a :manual:`$lookup ` +pipeline stage to perform joins and uncorrelated subqueries between two collections. + +Left Outer Join +~~~~~~~~~~~~~~~ + +The following example creates a pipeline stage that performs a left outer +join between the ``movies`` and ``comments`` collections: + +- It joins the ``_id`` field from ``movies`` to the ``movie_id`` field in ``comments`` +- It outputs the results in the ``joined_comments`` field: + +.. literalinclude:: /includes/aggregation/Builders.cs + :start-after: // begin basic lookup + :end-before: // end basic lookup + :language: csharp + :dedent: + +Full Join and Uncorrelated Subqueries +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ + +The following example creates a pipeline stage that joins two collections, ``orders`` +and ``warehouses``, by the item and whether the available quantity is enough +to fulfill the ordered quantity: + +.. literalinclude:: /includes/aggregation/Builders.cs + :start-after: // begin advanced lookup + :end-before: // end advanced lookup + :language: csharp + :dedent: diff --git a/source/aggregation/match.txt b/source/aggregation/match.txt new file mode 100644 index 00000000..64f6a516 --- /dev/null +++ b/source/aggregation/match.txt @@ -0,0 +1,39 @@ +.. _csharp-aggregation-match: + +===== +Match +===== + +.. facet:: + :name: genre + :values: reference + +.. meta:: + :keywords: code example, transform, pipeline + +.. contents:: On this page + :local: + :backlinks: none + :depth: 2 + :class: singlecol + +Match +----- + +Use the ``match()`` method to create a :manual:`$match ` +pipeline stage that matches incoming documents against the specified +query filter, filtering out documents that do not match. + +.. tip:: + + The filter can be an instance of any class that implements ``Bson``, but it's + convenient to combine with use of the :ref:`Filters ` class. + +The following example creates a pipeline stage that matches all documents where the +``title`` field is equal to "The Shawshank Redemption": + +.. literalinclude:: /includes/aggregation/Builders.cs + :start-after: begin match + :end-before: end match + :language: csharp + :dedent: diff --git a/source/aggregation/merge.txt b/source/aggregation/merge.txt new file mode 100644 index 00000000..90a54393 --- /dev/null +++ b/source/aggregation/merge.txt @@ -0,0 +1,49 @@ +.. _csharp-aggregation-merge: + +===== +Merge +===== + +.. facet:: + :name: genre + :values: reference + +.. meta:: + :keywords: code example, transform, pipeline + +.. contents:: On this page + :local: + :backlinks: none + :depth: 2 + :class: singlecol + + +Merge +----- + +Use the ``merge()`` method to create a :manual:`$merge ` +pipeline stage that merges all documents into the specified collection. + +.. important:: + + The ``$merge`` stage must be the last stage in any aggregation pipeline. + +The following example merges the pipeline into the ``authors`` collection using the default +options: + +.. literalinclude:: /includes/aggregation/Builders.cs + :start-after: // begin mergeStage + :end-before: // end mergeStage + :language: csharp + :dedent: + +The following example merges the pipeline into the ``customers`` collection in the +``reporting`` database using some options that specify to replace +the document if both ``date`` and ``customerId`` match, otherwise insert the +document: + +.. 
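The include file referenced below contains the actual snippet. As a rough sketch only, the same kind of ``$merge`` stage can be expressed as a raw ``BsonDocument``; the source ``collection`` variable is an assumption, while the ``reporting.customers`` target and the ``date``/``customerId`` match fields follow the description above:

.. code-block:: csharp

   // Assumes: using MongoDB.Bson; using MongoDB.Driver;
   // and an IMongoCollection<BsonDocument> named collection as the pipeline source.
   var mergeStage = new BsonDocument("$merge", new BsonDocument
   {
       { "into", new BsonDocument { { "db", "reporting" }, { "coll", "customers" } } },
       { "on", new BsonArray { "date", "customerId" } }, // fields that must match
       { "whenMatched", "replace" },                     // replace the existing document
       { "whenNotMatched", "insert" }                    // otherwise insert the document
   });

   collection.Aggregate()
       .AppendStage<BsonDocument>(mergeStage)
       .ToList(); // $merge returns no documents; enumerating the result runs the pipeline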
literalinclude:: /includes/aggregation/Builders.cs + :start-after: // begin mergeOptions + :end-before: // end mergeOptions + :language: csharp + :dedent: diff --git a/source/aggregation/out.txt b/source/aggregation/out.txt new file mode 100644 index 00000000..c12c03c9 --- /dev/null +++ b/source/aggregation/out.txt @@ -0,0 +1,38 @@ +.. _csharp-aggregation-out: + +=== +Out +=== + +.. facet:: + :name: genre + :values: reference + +.. meta:: + :keywords: code example, transform, pipeline + +.. contents:: On this page + :local: + :backlinks: none + :depth: 2 + :class: singlecol + +Out +--- + +Use the ``out()`` method to create an :manual:`$out ` +pipeline stage that writes all documents to the specified collection in +the same database. + +.. important:: + + The ``$out`` stage must be the last stage in any aggregation pipeline. + +The following example writes the results of the pipeline to the ``authors`` +collection: + +.. literalinclude:: /includes/aggregation/Builders.cs + :start-after: // begin out + :end-before: // end out + :language: csharp + :dedent: \ No newline at end of file diff --git a/source/aggregation/project.txt b/source/aggregation/project.txt new file mode 100644 index 00000000..0c47b837 --- /dev/null +++ b/source/aggregation/project.txt @@ -0,0 +1,55 @@ +.. _csharp-aggregation-project: + +======= +Project +======= + +.. facet:: + :name: genre + :values: reference + +.. meta:: + :keywords: code example, transform, pipeline + +.. contents:: On this page + :local: + :backlinks: none + :depth: 2 + :class: singlecol + + +Project +------- + +Use the ``project()`` method to create a :manual:`$project ` +pipeline stage that project specified document fields. Field projection +in aggregation follows the same rules as :ref:`field projection in queries `. + +.. tip:: + + Though the projection can be an instance of any class that implements ``Bson``, + it's convenient to combine with use of :ref:`Projections `. + +The following example creates a pipeline stage that excludes the ``_id`` field but +includes the ``title`` and ``plot`` fields: + +.. literalinclude:: /includes/aggregation/Builders.cs + :start-after: begin project + :end-before: end project + :language: csharp + :dedent: + + +Projecting Computed Fields +~~~~~~~~~~~~~~~~~~~~~~~~~~ + +The ``$project`` stage can project computed fields as well. + +The following example creates a pipeline stage that projects the ``rated`` field +into a new field called ``rating``, effectively renaming the field. + +.. literalinclude:: /includes/aggregation/Builders.cs + :start-after: begin computed + :end-before: end computed + :language: csharp + :dedent: \ No newline at end of file diff --git a/source/aggregation/rankFusion.txt b/source/aggregation/rankFusion.txt new file mode 100644 index 00000000..22ae0fbd --- /dev/null +++ b/source/aggregation/rankFusion.txt @@ -0,0 +1,18 @@ +.. _csharp-aggregation-rankfusion: + +========== +RankFusion +========== + +.. facet:: + :name: genre + :values: reference + +.. meta:: + :keywords: code example, transform, pipeline + +.. contents:: On this page + :local: + :backlinks: none + :depth: 2 + :class: singlecol \ No newline at end of file diff --git a/source/aggregation/replaceRoot.txt b/source/aggregation/replaceRoot.txt new file mode 100644 index 00000000..e2b141be --- /dev/null +++ b/source/aggregation/replaceRoot.txt @@ -0,0 +1,34 @@ +.. _csharp-aggregation-replaceroot: + +=========== +ReplaceRoot +=========== + +.. facet:: + :name: genre + :values: reference + +.. 
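As a brief, illustrative companion to the Project page above, the following sketch shows one way the projection and the computed-field projection can be written, using the typed ``Builders`` projection helpers and a raw document; the ``movies`` collection variable is an assumption:

.. code-block:: csharp

   // Assumes: using MongoDB.Bson; using MongoDB.Driver;
   // and an IMongoCollection<BsonDocument> named movies.
   var projection = Builders<BsonDocument>.Projection
       .Exclude("_id")
       .Include("title")
       .Include("plot");

   var titlesAndPlots = movies.Aggregate()
       .Project(projection)
       .ToList();

   // A computed field can be expressed with a raw document,
   // here copying the "rated" value into a new "rating" field:
   var renamed = movies.Aggregate()
       .Project(new BsonDocument { { "rating", "$rated" }, { "_id", 0 } })
       .ToList();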
meta:: + :keywords: code example, transform, pipeline + +.. contents:: On this page + :local: + :backlinks: none + :depth: 2 + :class: singlecol + + +ReplaceRoot +----------- + +Use the ``replaceRoot()`` method to create a :manual:`$replaceRoot ` +pipeline stage that replaces each input document with the specified document. + +The following example replaces each input document with the nested document +in the ``spanish_translation`` field: + +.. literalinclude:: /includes/aggregation/Builders.cs + :start-after: // begin replaceRoot + :end-before: // end replaceRoot + :language: csharp + :dedent: \ No newline at end of file diff --git a/source/aggregation/replaceWith.txt b/source/aggregation/replaceWith.txt new file mode 100644 index 00000000..4e8a6af1 --- /dev/null +++ b/source/aggregation/replaceWith.txt @@ -0,0 +1,18 @@ +.. _csharp-aggregation-replacewith: + +=========== +ReplaceWith +=========== + +.. facet:: + :name: genre + :values: reference + +.. meta:: + :keywords: code example, transform, pipeline + +.. contents:: On this page + :local: + :backlinks: none + :depth: 2 + :class: singlecol \ No newline at end of file diff --git a/source/aggregation/sample.txt b/source/aggregation/sample.txt new file mode 100644 index 00000000..83de28cb --- /dev/null +++ b/source/aggregation/sample.txt @@ -0,0 +1,33 @@ +.. _csharp-aggregation-sample: + +====== +Sample +====== + +.. facet:: + :name: genre + :values: reference + +.. meta:: + :keywords: code example, transform, pipeline + +.. contents:: On this page + :local: + :backlinks: none + :depth: 2 + :class: singlecol + + +Sample +------ + +Use the ``sample()`` method to create a :manual:`$sample ` +pipeline stage to randomly select documents from input. + +The following example creates a pipeline stage that randomly selects 5 documents: + +.. literalinclude:: /includes/aggregation/Builders.cs + :start-after: // begin sample + :end-before: // end sample + :language: csharp + :dedent: \ No newline at end of file diff --git a/source/aggregation/search.txt b/source/aggregation/search.txt new file mode 100644 index 00000000..a9b12c7b --- /dev/null +++ b/source/aggregation/search.txt @@ -0,0 +1,45 @@ +.. _csharp-aggregation-search: + +====== +Search +====== + +.. facet:: + :name: genre + :values: reference + +.. meta:: + :keywords: code example, transform, pipeline + +.. contents:: On this page + :local: + :backlinks: none + :depth: 2 + :class: singlecol + + +Atlas Full-Text Search +---------------------- + +Use the ``search()`` method to create a :manual:`$search ` +pipeline stage that specifies a full-text search of one or more fields. + +.. tip:: Only Available on Atlas for MongoDB v4.2 and later + + This aggregation pipeline operator is only available for collections hosted + on :atlas:`MongoDB Atlas ` clusters running v4.2 or later that are + covered by an :atlas:`Atlas search index `. + Learn more about the required setup and the functionality of this operator + from the :ref:`Atlas Search ` documentation. + +The following example creates a pipeline stage that searches the ``title`` +field for text that contains the word "Future": + +.. literalinclude:: /includes/fundamentals/code-snippets/builders/AggregateSearchBuilderExample.java + :start-after: // begin atlasTextSearch + :end-before: // end atlasTextSearch + :language: csharp + :dedent: + +Learn more about the builders from the +`search package API documentation <{+core-api+}/client/model/search/package-summary.html>`__. 
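The snippet that the Search page references lives in an include file. As an illustrative sketch only, written with a raw ``$search`` document rather than the typed search builders, an equivalent stage can look like this; the ``movies`` collection variable and the default Atlas Search index are assumptions:

.. code-block:: csharp

   // Assumes: using MongoDB.Bson; using MongoDB.Driver;
   // an IMongoCollection<BsonDocument> named movies, and an Atlas Search index on the collection.
   var searchStage = new BsonDocument("$search", new BsonDocument
   {
       { "text", new BsonDocument
           {
               { "query", "Future" },  // term to search for
               { "path", "title" }     // field to search
           }
       }
   });

   var results = movies.Aggregate()
       .AppendStage<BsonDocument>(searchStage)
       .Limit(5)
       .ToList();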
\ No newline at end of file diff --git a/source/aggregation/searchMeta.txt b/source/aggregation/searchMeta.txt new file mode 100644 index 00000000..1225ec80 --- /dev/null +++ b/source/aggregation/searchMeta.txt @@ -0,0 +1,46 @@ +.. _csharp-aggregation-searchmeta: + +========== +SearchMeta +========== + +.. facet:: + :name: genre + :values: reference + +.. meta:: + :keywords: code example, transform, pipeline + +.. contents:: On this page + :local: + :backlinks: none + :depth: 2 + :class: singlecol + + +Atlas Search Metadata +--------------------- + +Use the ``searchMeta()`` method to create a +:manual:`$searchMeta ` +pipeline stage which returns only the metadata part of the results from +Atlas full-text search queries. + +.. tip:: Only Available on Atlas for MongoDB v4.4.11 and later + + This aggregation pipeline operator is only available + on :atlas:`MongoDB Atlas ` clusters running v4.4.11 and later. For a + detailed list of version availability, see the MongoDB Atlas documentation + on :atlas:`$searchMeta `. + +The following example shows the ``count`` metadata for an Atlas search +aggregation stage: + +.. literalinclude:: /includes/fundamentals/code-snippets/builders/AggregateSearchBuilderExample.java + :start-after: // begin atlasSearchMeta + :end-before: // end atlasSearchMeta + :language: csharp + :dedent: + +Learn more about this helper from the +`searchMeta() API documentation <{+core-api+}/client/model/Aggregates.html#searchMeta(com.mongodb.client.model.search.SearchCollector)>`__. \ No newline at end of file diff --git a/source/aggregation/set.txt b/source/aggregation/set.txt new file mode 100644 index 00000000..f4c04063 --- /dev/null +++ b/source/aggregation/set.txt @@ -0,0 +1,18 @@ +.. _csharp-aggregation-set: + +=== +Set +=== + +.. facet:: + :name: genre + :values: reference + +.. meta:: + :keywords: code example, transform, pipeline + +.. contents:: On this page + :local: + :backlinks: none + :depth: 2 + :class: singlecol \ No newline at end of file diff --git a/source/aggregation/setWindowFields.txt b/source/aggregation/setWindowFields.txt new file mode 100644 index 00000000..e09948be --- /dev/null +++ b/source/aggregation/setWindowFields.txt @@ -0,0 +1,44 @@ +.. _csharp-aggregation-setwindowfields: + +=============== +SetWindowFields +=============== + +.. facet:: + :name: genre + :values: reference + +.. meta:: + :keywords: code example, transform, pipeline + +.. contents:: On this page + :local: + :backlinks: none + :depth: 2 + :class: singlecol + + +.. _builders-aggregates-setWindowFields: + +SetWindowFields +--------------- + +Use the ``setWindowFields()`` method to create a :manual:`$setWindowFields ` +pipeline stage that allows using window operators to perform operations +on a specified span of documents in a collection. + +.. tip:: Window Functions + + The driver includes the `Windows <{+core-api+}/client/model/Windows.html>`__ + class with static factory methods for building windowed computations. + +The following example creates a pipeline stage that computes the +accumulated rainfall and the average temperature over the past month for +each locality from more fine-grained measurements presented in the ``rainfall`` +and ``temperature`` fields: + +.. 
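The referenced include file holds the actual snippet. As a rough illustration of the rainfall example described above, written as a raw ``BsonDocument`` stage, the shape is roughly the following; the ``weather`` collection variable and the ``localityId``, ``measurementDateTime``, ``rainfall``, and ``temperature`` field names are assumptions:

.. code-block:: csharp

   // Assumes: using MongoDB.Bson; using MongoDB.Driver;
   // and an IMongoCollection<BsonDocument> named weather.
   var oneMonthWindow = new BsonDocument
   {
       { "range", new BsonArray { -1, "current" } },
       { "unit", "month" }
   };

   var setWindowFieldsStage = new BsonDocument("$setWindowFields", new BsonDocument
   {
       { "partitionBy", "$localityId" },                         // one window partition per locality
       { "sortBy", new BsonDocument("measurementDateTime", 1) }, // range windows require a sort order
       { "output", new BsonDocument
           {
               { "monthlyRainfall", new BsonDocument { { "$sum", "$rainfall" }, { "window", oneMonthWindow } } },
               { "monthlyAvgTemperature", new BsonDocument { { "$avg", "$temperature" }, { "window", oneMonthWindow } } }
           }
       }
   });

   var results = weather.Aggregate()
       .AppendStage<BsonDocument>(setWindowFieldsStage)
       .ToList();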
literalinclude:: /includes/aggregation/Builders.cs + :start-after: // begin setWindowFields + :end-before: // end setWindowFields + :language: csharp + :dedent: \ No newline at end of file diff --git a/source/aggregation/skip.txt b/source/aggregation/skip.txt new file mode 100644 index 00000000..4d00f565 --- /dev/null +++ b/source/aggregation/skip.txt @@ -0,0 +1,34 @@ +.. _csharp-aggregation-skip: + +==== +Skip +==== + +.. facet:: + :name: genre + :values: reference + +.. meta:: + :keywords: code example, transform, pipeline + +.. contents:: On this page + :local: + :backlinks: none + :depth: 2 + :class: singlecol + + +Skip +---- + +Use the ``skip()`` method to create a :manual:`$skip ` +pipeline stage to skip over the specified number of documents before +passing documents into the next stage. + +The following example creates a pipeline stage that skips the first ``5`` documents: + +.. literalinclude:: /includes/aggregation/Builders.cs + :start-after: // begin skip + :end-before: // end skip + :language: csharp + :dedent: \ No newline at end of file diff --git a/source/aggregation/sort.txt b/source/aggregation/sort.txt new file mode 100644 index 00000000..b8841b67 --- /dev/null +++ b/source/aggregation/sort.txt @@ -0,0 +1,40 @@ +.. _csharp-aggregation-sort: + +==== +Sort +==== + +.. facet:: + :name: genre + :values: reference + +.. meta:: + :keywords: code example, transform, pipeline + +.. contents:: On this page + :local: + :backlinks: none + :depth: 2 + :class: singlecol + + +Sort +---- + +Use the ``sort()`` method to create a :manual:`$sort ` +pipeline stage to sort by the specified criteria. + +.. tip:: + + Though the sort criteria can be an instance of any class that + implements ``Bson``, it's convenient to combine with use of :ref:`Sorts `. + +The following example creates a pipeline stage that sorts in descending order according +to the value of the ``year`` field and then in ascending order according to the +value of the ``title`` field: + +.. literalinclude:: /includes/aggregation/Builders.cs + :start-after: // begin sortStage + :end-before: // end sortStage + :language: csharp + :dedent: \ No newline at end of file diff --git a/source/aggregation/sortByCount.txt b/source/aggregation/sortByCount.txt new file mode 100644 index 00000000..0d5fb077 --- /dev/null +++ b/source/aggregation/sortByCount.txt @@ -0,0 +1,47 @@ +.. _csharp-aggregation-sortbycount: + +=========== +SortByCount +=========== + +.. facet:: + :name: genre + :values: reference + +.. meta:: + :keywords: code example, transform, pipeline + +.. contents:: On this page + :local: + :backlinks: none + :depth: 2 + :class: singlecol + + +SortByCount +----------- + +Use the ``sortByCount()`` method to create a :manual:`$sortByCount ` +pipeline stage that groups documents by a given expression and then sorts +these groups by count in descending order. + +.. tip:: + + The ``$sortByCount`` stage is identical to a ``$group`` stage with a + ``$sum`` accumulator followed by a ``$sort`` stage. + + .. code-block:: json + + [ + { "$group": { "_id": , "count": { "$sum": 1 } } }, + { "$sort": { "count": -1 } } + ] + +The following example groups documents by the truncated value of the field ``x`` +and computes the count for each distinct value: + +.. 
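For a quick, free-standing sketch of such a stage, written in raw document form, it can be appended as shown below; the field ``x`` follows the description above, and the ``collection`` variable is an assumption:

.. code-block:: csharp

   // Assumes: using MongoDB.Bson; using MongoDB.Driver;
   // and an IMongoCollection<BsonDocument> named collection.
   var sortByCountStage = new BsonDocument(
       "$sortByCount",
       new BsonDocument("$trunc", "$x"));  // group on the truncated value of x

   var results = collection.Aggregate()
       .AppendStage<BsonDocument>(sortByCountStage)
       .ToList();

   // Each output document has the shape { _id: <truncated value>, count: <n> },
   // sorted by count in descending order.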
literalinclude:: /includes/aggregation/Builders.cs + :start-after: // begin sortByCount + :end-before: // end sortByCount + :language: csharp + :dedent: \ No newline at end of file diff --git a/source/aggregation/stages.txt b/source/aggregation/stages.txt deleted file mode 100644 index 2c2795df..00000000 --- a/source/aggregation/stages.txt +++ /dev/null @@ -1,868 +0,0 @@ -.. _csharp-aggregation-stages: - -================== -Aggregation Stages -================== - -.. facet:: - :name: genre - :values: reference - -.. meta:: - :keywords: code examples, dotnet - -.. contents:: On this page - :local: - :backlinks: none - :depth: 2 - :class: singlecol - -Overview --------- - -Bucket ------- - -Use the ``bucket()`` method to create a :manual:`$bucket ` -pipeline stage that automates the bucketing of data around predefined boundary -values. - -The following example creates a pipeline stage that groups incoming documents based -on the value of their ``screenSize`` field, inclusive of the lower boundary -and exclusive of the upper boundary. - -.. literalinclude:: /includes/aggregation/Builders.cs - :start-after: // begin basicBucket - :end-before: // end basicBucket - :language: csharp - :dedent: - -Use the ``BucketOptions`` class to specify a default bucket for values -outside of the specified boundaries, and to specify additional accumulators. - -The following example creates a pipeline stage that groups incoming documents based -on the value of their ``screenSize`` field, counting the number of documents -that fall within each bucket, pushing the value of ``screenSize`` into a -field called ``matches``, and capturing any screen sizes greater than "70" -into a bucket called "monster" for monstrously large screen sizes: - -.. tip:: - - The driver includes the `Accumulators <{+core-api+}/client/model/Accumulators.html>`__ - class with static factory methods for each of the supported accumulators. - -.. literalinclude:: /includes/aggregation/Builders.cs - :start-after: // begin bucketOptions - :end-before: // end bucketOptions - :language: csharp - :dedent: - -BucketAuto ----------- - -Use the ``bucketAuto()`` method to create a :manual:`$bucketAuto ` -pipeline stage that automatically determines the boundaries of each bucket -in its attempt to distribute the documents evenly into a specified number of buckets. - -The following example creates a pipeline stage that will attempt to create and evenly -distribute documents into *10* buckets using the value of their ``price`` field: - -.. literalinclude:: /includes/aggregation/Builders.cs - :start-after: // begin bucketAutoBasic - :end-before: // end bucketAutoBasic - :language: csharp - :dedent: - - -Use the ``BucketAutoOptions`` class to specify a :wikipedia:`preferred number ` -based scheme to set boundary values, and specify additional accumulators. - -The following example creates a pipeline stage that will attempt to create and evenly -distribute documents into *10* buckets using the value of their ``price`` field, -setting the bucket boundaries at powers of 2 (2, 4, 8, 16, ...). It also counts -the number of documents in each bucket, and calculates their average ``price`` -in a new field called ``avgPrice``: - -.. 
literalinclude:: /includes/aggregation/Builders.cs - :start-after: // begin bucketAutoOptions - :end-before: // end bucketAutoOptions - :language: csharp - :dedent: - -Count ------ - -Use the ``count()`` method to create a :manual:`$count ` -pipeline stage that counts the number of documents that enter the stage, and assigns -that value to a specified field name. If you do not specify a field, -``count()`` defaults the field name to "count". - -.. tip:: - - The ``$count`` stage is syntactic sugar for: - - .. code-block:: json - - { "$group":{ "_id": 0, "count": { "$sum" : 1 } } } - -The following example creates a pipeline stage that outputs the count of incoming -documents in a field called "total": - -.. literalinclude:: /includes/aggregation/Builders.cs - :start-after: // begin count - :end-before: // end count - :language: csharp - :dedent: - -Densify -------- - -Use the ``densify()`` method to create a -:manual:`$densify ` pipeline -stage that generates a sequence of documents to span a specified interval. - -.. tip:: - - You can use the ``$densify()`` aggregation stage only when running - MongoDB v5.1 or later. - -Consider the following documents retrieved from the :atlas:`Atlas sample weather dataset ` -that contain measurements for a similar ``position`` field, spaced one hour -apart: - -.. code-block:: none - :copyable: false - - Document{{ _id=5553a..., position=Document{{type=Point, coordinates=[-47.9, 47.6]}}, ts=Mon Mar 05 08:00:00 EST 1984, ... }} - Document{{ _id=5553b..., position=Document{{type=Point, coordinates=[-47.9, 47.6]}}, ts=Mon Mar 05 09:00:00 EST 1984, ... }} - -Suppose you needed to create a pipeline stage that performs the following -actions on these documents: - -- Add a document at every 15-minute interval for which a ``ts`` value does not - already exist. -- Group the documents by the ``position`` field. - -The call to the ``densify()`` aggregation stage builder that accomplishes -these actions resembles the following: - -.. literalinclude:: /includes/fundamentals/code-snippets/builders/AggregateDensify.java - :start-after: // begin densify aggregate - :end-before: // end densify aggregate - :language: csharp - :dedent: - -The following output highlights the documents generated by the aggregate stage -which contain ``ts`` values every 15 minutes between the existing documents: - -.. code-block:: none - :emphasize-lines: 2-4 - :copyable: false - - Document{{ _id=5553a..., position=Document{{type=Point, coordinates=[-47.9, 47.6]}}, ts=Mon Mar 05 08:00:00 EST 1984, ... }} - Document{{ position=Document{{coordinates=[-47.9, 47.6]}}, ts=Mon Mar 05 08:15:00 EST 1984 }} - Document{{ position=Document{{coordinates=[-47.9, 47.6]}}, ts=Mon Mar 05 08:30:00 EST 1984 }} - Document{{ position=Document{{coordinates=[-47.9, 47.6]}}, ts=Mon Mar 05 08:45:00 EST 1984 }} - Document{{ _id=5553b..., position=Document{{type=Point, coordinates=[-47.9, 47.6]}}, ts=Mon Mar 05 09:00:00 EST 1984, ... }} - -See the `densify package API documentation <{+core-api+}/client/model/densify/package-summary.html>`__ -for more information. - -Documents ---------- - -Use the ``documents()`` method to create a -:manual:`$documents ` -pipeline stage that returns literal documents from input values. - -.. important:: - - If you use a ``$documents`` stage in an aggregation pipeline, it must be the first - stage in the pipeline. - -The following example creates a pipeline stage that creates -sample documents with a ``title`` field: - -.. 
literalinclude:: /includes/aggregation/Builders.cs - :start-after: // begin documents - :end-before: // end documents - :language: csharp - :dedent: - -.. important:: - - If you use the ``documents()`` method to provide the input to an aggregation pipeline, - you must call the ``aggregate()`` method on a database instead of on a - collection. - -Facet ------ - -Use the ``facet()`` method to create a :manual:`$facet ` -pipeline stage that allows for the definition of parallel pipelines. - -The following example creates a pipeline stage that executes two parallel aggregations: - -- The first aggregation distributes incoming documents into 5 groups according to - their ``attributes.screen_size`` field. - -- The second aggregation counts all *manufacturers* and returns their count, limited - to the top **5**. - -.. literalinclude:: /includes/aggregation/Builders.cs - :start-after: // begin facet - :end-before: // end facet - :language: csharp - :dedent: - -GraphLookup ------------ - -Use the ``graphLookup()`` method to create a :manual:`$graphLookup ` -pipeline stage that performs a recursive search on a specified collection to match -a specified field in one document to a specified field of another document. - -The following example computes the social network graph for users in the -``contacts`` collection, recursively matching the value in the ``friends`` field -to the ``name`` field: - -.. literalinclude:: /includes/aggregation/Builders.cs - :start-after: // begin graphLookupBasic - :end-before: // end graphLookupBasic - :language: csharp - :dedent: - -Using ``GraphLookupOptions``, you can specify the depth to recurse as well as -the name of the depth field, if desired. In this example, ``$graphLookup`` will -recurse up to two times, and create a field called ``degrees`` with the -recursion depth information for every document. - -.. literalinclude:: /includes/aggregation/Builders.cs - :start-after: // begin graphLookupDepth - :end-before: // end graphLookupDepth - :language: csharp - :dedent: - -Using ``GraphLookupOptions``, you can specify a filter that documents must match -in order for MongoDB to include them in your search. In this -example, only links with "golf" in their ``hobbies`` field will be included. - -.. literalinclude:: /includes/aggregation/Builders.cs - :start-after: // begin graphLookupMatch - :end-before: // end graphLookupMatch - :language: csharp - :dedent: - -Group ------ - -Use the ``group()`` method to create a :manual:`$group ` -pipeline stage to group documents by a specified expression and output a document -for each distinct grouping. - -.. tip:: - - The driver includes the `Accumulators <{+core-api+}/client/model/Accumulators.html>`__ - class with static factory methods for each of the supported accumulators. - -The following example creates a pipeline stage that groups documents by the value -of the ``customerId`` field. Each group accumulates the sum and average -of the values of the ``quantity`` field into the ``totalQuantity`` and -``averageQuantity`` fields. - -.. literalinclude:: /includes/aggregation/Builders.cs - :start-after: // begin group - :end-before: // end group - :language: csharp - :dedent: - -Learn more about accumulator operators from the Server manual section -on :manual:`Accumulators `. - -Limit ------ - -Use the :manual:`$limit ` pipeline stage -to limit the number of documents passed to the next stage. - -The following example creates a pipeline stage that limits the number of documents to ``10``: - -.. 
literalinclude:: /includes/aggregation/Builders.cs - :start-after: // begin limit - :end-before: // end limit - :language: csharp - :dedent: - -Lookup ------- - -Use the ``lookup()`` method to create a :manual:`$lookup ` -pipeline stage to perform joins and uncorrelated subqueries between two collections. - -Left Outer Join -~~~~~~~~~~~~~~~ - -The following example creates a pipeline stage that performs a left outer -join between the ``movies`` and ``comments`` collections: - -- It joins the ``_id`` field from ``movies`` to the ``movie_id`` field in ``comments`` -- It outputs the results in the ``joined_comments`` field: - -.. literalinclude:: /includes/aggregation/Builders.cs - :start-after: // begin basic lookup - :end-before: // end basic lookup - :language: csharp - :dedent: - -Full Join and Uncorrelated Subqueries -~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ - -The following example creates a pipeline stage that joins two collections, ``orders`` -and ``warehouses``, by the item and whether the available quantity is enough -to fulfill the ordered quantity: - -.. literalinclude:: /includes/aggregation/Builders.cs - :start-after: // begin advanced lookup - :end-before: // end advanced lookup - :language: csharp - :dedent: - -Match ------ - -Use the ``match()`` method to create a :manual:`$match ` -pipeline stage that matches incoming documents against the specified -query filter, filtering out documents that do not match. - -.. tip:: - - The filter can be an instance of any class that implements ``Bson``, but it's - convenient to combine with use of the :ref:`Filters ` class. - -The following example creates a pipeline stage that matches all documents where the -``title`` field is equal to "The Shawshank Redemption": - -.. literalinclude:: /includes/aggregation/Builders.cs - :start-after: begin match - :end-before: end match - :language: csharp - :dedent: - -Merge ------ - -Use the ``merge()`` method to create a :manual:`$merge ` -pipeline stage that merges all documents into the specified collection. - -.. important:: - - The ``$merge`` stage must be the last stage in any aggregation pipeline. - -The following example merges the pipeline into the ``authors`` collection using the default -options: - -.. literalinclude:: /includes/aggregation/Builders.cs - :start-after: // begin mergeStage - :end-before: // end mergeStage - :language: csharp - :dedent: - -The following example merges the pipeline into the ``customers`` collection in the -``reporting`` database using some options that specify to replace -the document if both ``date`` and ``customerId`` match, otherwise insert the -document: - -.. literalinclude:: /includes/aggregation/Builders.cs - :start-after: // begin mergeOptions - :end-before: // end mergeOptions - :language: csharp - :dedent: - -Out ---- - -Use the ``out()`` method to create an :manual:`$out ` -pipeline stage that writes all documents to the specified collection in -the same database. - -.. important:: - - The ``$out`` stage must be the last stage in any aggregation pipeline. - -The following example writes the results of the pipeline to the ``authors`` -collection: - -.. literalinclude:: /includes/aggregation/Builders.cs - :start-after: // begin out - :end-before: // end out - :language: csharp - :dedent: - -Project -------- - -Use the ``project()`` method to create a :manual:`$project ` -pipeline stage that project specified document fields. Field projection -in aggregation follows the same rules as :ref:`field projection in queries `. - -.. 
tip:: - - Though the projection can be an instance of any class that implements ``Bson``, - it's convenient to combine with use of :ref:`Projections `. - -The following example creates a pipeline stage that excludes the ``_id`` field but -includes the ``title`` and ``plot`` fields: - -.. literalinclude:: /includes/aggregation/Builders.cs - :start-after: begin project - :end-before: end project - :language: csharp - :dedent: - -Projecting Computed Fields -~~~~~~~~~~~~~~~~~~~~~~~~~~ - -The ``$project`` stage can project computed fields as well. - -The following example creates a pipeline stage that projects the ``rated`` field -into a new field called ``rating``, effectively renaming the field. - -.. literalinclude:: /includes/aggregation/Builders.cs - :start-after: begin computed - :end-before: end computed - :language: csharp - :dedent: - -ReplaceRoot ------------ - -Use the ``replaceRoot()`` method to create a :manual:`$replaceRoot ` -pipeline stage that replaces each input document with the specified document. - -The following example replaces each input document with the nested document -in the ``spanish_translation`` field: - -.. literalinclude:: /includes/aggregation/Builders.cs - :start-after: // begin replaceRoot - :end-before: // end replaceRoot - :language: csharp - :dedent: - -Sample ------- - -Use the ``sample()`` method to create a :manual:`$sample ` -pipeline stage to randomly select documents from input. - -The following example creates a pipeline stage that randomly selects 5 documents: - -.. literalinclude:: /includes/aggregation/Builders.cs - :start-after: // begin sample - :end-before: // end sample - :language: csharp - :dedent: - -Atlas Full-Text Search ----------------------- - -Use the ``search()`` method to create a :manual:`$search ` -pipeline stage that specifies a full-text search of one or more fields. - -.. tip:: Only Available on Atlas for MongoDB v4.2 and later - - This aggregation pipeline operator is only available for collections hosted - on :atlas:`MongoDB Atlas ` clusters running v4.2 or later that are - covered by an :atlas:`Atlas search index `. - Learn more about the required setup and the functionality of this operator - from the :ref:`Atlas Search ` documentation. - -The following example creates a pipeline stage that searches the ``title`` -field for text that contains the word "Future": - -.. literalinclude:: /includes/fundamentals/code-snippets/builders/AggregateSearchBuilderExample.java - :start-after: // begin atlasTextSearch - :end-before: // end atlasTextSearch - :language: csharp - :dedent: - -Learn more about the builders from the -`search package API documentation <{+core-api+}/client/model/search/package-summary.html>`__. -Atlas Search Metadata ---------------------- - -Use the ``searchMeta()`` method to create a -:manual:`$searchMeta ` -pipeline stage which returns only the metadata part of the results from -Atlas full-text search queries. - -.. tip:: Only Available on Atlas for MongoDB v4.4.11 and later - - This aggregation pipeline operator is only available - on :atlas:`MongoDB Atlas ` clusters running v4.4.11 and later. For a - detailed list of version availability, see the MongoDB Atlas documentation - on :atlas:`$searchMeta `. - -The following example shows the ``count`` metadata for an Atlas search -aggregation stage: - -.. 
literalinclude:: /includes/fundamentals/code-snippets/builders/AggregateSearchBuilderExample.java - :start-after: // begin atlasSearchMeta - :end-before: // end atlasSearchMeta - :language: csharp - :dedent: - -Learn more about this helper from the -`searchMeta() API documentation <{+core-api+}/client/model/Aggregates.html#searchMeta(com.mongodb.client.model.search.SearchCollector)>`__. - -.. _builders-aggregates-setWindowFields: - -SetWindowFields ---------------- - -Use the ``setWindowFields()`` method to create a :manual:`$setWindowFields ` -pipeline stage that allows using window operators to perform operations -on a specified span of documents in a collection. - -.. tip:: Window Functions - - The driver includes the `Windows <{+core-api+}/client/model/Windows.html>`__ - class with static factory methods for building windowed computations. - -The following example creates a pipeline stage that computes the -accumulated rainfall and the average temperature over the past month for -each locality from more fine-grained measurements presented in the ``rainfall`` -and ``temperature`` fields: - -.. literalinclude:: /includes/aggregation/Builders.cs - :start-after: // begin setWindowFields - :end-before: // end setWindowFields - :language: csharp - :dedent: - -Skip ----- - -Use the ``skip()`` method to create a :manual:`$skip ` -pipeline stage to skip over the specified number of documents before -passing documents into the next stage. - -The following example creates a pipeline stage that skips the first ``5`` documents: - -.. literalinclude:: /includes/aggregation/Builders.cs - :start-after: // begin skip - :end-before: // end skip - :language: csharp - :dedent: - -Sort ----- - -Use the ``sort()`` method to create a :manual:`$sort ` -pipeline stage to sort by the specified criteria. - -.. tip:: - - Though the sort criteria can be an instance of any class that - implements ``Bson``, it's convenient to combine with use of :ref:`Sorts `. - -The following example creates a pipeline stage that sorts in descending order according -to the value of the ``year`` field and then in ascending order according to the -value of the ``title`` field: - -.. literalinclude:: /includes/aggregation/Builders.cs - :start-after: // begin sortStage - :end-before: // end sortStage - :language: csharp - :dedent: - -SortByCount ------------ - -Use the ``sortByCount()`` method to create a :manual:`$sortByCount ` -pipeline stage that groups documents by a given expression and then sorts -these groups by count in descending order. - -.. tip:: - - The ``$sortByCount`` stage is identical to a ``$group`` stage with a - ``$sum`` accumulator followed by a ``$sort`` stage. - - .. code-block:: json - - [ - { "$group": { "_id": , "count": { "$sum": 1 } } }, - { "$sort": { "count": -1 } } - ] - -The following example groups documents by the truncated value of the field ``x`` -and computes the count for each distinct value: - -.. literalinclude:: /includes/aggregation/Builders.cs - :start-after: // begin sortByCount - :end-before: // end sortByCount - :language: csharp - :dedent: - - - - - -Unwind ------- - -Use the ``unwind()`` method to create an :manual:`$unwind ` -pipeline stage to deconstruct an array field from input documents, creating -an output document for each array element. - -The following example creates a document for each element in the ``sizes`` array: - -.. 
literalinclude:: /includes/aggregation/Builders.cs - :start-after: // begin unwindStage - :end-before: // end unwindStage - :language: csharp - :dedent: - -To preserve documents that have missing or ``null`` -values for the array field, or where array is empty: - - -.. literalinclude:: /includes/aggregation/Builders.cs - :start-after: // begin unwindPreserve - :end-before: // end unwindPreserve - :language: csharp - :dedent: - -To include the array index, in this example in a field called ``"position"``: - -.. literalinclude:: /includes/aggregation/Builders.cs - :start-after: // begin unwindIndex - :end-before: // end unwindIndex - :language: csharp - :dedent: - - - - - - - -.. _java_aggregates_pick_n: - -Pick-N Accumulators -------------------- - -The pick-n accumulators are aggregation accumulation operators that return -the top and bottom elements given a specific ordering. Use one of the -following builders to create an aggregation accumulation operator: - -- :ref:`minN() ` -- :ref:`maxN() ` -- :ref:`firstN() ` -- :ref:`lastN() ` -- :ref:`top() ` -- :ref:`topN() ` -- :ref:`bottom() ` -- :ref:`bottomN() ` - -.. tip:: - - You can only perform aggregation operations with these pick-n accumulators - when running MongoDB v5.2 or later. - -Learn which aggregation pipeline stages you can use accumulator operators with -from the Server manual section on -:manual:`Accumulators `. - -.. _java_aggregates_min_n: - -MinN -~~~~ - -The ``minN()`` builder creates the :manual:`$minN ` -accumulator which returns data from documents that contain the ``n`` lowest -values of a grouping. - -.. tip:: - - The ``$minN`` and ``$bottomN`` accumulators can perform similar tasks. - See - :manual:`Comparison of $minN and $bottomN Accumulators ` - for recommended usage of each. - -The following example demonstrates how to use the ``minN()`` method to return -the lowest three ``imdb.rating`` values for movies, grouped by ``year``: - -.. literalinclude:: /includes/fundamentals/code-snippets/builders/AccumulatorsPickN.java - :start-after: // begin minN accumulator - :end-before: // end minN accumulator - :language: csharp - :dedent: - -See the `minN() API documentation <{+core-api+}/client/model/Accumulators.html#minN(java.lang.String,InExpression,NExpression)>`__ -for more information. - -.. _java_aggregates_max_n: - -MaxN -~~~~ - -The ``maxN()`` accumulator returns data from documents that contain the ``n`` -highest values of a grouping. - -The following example demonstrates how to use the ``maxN()`` method to -return the highest two ``imdb.rating`` values for movies, grouped by ``year``: - -.. literalinclude:: /includes/fundamentals/code-snippets/builders/AccumulatorsPickN.java - :start-after: // begin maxN accumulator - :end-before: // end maxN accumulator - :language: csharp - :dedent: - -See the `maxN() API documentation <{+core-api+}/client/model/Accumulators.html#maxN(java.lang.String,InExpression,NExpression)>`__ -for more information. - -.. _java_aggregates_first_n: - -FirstN -~~~~~~ - -The ``firstN()`` accumulator returns data from the first ``n`` documents in -each grouping for the specified sort order. - -.. tip:: - - The ``$firstN`` and ``$topN`` accumulators can perform similar tasks. - See - :manual:`Comparison of $firstN and $topN Accumulators ` - for recommended usage of each. - -The following example demonstrates how to use the ``firstN()`` method to -return the first four movie ``title`` values, based on the order they came -into the stage, grouped by ``year``: - -.. 
literalinclude:: /includes/fundamentals/code-snippets/builders/AccumulatorsPickN.java - :start-after: // begin firstN accumulator - :end-before: // end firstN accumulator - :language: csharp - :dedent: - -See the `firstN() API documentation <{+core-api+}/client/model/Accumulators.html#firstN(java.lang.String,InExpression,NExpression)>`__ -for more information. - -.. _java_aggregates_last_n: - -LastN -~~~~~ - -The ``lastN()`` accumulator returns data from the last ``n`` documents in -each grouping for the specified sort order. - -The following example demonstrates how to use the ``lastN()`` method to show -the last three movie ``title`` values, based on the the order they came into -the stage, grouped by ``year``: - -.. literalinclude:: /includes/fundamentals/code-snippets/builders/AccumulatorsPickN.java - :start-after: // begin lastN accumulator - :end-before: // end lastN accumulator - :language: csharp - :dedent: - -See the `lastN() API documentation <{+core-api+}/client/model/Accumulators.html#lastN(java.lang.String,InExpression,NExpression)>`__ -for more information. - -.. _java_aggregates_top: - -Top -~~~ - -The ``top()`` accumulator returns data from the first document in a group -based on the specified sort order. - -The following example demonstrates how to use the ``top()`` method to return -the ``title`` and ``imdb.rating`` values for the top rated movies based on the -``imdb.rating``, grouped by ``year``. - -.. literalinclude:: /includes/fundamentals/code-snippets/builders/AccumulatorsPickN.java - :start-after: // begin top accumulator - :end-before: // end top accumulator - :language: csharp - :dedent: - -See the `top() API documentation <{+core-api+}/client/model/Accumulators.html#top(java.lang.String,org.bson.conversions.Bson,OutExpression)>`__ -for more information. - -.. _java_aggregates_top_n: - -TopN -~~~~ - -The ``topN()`` accumulator returns data from documents that contain the -highest ``n`` values for the specified field. - -.. tip:: - - The ``$firstN`` and ``$topN`` accumulators can perform similar tasks. - See - :manual:`Comparison of $firstN and $topN Accumulators ` - for recommended usage of each. - -The following example demonstrates how to use the ``topN()`` method to return -the ``title`` and ``runtime`` values of the three longest movies based on the -``runtime`` values, grouped by ``year``. - -.. literalinclude:: /includes/fundamentals/code-snippets/builders/AccumulatorsPickN.java - :start-after: // begin topN accumulator - :end-before: // end topN accumulator - :language: csharp - :dedent: - -See the `topN() API documentation <{+core-api+}/client/model/Accumulators.html#topN(java.lang.String,org.bson.conversions.Bson,OutExpression,NExpression)>`__ -for more information. - -.. _java_aggregates_bottom: - -Bottom -~~~~~~ - -The ``bottom()`` accumulator returns data from the last document in a group -based on the specified sort order. - -The following example demonstrates how to use the ``bottom()`` method to -return the ``title`` and ``runtime`` values of the shortest movie based on the -``runtime`` value, grouped by ``year``. - -.. literalinclude:: /includes/fundamentals/code-snippets/builders/AccumulatorsPickN.java - :start-after: // begin bottom accumulator - :end-before: // end bottom accumulator - :language: csharp - :dedent: - -See the `bottom() API documentation <{+core-api+}/client/model/Accumulators.html#bottom(java.lang.String,org.bson.conversions.Bson,OutExpression)>`__ -for more information. - -.. 
_java_aggregates_bottom_n: - -BottomN -~~~~~~~ - -The ``bottomN()`` accumulator returns data from documents that contain the -lowest ``n`` values for the specified field. - -.. tip:: - - The ``$minN`` and ``$bottomN`` accumulators can perform similar tasks. - See :manual:`Comparison of $minN and $bottomN Accumulators ` - for recommended usage of each. - -The following example demonstrates how to use the ``bottomN()`` method to -return the ``title`` and ``imdb.rating`` values of the two lowest rated movies -based on the ``imdb.rating`` value, grouped by ``year``: - -.. literalinclude:: /includes/fundamentals/code-snippets/builders/AccumulatorsPickN.java - :start-after: // begin bottomN accumulator - :end-before: // end bottomN accumulator - :language: csharp - :dedent: - -See the `bottomN() API documentation <{+core-api+}/client/model/Accumulators.html#bottomN(java.lang.String,org.bson.conversions.Bson,OutExpression,NExpression)>`__ -for more information. \ No newline at end of file diff --git a/source/aggregation/unionWith.txt b/source/aggregation/unionWith.txt new file mode 100644 index 00000000..ac2bade5 --- /dev/null +++ b/source/aggregation/unionWith.txt @@ -0,0 +1,18 @@ +.. _csharp-aggregation-unionwith: + +========= +UnionWith +========= + +.. facet:: + :name: genre + :values: reference + +.. meta:: + :keywords: code example, transform, pipeline + +.. contents:: On this page + :local: + :backlinks: none + :depth: 2 + :class: singlecol \ No newline at end of file diff --git a/source/aggregation/unwind.txt b/source/aggregation/unwind.txt new file mode 100644 index 00000000..3dbd5d84 --- /dev/null +++ b/source/aggregation/unwind.txt @@ -0,0 +1,52 @@ +.. _csharp-aggregation-unwind: + +====== +Unwind +====== + +.. facet:: + :name: genre + :values: reference + +.. meta:: + :keywords: code example, transform, pipeline + +.. contents:: On this page + :local: + :backlinks: none + :depth: 2 + :class: singlecol + + +Unwind +------ + +Use the ``unwind()`` method to create an :manual:`$unwind ` +pipeline stage to deconstruct an array field from input documents, creating +an output document for each array element. + +The following example creates a document for each element in the ``sizes`` array: + +.. literalinclude:: /includes/aggregation/Builders.cs + :start-after: // begin unwindStage + :end-before: // end unwindStage + :language: csharp + :dedent: + +To preserve documents that have missing or ``null`` +values for the array field, or where array is empty: + + +.. literalinclude:: /includes/aggregation/Builders.cs + :start-after: // begin unwindPreserve + :end-before: // end unwindPreserve + :language: csharp + :dedent: + +To include the array index, in this example in a field called ``"position"``: + +.. literalinclude:: /includes/aggregation/Builders.cs + :start-after: // begin unwindIndex + :end-before: // end unwindIndex + :language: csharp + :dedent: diff --git a/source/aggregation/vectorSearch.txt b/source/aggregation/vectorSearch.txt new file mode 100644 index 00000000..b04cf3a3 --- /dev/null +++ b/source/aggregation/vectorSearch.txt @@ -0,0 +1,18 @@ +.. _csharp-aggregation-vectorsearch: + +============ +VectorSearch +============ + +.. facet:: + :name: genre + :values: reference + +.. meta:: + :keywords: code example, transform, pipeline + +.. 
contents:: On this page + :local: + :backlinks: none + :depth: 2 + :class: singlecol \ No newline at end of file From 1c893690c1f214b6b2b775871a1ab13ade84dfb3 Mon Sep 17 00:00:00 2001 From: Mike Woofter <108414937+mongoKart@users.noreply.github.com> Date: Fri, 11 Apr 2025 16:34:08 -0500 Subject: [PATCH 04/17] wip --- source/aggregation.txt | 1532 ++--------------- source/aggregation/changeStream.txt | 18 - source/aggregation/group.txt | 45 - source/aggregation/limit.txt | 33 - source/aggregation/lookup.txt | 53 - source/aggregation/match.txt | 39 - source/aggregation/operators.txt | 263 +++ source/aggregation/out.txt | 38 - source/aggregation/sample.txt | 33 - source/aggregation/skip.txt | 34 - source/aggregation/sort.txt | 40 - source/aggregation/stages.txt | 247 +++ source/aggregation/{ => stages}/bucket.txt | 0 .../aggregation/{ => stages}/bucketAuto.txt | 0 source/aggregation/stages/changeStream.txt | 20 + .../changeStreamSplitLargeEvent copy.txt | 0 source/aggregation/{ => stages}/count.txt | 0 source/aggregation/{ => stages}/densify.txt | 8 + source/aggregation/{ => stages}/documents.txt | 0 source/aggregation/{ => stages}/facet.txt | 0 .../aggregation/{ => stages}/graphLookup.txt | 3 + source/aggregation/stages/group.txt | 97 ++ source/aggregation/stages/limit.txt | 57 + source/aggregation/stages/lookup.txt | 146 ++ source/aggregation/stages/match.txt | 78 + source/aggregation/{ => stages}/merge.txt | 0 source/aggregation/stages/out.txt | 89 + source/aggregation/{ => stages}/project.txt | 42 +- .../aggregation/{ => stages}/rankFusion.txt | 0 .../aggregation/{ => stages}/replaceRoot.txt | 0 .../aggregation/{ => stages}/replaceWith.txt | 0 source/aggregation/stages/sample.txt | 60 + source/aggregation/{ => stages}/search.txt | 0 .../aggregation/{ => stages}/searchMeta.txt | 0 source/aggregation/{ => stages}/set.txt | 0 .../{ => stages}/setWindowFields.txt | 0 source/aggregation/stages/skip.txt | 61 + source/aggregation/stages/sort.txt | 87 + .../aggregation/{ => stages}/sortByCount.txt | 0 source/aggregation/{ => stages}/unionWith.txt | 0 source/aggregation/stages/unwind.txt | 186 ++ source/aggregation/stages/vectorSearch.txt | 97 ++ source/aggregation/unwind.txt | 52 - source/aggregation/vectorSearch.txt | 18 - 44 files changed, 1662 insertions(+), 1814 deletions(-) delete mode 100644 source/aggregation/changeStream.txt delete mode 100644 source/aggregation/group.txt delete mode 100644 source/aggregation/limit.txt delete mode 100644 source/aggregation/lookup.txt delete mode 100644 source/aggregation/match.txt create mode 100644 source/aggregation/operators.txt delete mode 100644 source/aggregation/out.txt delete mode 100644 source/aggregation/sample.txt delete mode 100644 source/aggregation/skip.txt delete mode 100644 source/aggregation/sort.txt create mode 100644 source/aggregation/stages.txt rename source/aggregation/{ => stages}/bucket.txt (100%) rename source/aggregation/{ => stages}/bucketAuto.txt (100%) create mode 100644 source/aggregation/stages/changeStream.txt rename source/aggregation/{ => stages}/changeStreamSplitLargeEvent copy.txt (100%) rename source/aggregation/{ => stages}/count.txt (100%) rename source/aggregation/{ => stages}/densify.txt (90%) rename source/aggregation/{ => stages}/documents.txt (100%) rename source/aggregation/{ => stages}/facet.txt (100%) rename source/aggregation/{ => stages}/graphLookup.txt (91%) create mode 100644 source/aggregation/stages/group.txt create mode 100644 source/aggregation/stages/limit.txt create mode 100644 
source/aggregation/stages/lookup.txt create mode 100644 source/aggregation/stages/match.txt rename source/aggregation/{ => stages}/merge.txt (100%) create mode 100644 source/aggregation/stages/out.txt rename source/aggregation/{ => stages}/project.txt (55%) rename source/aggregation/{ => stages}/rankFusion.txt (100%) rename source/aggregation/{ => stages}/replaceRoot.txt (100%) rename source/aggregation/{ => stages}/replaceWith.txt (100%) create mode 100644 source/aggregation/stages/sample.txt rename source/aggregation/{ => stages}/search.txt (100%) rename source/aggregation/{ => stages}/searchMeta.txt (100%) rename source/aggregation/{ => stages}/set.txt (100%) rename source/aggregation/{ => stages}/setWindowFields.txt (100%) create mode 100644 source/aggregation/stages/skip.txt create mode 100644 source/aggregation/stages/sort.txt rename source/aggregation/{ => stages}/sortByCount.txt (100%) rename source/aggregation/{ => stages}/unionWith.txt (100%) create mode 100644 source/aggregation/stages/unwind.txt create mode 100644 source/aggregation/stages/vectorSearch.txt delete mode 100644 source/aggregation/unwind.txt delete mode 100644 source/aggregation/vectorSearch.txt diff --git a/source/aggregation.txt b/source/aggregation.txt index a27a8ac1..d418d36f 100644 --- a/source/aggregation.txt +++ b/source/aggregation.txt @@ -21,8 +21,7 @@ Aggregation Operations :titlesonly: :maxdepth: 1 - Aggergation Stages - Aggergation Operators + Aggregation Stages Overview -------- @@ -35,6 +34,9 @@ return computed results. The MongoDB Aggregation framework is modeled on the concept of data processing pipelines. Documents enter a pipeline comprised of one or more stages, and this pipeline transforms the documents into an aggregated result. +To learn more about the Aggregation Pipeline, see the +:manual:`Aggregation Pipeline ` server manual page. + Analogy ~~~~~~~ @@ -94,530 +96,67 @@ performing aggregation operations: the `AllowDiskUse <{+new-api-root+}/MongoDB.Driver/MongoDB.Driver.AggregateOptions.AllowDiskUse.html#MongoDB_Driver_AggregateOptions_AllowDiskUse>`__ property of the ``AggregateOptions`` object that you pass to the ``Aggregate()`` method. -- The :manual:`$graphLookup ` stage has - a strict memory limit of 100 megabytes and ignores the ``AllowDiskUse`` property. - -Syntax Options --------------- - -This section describes the approaches that you can use to create an aggregation pipeline. - -Builders -~~~~~~~~ - - - - -To perform an aggregation, pass a list of aggregation stages to the -``IMongoCollection.Aggregate()`` method. - -.. note:: - - This example uses the ``sample_restaurants.restaurants`` collection - from the :atlas:`Atlas sample datasets `. To learn how to create a - free MongoDB Atlas cluster and load the sample datasets, see :ref:`csharp-get-started`. - -The following code example produces a count of the number of bakeries in each borough -of New York City. To do so, it uses an aggregation pipeline that contains the following stages: - -- A :manual:`$match ` stage to filter for documents whose - ``cuisine`` field contains the value ``"Bakery"``. - -- A :manual:`$group ` stage to group the matching - documents by the ``borough`` field, accumulating a count of documents for each distinct value - of that field. - -The following sections implement this example by using LINQ, Builders, and BsonDocument -approaches to create and combine the aggregation stages used in the example pipeline. - -LINQ Approach -~~~~~~~~~~~~~ - -.. io-code-block:: - - .. 
input:: /includes/fundamentals/code-examples/LinqAggregation.cs - :language: csharp - :dedent: - :start-after: begin-aggregation - :end-before: end-aggregation - - .. output:: - :language: console - :visible: false - - { _id = Bronx, Count = 71 } - { _id = Brooklyn, Count = 173 } - { _id = Staten Island, Count = 20 } - { _id = Missing, Count = 2 } - { _id = Manhattan, Count = 221 } - { _id = Queens, Count = 204 } - -To learn more about using LINQ to construct aggregation pipelines, see the -:ref:`csharp-linq` guide. - -Builders Approach -~~~~~~~~~~~~~~~~~ - -.. io-code-block:: - - .. input:: /includes/fundamentals/code-examples/BuilderAggregation.cs - :language: csharp - :dedent: - :start-after: begin-aggregation - :end-before: end-aggregation - - .. output:: - :language: console - :visible: false - - { _id = Bronx, Count = 71 } - { _id = Brooklyn, Count = 173 } - { _id = Staten Island, Count = 20 } - { _id = Missing, Count = 2 } - { _id = Manhattan, Count = 221 } - { _id = Queens, Count = 204 } - -To learn more about using builders to construct aggregation pipelines, -see the :ref:`csharp-builders-aggregation` section of the Operations with Builders guide. - -BsonDocument Approach -~~~~~~~~~~~~~~~~~~~~~ - -.. io-code-block:: - - .. input:: /includes/fundamentals/code-examples/Aggregation.cs - :language: csharp - :dedent: - :start-after: begin-aggregation - :end-before: end-aggregation - - .. output:: - :language: console - :visible: false - - { "_id" : "Brooklyn", "count" : 173 } - { "_id" : "Manhattan", "count" : 221 } - { "_id" : "Bronx", "count" : 71 } - { "_id" : "Missing", "count" : 2 } - { "_id" : "Staten Island", "count" : 20 } - { "_id" : "Queens", "count" : 204 } - - -Aggregation Stage Methods -------------------------- - -The following table lists the builder methods in the {+driver-short+} that correspond -to stages in the aggregation pipeline. For more information about a method, click the -method name. - -.. list-table:: - :header-rows: 1 - :widths: 20 80 - - * - Stage - - Description - - * - :ref:`Bucket() ` - - - Categorizes incoming documents into groups, called buckets, - based on a specified expression and bucket boundaries. - - * - :ref:`BucketAuto() ` - - - Categorizes incoming documents into a specific number of - groups, called buckets, based on a specified expression. - Bucket boundaries are automatically determined in an attempt - to evenly distribute the documents into the specified number - of buckets. - - * - :ref:`ChangeStream() ` - - - Returns a change stream cursor for the - collection. This stage can occur only once in an aggregation - pipeline and it must occur as the first stage. - - * - :ref:`ChangeStreamSplitLargeEvent() ` - - - Splits large change stream events that exceed 16 MB into smaller fragments returned - in a change stream cursor. - - You can use $changeStreamSplitLargeEvent only in a $changeStream pipeline, and - it must be the final stage in the pipeline. - - * - :ref:`Count() ` - - - Returns a count of the number of documents at this stage of - the aggregation pipeline. - - * - :ref:`Densify() ` - - - Creates new documents in a sequence of documents where certain values in a field are missing. - - * - :ref:`Documents() ` - - - Returns literal documents from input expressions. - - * - :ref:`Facet() ` - - - Processes multiple aggregation pipelines - within a single stage on the same set - of input documents. 
Enables the creation of multi-faceted - aggregations capable of characterizing data across multiple - dimensions, or facets, in a single stage. - - * - :ref:`GraphLookup() ` - - - Performs a recursive search on a collection. To each output - document, adds a new array field that contains the traversal - results of the recursive search for that document. - - * - :ref:`Group() ` - - - Groups input documents by a specified identifier expression - and applies the accumulator expressions, if specified, to - each group. Consumes all input documents and outputs one - document per each distinct group. The output documents - contain only the identifier field and, if specified, accumulated - fields. - - * - :ref:`Limit() ` - - - Passes the first *n* documents unmodified to the pipeline, - where *n* is the specified limit. For each input document, - outputs either one document (for the first *n* documents) or - zero documents (after the first *n* documents). - - * - :ref:`Lookup() ` - - - Performs a left outer join to another collection in the - *same* database to filter in documents from the "joined" - collection for processing. - - * - :ref:`Match() ` - - - Filters the document stream to allow only matching documents - to pass unmodified into the next pipeline stage. - For each input document, outputs either one document (a match) or zero - documents (no match). - - * - :ref:`Merge() ` - - - Writes the resulting documents of the aggregation pipeline to - a collection. The stage can incorporate (insert new - documents, merge documents, replace documents, keep existing - documents, fail the operation, process documents with a - custom update pipeline) the results into an output - collection. To use this stage, it must be - the last stage in the pipeline. - - * - :ref:`Out() ` - - - Writes the resulting documents of the aggregation pipeline to - a collection. To use this stage, it must be - the last stage in the pipeline. - - * - :ref:`Project() ` - - - Reshapes each document in the stream, such as by adding new - fields or removing existing fields. For each input document, - outputs one document. - - * - :ref:`ReplaceRoot() ` - - - Replaces a document with the specified embedded document. The - operation replaces all existing fields in the input document, - including the ``_id`` field. Specify a document embedded in - the input document to promote the embedded document to the - top level. - - The ``$replaceWith`` stage is an alias for the ``$replaceRoot`` stage. - - * - :ref:`ReplaceWith() ` - - - Replaces a document with the specified embedded document. - The operation replaces all existing fields in the input document, including - the ``_id`` field. Specify a document embedded in the input document to promote - the embedded document to the top level. - - The ``$replaceWith`` stage is an alias for the ``$replaceRoot`` stage. - - * - :ref:`Sample() ` - - - Randomly selects the specified number of documents from its - input. - - * - :ref:`Search() ` - - - Performs a full-text search of the field or fields in an - :atlas:`Atlas ` - collection. - - This stage is available only for MongoDB Atlas clusters, and is not - available for self-managed deployments. To learn more, see - :atlas:`Atlas Search Aggregation Pipeline Stages - ` in the Atlas documentation. - - * - :ref:`SearchMeta() ` - - - Returns different types of metadata result documents for the - :atlas:`Atlas Search ` query against an - :atlas:`Atlas ` - collection. 
- - This stage is available only for MongoDB Atlas clusters, - and is not available for self-managed deployments. To learn - more, see :atlas:`Atlas Search Aggregation Pipeline Stages - ` in the Atlas documentation. - - * - :ref:`Set() ` - - - Adds new fields to documents. Like the ``Project()`` method, - this method reshapes each - document in the stream by adding new fields to - output documents that contain both the existing fields - from the input documents and the newly added fields. - - * - :ref:`SetWindowFields() ` - - - Groups documents into windows and applies one or more - operators to the documents in each window. - - .. versionadded:: 5.0 - - * - :ref:`Skip() ` - - - Skips the first *n* documents, where *n* is the specified skip - number, and passes the remaining documents unmodified to the - pipeline. For each input document, outputs either zero - documents (for the first *n* documents) or one document (if - after the first *n* documents). - - * - :ref:`Sort() ` - - - Reorders the document stream by a specified sort key. The documents remain unmodified. - For each input document, outputs one document. - - * - :ref:`SortByCount() ` - - - Groups incoming documents based on the value of a specified - expression, then computes the count of documents in each - distinct group. - - * - :ref:`UnionWith() ` - - - Combines pipeline results from two collections into a single - result set. - - * - :ref:`Unwind() ` - - - Deconstructs an array field from the input documents to - output a document for *each* element. Each output document - replaces the array with an element value. For each input - document, outputs *n* Documents, where *n* is the number of - array elements. *n* can be zero for an empty array. - - * - :ref:`VectorSearch() ` - - - Performs an :abbr:`ANN (Approximate Nearest Neighbor)` or - :abbr:`ENN (Exact Nearest Neighbor)` search on a - vector in the specified field of an - :atlas:`Atlas ` collection. - - ``$vectorSearch`` is available only for MongoDB Atlas clusters - running MongoDB v6.0.11 or higher, and is not available for - self-managed deployments. - - .. versionadded:: 7.0.2 - -Additional Information ----------------------- - -MongoDB Server Manual -~~~~~~~~~~~~~~~~~~~~~ - -To view a full list of expression operators, see -:manual:`Aggregation Operators `. - -To learn more about assembling an aggregation pipeline and view examples, see -:manual:`Aggregation Pipeline `. - -To learn more about creating pipeline stages, see -:manual:`Aggregation Stages `. - -To learn about explaining MongoDB aggregation operations, see -:manual:`Explain Results ` and -:manual:`Query Plans `. - -API Documentation -~~~~~~~~~~~~~~~~~ - -For more information about the aggregation operations discussed in this guide, see the -following API documentation: - -- `Aggregate() <{+new-api-root+}/MongoDB.Driver/MongoDB.Driver.IMongoCollection-1.Aggregate.html>`__ -- `AggregateOptions <{+new-api-root+}/MongoDB.Driver/MongoDB.Driver.AggregateOptions.html>`__ -- `Group() <{+new-api-root+}/MongoDB.Driver/MongoDB.Driver.PipelineStageDefinitionBuilder.Group.html>`__ -- `Match() <{+new-api-root+}/MongoDB.Driver/MongoDB.Driver.PipelineStageDefinitionBuilder.Match.html>`__ -- `Where() <{+new-api-root+}/MongoDB.Driver/MongoDB.Driver.Linq.MongoQueryable.Where.html>`__ -- `GroupBy() <{+new-api-root+}/MongoDB.Driver/MongoDB.Driver.Linq.MongoQueryable.GroupBy.html>`__ -- `Select() <{+new-api-root+}/MongoDB.Driver/MongoDB.Driver.Linq.MongoQueryable.Select.html>`__ - -.. 
TODO: integrate into existing page - -Sample Class ------------- - -The code examples in this guide demonstrate how you can use builders to -create types to interact with documents in the sample collection ``plants.flowers``. -Documents in this collection are modeled by the following ``Flower`` class: - -.. literalinclude:: /includes/fundamentals/code-examples/builders.cs - :language: csharp - :dedent: - :start-after: start-model - :end-before: end-model - -Each builder class takes a generic type parameter -``TDocument`` which represents the type of document that you are working -with. In this guide, the ``Flower`` class is the document type used in -each builder class example. - -.. _csharp-builders-aggregation: +Link to Stages page +Link to Operators page Build an Aggregation Pipeline ----------------------------- -The ``PipelineDefinitionBuilder`` class provides a type-safe interface for -defining an **aggregation pipeline**. An aggregation pipeline is a series of -stages that are used to transform a document. Suppose you want to create a -pipeline that performs the following operations: - -- Matches all documents with "spring" in the ``Season`` field -- Sorts the results by the ``Category`` field -- Groups the documents by category and shows the average price and total - available for all documents in that category - -Use ``PipelineDefinitionBuilder`` classes to build the pipeline: - -.. code-block:: csharp - - var sortBuilder = Builders.Sort.Ascending(f => f.Category); - var matchFilter = Builders.Filter.AnyEq(f => f.Season, "spring"); - - var pipeline = new EmptyPipelineDefinition() - .Match(matchFilter) - .Sort(sortBuilder) - .Group(f => f.Category, - g => new - { - name = g.Key, - avgPrice = g.Average(f => f.Price), - totalAvailable = g.Sum(f => f.Stock) - } - ); - -The preceding example creates the following pipeline: - -.. code-block:: json - - [{ "$match" : { "season" : "spring" } }, { "$sort" : { "category" : 1 } }, { "$group" : { "_id" : "$category", "avgPrice" : { "$avg" : "$price" }, "totalAvailable" : { "$sum" : "$stock" } } }] +The following sections describe the different ways to build an aggregation +pipeline by using the {+driver-long+}. -You can add stages to your pipeline that don't have corresponding type-safe -methods in the ``PipelineDefinitionBuilder`` interface by providing your query -as a ``BsonDocument`` to the `AppendStage() method -<{+new-api-root+}/MongoDB.Driver/MongoDB.Driver.PipelineDefinitionBuilder.AppendStage.html>`__. - -.. code-block:: csharp - - var pipeline = new EmptyPipelineDefinition().AppendStage("{ $set: { field1: '$field2' } }"); - -.. note:: - - When using a ``BsonDocument`` to define your pipeline stage, the driver does - not take into account any ``BsonClassMap``, serialization attributes or - serialization conventions. The field names used in the ``BsonDocument`` must - match those stored on the server. - - For more information on providing a query as a ``BsonDocument``, see our - :ref:`FAQ page `. - -To learn more about the Aggregation Pipeline, see the -:manual:`Aggregation Pipeline ` server manual page. - -.. _csharp-builders-out: - -Write Pipeline Results to a Collection -~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ - -You can write the documents returned from an aggregation pipeline to a -collection by creating an ``$out`` stage at the end of your aggregation -pipeline. To create an ``$out`` stage, call the ``Out()`` method on a -``PipelineStageDefinitionBuilder``. 
The ``Out()`` method requires the name of -the collection you want to write the documents to. - -The following example builds an aggregation pipeline that matches all documents -with a ``season`` field value of ``"Spring"`` and outputs them to -a ``springFlowers`` collection: - -.. code-block:: csharp - - var outputCollection = database.GetCollection("springFlowers"); - var matchFilter = Builders.Filter.AnyEq(f => f.Season, "spring"); - - // Creates an aggregation pipeline and outputs resulting documents to a new collection. - var pipeline = new EmptyPipelineDefinition() - .Match(matchFilter) - .Out(outputCollection); +Builders +~~~~~~~~ -You can write the results of an aggregation pipeline to a time series collection -by specifying a ``TimeSeriesOption`` object and passing it as the second -parameter to the ``Out()`` method. +You can create an aggregation pipeline in the following ways: -Imagine that the documents in the ``plants.flowers`` collection contain a ``datePlanted`` field that -holds BSON date values. You can store the documents in this collection in a time -series collection by using the ``datePlanted`` field as the time field. +- Create an ``EmptyPipelineDefinition`` object and chain calls to the relevant + aggregation methods. Then, pass the pipeline object to the ``IMongoCollection.Aggregate()`` + method. as shown in the following example: -The following example creates a ``TimeSeriesOptions`` object and specifies -``datePlanted`` as the ``timeField``. It then builds an aggregation pipeline that matches all documents -with a ``season`` field value of ``"Spring"`` and outputs them to a -time series collection called ``springFlowerTimes``. +- Chain aggregation methods directly from the call to the + ``IMongoCollection.Aggregate()`` method. -.. code-block:: csharp +Select the :guilabel:`EmptyPipelineDefinition` +or :guilabel:`Aggregate` tab to see the corresponding code: - var timeSeriesOptions = new TimeSeriesOptions("datePlanted"); - var collectionName = "springFlowerTimes" - var matchFilter = Builders.Filter.AnyEq(f => f.Season, "spring"); +.. tabs:: + + .. tab:: EmptyPipelineDefinition + :tabid: empty-pipeline-definition - // Creates an aggregation pipeline and outputs resulting documents to a time series collection. - var pipeline = new EmptyPipelineDefinition() - .Match(matchFilter) - .Out(collectionName, timeSeriesOptions); + .. code-block:: csharp -To learn more about time series collections, see :ref:`csharp-time-series`. + // Defines the aggregation pipeline + var pipeline = new EmptyPipelineDefinition() + .Match(...) + .Group(...) + .Merge(...); + // Executes the aggregation pipeline + var results = collection.Aggregate(pipeline).ToList(); -- `PipelineDefinitionBuilder <{+new-api-root+}/MongoDB.Driver/MongoDB.Driver.PipelineDefinitionBuilder.html>`__ + .. tab:: Aggregate + :tabid: aggregate -.. TODO: integrate into existing page + .. code-block:: csharp -.. _csharp-linq: + // Defines and executes the aggregation pipeline + var pipeline = collection.Aggregate() + .Match(...) + .Group(...) + .Merge(...); -==== LINQ -==== - -.. facet:: - :name: genre - :values: reference - -.. meta:: - :keywords: code example, query, aggregation +~~~~ +You can use LINQ to create an :ref:`aggregation pipeline `. +The {+driver-short+} automatically translates each LINQ statement into the corresponding +aggregation pipeline stages. In this section you can learn which +aggregation pipeline stages are supported. -Overview --------- - -.. 
include:: /includes/linq-vs-builders.rst +To learn more about the aggregation pipeline stages, see the +:ref:`aggregation-pipeline-operator-reference` page in the server manual. In this guide you can learn how to use `LINQ `__ @@ -631,39 +170,6 @@ The {+driver-short+} automatically translates LINQ queries into LINQ3 is the only LINQ provider available in the {+driver-long+}. If you have manually configured your project to use LINQ2, it will not compile. -The examples in this guide use the ``restaurants`` collection -in the ``sample_restaurants`` database provided in the :atlas:`Atlas sample datasets `. -To learn how to create a free MongoDB Atlas cluster and load the sample datasets, -see the :ref:``. - -The following ``Restaurant``, ``Address`` and ``GradeEntry`` classes model the -documents in this collection: - -.. literalinclude:: /includes/fundamentals/code-examples/linq.cs - :language: csharp - :dedent: - :start-after: start-restaurant-model - :end-before: end-restaurant-model - -.. literalinclude:: /includes/fundamentals/code-examples/linq.cs - :language: csharp - :dedent: - :start-after: start-address-model - :end-before: end-address-model - -.. literalinclude:: /includes/fundamentals/code-examples/linq.cs - :language: csharp - :dedent: - :start-after: start-grade-model - :end-before: end-grade-model - -.. include:: /includes/convention-pack-note.rst - -.. _csharp-linq-queryable: - -Make A Collection Queryable ---------------------------- - To use LINQ to query your collection, you must first create an an `IQueryable `__ @@ -733,749 +239,120 @@ You can print the results of the preceding example as follows: var results = query.ToCursor(); +View Translated Queries +----------------------- -Supported Aggregation Stages ----------------------------- +When you run a LINQ query, the {+driver-short+} automatically translates your +query into an aggregation pipeline written with the {+query-api+}. You can view +the translated query by using the ``ToString()`` method or the +``LoggedStages`` property. -You can use LINQ to create an :ref:`aggregation pipeline `. -The {+driver-short+} automatically translates each LINQ statement into the corresponding -aggregation pipeline stages. In this section you can learn which -aggregation pipeline stages are supported. +To see the translated query for **non-scalar operations**, use the ``ToString()`` +method. Non-scalar operations are operations that return a query object, such +as: -To learn more about the aggregation pipeline stages, see the -:ref:`aggregation-pipeline-operator-reference` page in the server manual. +- ``Where`` +- ``Select`` +- ``SelectMany`` +- ``GroupJoin`` -$project -~~~~~~~~ +The following example calls the ``ToString()`` method on a LINQ query and prints +the translated query: -The ``$project`` aggregation stage returns a document containing only the specified -fields. +.. io-code-block:: -Select the :guilabel:`Method Syntax` or :guilabel:`Query Syntax` tab to see how -to generate a ``$project`` stage using LINQ: + .. input:: + :language: csharp -.. tabs:: + var queryableCollection = _restaurantsCollection.AsQueryable(); + var query = queryableCollection + .Where(r => r.Name == "The Movable Feast"); - .. tab:: Method Syntax - :tabid: method-syntax + var queryTranslated = query.ToString(); + Console.WriteLine(queryTranslated); - .. code-block:: csharp - :emphasize-lines: 2 + .. 
output:: - var query = queryableCollection - .Select(r => new { r.Name, r.Address }); + sample_restaurants.restaurants.Aggregate([{ "$match" : { "name" : "The Movable Feast" } }]) - .. tab:: Query Syntax - :tabid: query-syntax +To get the translated query for **scalar operations** use the ``LoggedStages`` +property. Scalar operations are operations that return a scalar result rather than a +query object, such as: - .. code-block:: csharp - :emphasize-lines: 2 +- ``First`` +- ``Sum`` +- ``Count`` +- ``Min`` +- ``Max`` - var query = from r in queryableCollection - select new { r.Name, r.Address }; - -The result of the preceding example contains the following document: - -.. code-block:: json - - { "name" : "The Movable Feast", "address" : { "building" : "284", "coord" : [-73.982923900000003, 40.6580753], "street" : "Prospect Park West", "zipcode" : "11215" } } - -.. note:: Excluding the ``_id`` Field - - If you don't include the ``_id`` field in your LINQ projection, the {+driver-short+} - automatically excludes it from the results. - -$match -~~~~~~ - -The ``$match`` aggregation stage returns the documents that match a specified -criteria. - -Select the :guilabel:`Method Syntax` or :guilabel:`Query Syntax` tab to see how -to generate a ``$match`` stage using LINQ: - -.. tabs:: - - .. tab:: Method Syntax - :tabid: method-syntax - - .. code-block:: csharp - :emphasize-lines: 2 - - var query = queryableCollection - .Where(r => r.Name == "The Movable Feast"); - - .. tab:: Query Syntax - :tabid: query-syntax - - .. code-block:: csharp - :emphasize-lines: 2 - - var query = from r in queryableCollection - where r.Name == "The Movable Feast" - select r; - -The result of the preceding example contains the following document: - -.. code-block:: json - - // Results Truncated - - { "_id" : ObjectId(...), "name" : "The Movable Feast", "restaurant_id" : "40361606", "cuisine" : "American", "address" : {...}, "borough" : "Brooklyn", "grades" : [...] } - -$limit -~~~~~~ - -The ``$limit`` aggregation stage limits the number of documents returned by the -query. The following example shows how to generate a ``$limit`` stage using LINQ: - -.. code-block:: csharp - :emphasize-lines: 4 - - var query = queryableCollection - .Where(r => r.Cuisine == "Italian") - .Select(r => new {r.Name, r.Cuisine}) - .Take(5); - -The result of the preceding example contains the following documents: - -.. code-block:: json - - { "name" : "Philadelhia Grille Express", "cuisine" : "Italian" } - { "name" : "Isle Of Capri Resturant", "cuisine" : "Italian" } - { "name" : "Marchis Restaurant", "cuisine" : "Italian" } - { "name" : "Crystal Room", "cuisine" : "Italian" } - { "name" : "Forlinis Restaurant", "cuisine" : "Italian" } - -$sample -~~~~~~~ - -The ``$sample`` aggregation stage returns a random sample of documents from a -collection. The following example shows how to generate a ``$sample`` stage by using -LINQ: - -.. code-block:: csharp - :emphasize-lines: 4 - - var query = queryableCollection - .Aggregate() - .Sample(4) - .ToList(); - -The result of the preceding example contains the following documents: - -.. 
code-block:: json - - // Results Truncated - - { "name" : "Von Dolhens", "cuisine" : "Ice Cream, Gelato, Yogurt, Ices" } - { "name" : "New York Mercantile Exchange", "cuisine" : "American" } - { "name" : "Michaelangelo's Restaurant", "cuisine" : "Italian" } - { "name" : "Charlie Palmer Steak", "cuisine" : "American" } - -$skip -~~~~~ - -The ``$skip`` aggregation stage skips over a specified number of documents returned -by a query, then returns the rest of the results. The following example shows how to generate -a ``$skip`` stage using LINQ: - -.. code-block:: csharp - :emphasize-lines: 4 - - var query = queryableCollection - .Where(r => r.Cuisine == "Italian") - .Select(r => new {r.Name, r.Cuisine}) - .Skip(2); - -The preceding example skips the first two restaurants that match the criteria, and -returns the rest. The result contains the following documents: - -.. code-block:: json - - // Results Truncated - - { "name" : "Marchis Restaurant", "cuisine" : "Italian" } - { "name" : "Crystal Room", "cuisine" : "Italian" } - { "name" : "Forlinis Restaurant", "cuisine" : "Italian" } - ... - -$unwind -~~~~~~~ - -The ``$unwind`` aggregation stage deconstructs a specified array field and returns -a document for each element in that array. - -Select the :guilabel:`Method Syntax` or :guilabel:`Query Syntax` tab to see how -to generate an ``$unwind`` stage using LINQ: - -.. tabs:: - - .. tab:: Method Syntax - :tabid: method-syntax - - .. code-block:: csharp - :emphasize-lines: 3 - - var query = queryableCollection - .Where(r => r.Name == "The Movable Feast") - .SelectMany(r => r.Grades); - - .. tab:: Query Syntax - :tabid: query-syntax - - .. code-block:: csharp - :emphasize-lines: 3 +To get a translated query with the ``LoggedStages`` property, you must save +the translated query directly after it is executed, and before executing any +other queries with the same queryable object. - var query = from r in queryableCollection - where r.Name == "The Movable Feast" - from grade in r.Grades - select grade; - -The query in the preceding example finds the document where the ``Name`` field -has the value "The Movable Feast." Then, for each element in this document's -``Grades`` array, the query returns a new document. The result contains the -following documents: - -.. code-block:: json - - { "date" : ISODate("2014-11-19T00:00:00Z"), "grade" : "A", "score" : 11 } - { "date" : ISODate("2013-11-14T00:00:00Z"), "grade" : "A", "score" : 2 } - { "date" : ISODate("2012-12-05T00:00:00Z"), "grade" : "A", "score" : 13 } - { "date" : ISODate("2012-05-17T00:00:00Z"), "grade" : "A", "score" : 11 } - -Nested Statements -+++++++++++++++++ - -You can chain or nest ``Select`` and ``SelectMany`` statements to unwind nested -arrays. Consider a collection that contains documents with a **new** schema. These -documents contain a ``restaurants`` field, which holds an array of documents -represented by the ``Restaurant`` class. The documents within the array each have -a ``grades`` field that holds an array of documents represented by -the ``Grade`` class. The following code is an example of a single document in -this collection: - -.. code-block:: json - - { - "_id": { "$oid": ... }, - "restaurants": [ - { - "_id": { ... } , - "address": { ... }, - "name": "Tov Kosher Kitchen", - "grades": [ - { - "date" : ISODate("2014-11-24T00:00:00Z"), - "grade" : "Z", - "score" : 20.0 - }, - { - "date" : ISODate("2013-01-17T00:00:00Z"), - "grade" : "A", - "score" : 13.0 - } - ] - ... - }, - { - "_id": { ... } , - "address": { ... 
}, - "name": "Harriet's Kitchen", - "grades": [ - { - "date" : ISODate("2014-04-19T00:00:00Z"), - "grade" : "B", - "score" : 12.0 - } - ], - ... - }, - ... - ] - } - -You can nest ``SelectMany`` statements within ``SelectMany`` or ``Select`` -statements. The following example nests a ``SelectMany`` statement within a -``Select`` statement to retrieve an array from each document in the collection. -Each array holds all grade objects from all restaurants in each document. +The following example uses the ``LoggedStages`` property on a LINQ query that +uses a scalar operation, then prints the translated query: .. io-code-block:: - :copyable: true - .. input:: /includes/fundamentals/code-examples/linq.cs + .. input:: :language: csharp - :start-after: start-nested-SelectMany - :end-before: end-nested-SelectMany - - .. output:: - :visible: false - :language: json - - // output for first document in collection - [ - { "date" : ISODate("2014-11-24T00:00:00Z"), - "grade" : "Z", - "score" : 20.0 - }, - { "date" : ISODate("2013-01-17T00:00:00Z"), - "grade" : "A", - "score" : 13.0 - }, - { - "date" : ISODate("2014-04-19T00:00:00Z"), - "grade" : "B", - "score" : 12.0 - }, - ... - ], - // output for second document in collection - [ - ... - ] - -$group -~~~~~~ - -The ``$group`` aggregation stage separates documents into groups according to -the criteria you specify. - -Select the :guilabel:`Method Syntax` or :guilabel:`Query Syntax` tab to see how -to generate an ``$group`` stage using LINQ: - -.. tabs:: - - .. tab:: Method Syntax - :tabid: method-syntax - - .. code-block:: csharp - :emphasize-lines: 2 - - var query = queryableCollection - .GroupBy(r => r.Cuisine) - .Select(g => new { Cuisine = g.Key, Count = g.Count() }); - - .. tab:: Query Syntax - :tabid: query-syntax - - .. code-block:: csharp - :emphasize-lines: 2 - - var query = from r in queryableCollection - group r by r.Cuisine into g - select new {Cuisine = g.Key, Count = g.Count()}; - -The preceding example groups each document by the value in its ``Cuisine`` field, -then counts how many documents have each ``Cuisine`` value. The result contains -the following documents: - -.. code-block:: json - - // Results Truncated - - { "cuisine" : "Caribbean", "count" : 657 } - { "cuisine" : "Café/Coffee/Tea", "count" : 1214 } - { "cuisine" : "Iranian", "count" : 2 } - { "cuisine" : "Nuts/Confectionary", "count" : 6 } - { "cuisine" : "Middle Eastern", "count" : 168 } - ... - -.. note:: Result Order - - The preceding queries don't always return results in the same order. Running - this example may return the results in a different order than shown above. - -$sort -~~~~~ - -The ``$sort`` aggregation stage returns the results of your query in the order -that you specify. - -Select the :guilabel:`Method Syntax` or :guilabel:`Query Syntax` tab to see how -to generate an ``$sort`` stage using LINQ: - -.. tabs:: - - .. tab:: Method Syntax - :tabid: method-syntax - - .. code-block:: csharp - :emphasize-lines: 2 - - var query = queryableCollection - .OrderBy(r => r.Name) - .ThenByDescending(r => r.RestaurantId); - - .. tab:: Query Syntax - :tabid: query-syntax - - .. code-block:: csharp - :emphasize-lines: 2 - - var query = from r in queryableCollection - orderby r.Name, r.RestaurantId descending - select r; - -The preceding example returns the query results sorted alphabetically by the -``Name`` field, with a secondary descending sort on the ``RestaurantId`` field. -The following is a subset of the documents contained in the returned results: - -.. 
code-block:: json - - // Results Truncated - - ... - { "_id" : ObjectId(...), "name" : "Aba Turkish Restaurant", "restaurant_id" : "41548686", "cuisine" : "Turkish", "address" : {...}, "borough" : "Manhattan", "grades" : [...] } - { "_id" : ObjectId(...), "name" : "Abace Sushi", "restaurant_id" : "50006214", "cuisine" : "Japanese", "address" : { ... }, "borough" : "Manhattan", "grades" : [...] } - { "_id" : ObjectId(...), "name" : "Abacky Potluck", "restaurant_id" : "50011222", "cuisine" : "Asian", "address" : { ... }, "borough" : "Manhattan", "grades" : [...] } - { "_id" : ObjectId(...), "name" : "Abaleh", "restaurant_id" : "50009096", "cuisine" : "Mediterranean", "address" : { ... }, "borough" : "Manhattan", "grades" : [...] } - ... - -$lookup -~~~~~~~ - -The ``$lookup`` aggregation stage joins documents from one collection to documents -from another collection in the same database. The ``$lookup`` stage adds a new -array field to each input document. The new array field contains the matching -documents from the "joined" collection. - -.. note:: - - To perform a lookup, you must make both collections queryable by using the - ``AsQueryable()`` method. - - To learn how to make a collection queryable, see :ref:`csharp-linq-queryable`. - -Consider a second collection in the ``sample_restaurants`` database called -``reviews`` that has restaurant reviews. You can join documents from that collection -to documents with the same ``name`` value in the ``restaurants`` collection using -the ``$lookup`` stage. - -The following ``Review`` class models the documents in the ``reviews`` collection: - -.. literalinclude:: /includes/fundamentals/code-examples/linq.cs - :language: csharp - :dedent: - :start-after: start-review-model - :end-before: end-review-model + :emphasize-lines: 6 -Select the :guilabel:`Method Syntax` or :guilabel:`Query Syntax` tab to see how -to generate a ``$lookup`` stage by using LINQ: -.. tabs:: + var queryableCollection = _restaurantsCollection.AsQueryable(); + var query = queryableCollection + .Where(r => r.Name == "The Movable Feast"); - .. tab:: Method Syntax - :tabid: method-syntax + var result = query.FirstOrDefault(); + var queryTranslated = query.LoggedStages; - .. code-block:: csharp + Console.WriteLine(queryTranslated.ToJson()); - var query = queryableCollection - .GroupJoin(reviewCollection, - restaurant => restaurant.Name, - review => review.RestaurantName, - (restaurant, reviews) => - new { Restaurant = restaurant, Reviews = reviews } - ); + .. output:: - .. tab:: Query Syntax - :tabid: query-syntax + [{ "$match" : { "name" : "The Movable Feast" } }, { "$limit" : NumberLong(1) }] - .. code-block:: csharp +.. important:: - var query = from restaurant in queryableCollection - join rv in reviewCollection on restaurant.Name equals rv.RestaurantName into reviews - select new { restaurant, reviews }; - -The preceding example returns all documents from the ``restaurants`` collection. Each -restaurant document has an added field called ``reviews``, which contains all -reviews for that restaurant. A review matches a restaurant if the value of the -``name`` field in the review document matches the ``name`` field of the restaurant -document. - -The following shows a subset of the returned results: - -.. code-block:: json - - // Results Truncated - - { - "restaurant": { - "_id": ObjectId("..."), - "name": "The Movable Feast", - "restaurant_id": "40361606", - "cuisine": "American", - "address": { ... }, - "borough": "Brooklyn", - "grades": [ ... 
] - }, - "reviews": [ - { - "_id": ObjectId("..."), - "restaurant_name": "The Movable Feast", - "reviewer": "Lazlo Cravensworth", - "review_text": "Great restaurant! 12/10 stars!" - }, - { - "_id": ObjectId("..."), - "restaurant_name": "The Movable Feast", - "reviewer": "Michael Scarn", - "review_text": "It really was a feast" - } - ] - } - -$vectorSearch -~~~~~~~~~~~~~ - -The ``$vectorSearch`` aggregation stage performs an *approximate nearest neighbor* search -on a vector in the specified field. Your collection *must* have a -defined Atlas Vector Search index before you can perform a vector search on your data. - -.. tip:: - - To obtain the sample dataset used in the following example, see :ref:`csharp-get-started`. - To create the sample Atlas Vector Search index used in the following example, see - :atlas:`Create an Atlas Vector Search Index ` in the - Atlas manual. - -Consider the ``embedded_movies`` collection in the ``sample_mflix`` database. You -can use a ``$vectorSearch`` stage to perform a semantic search on the ``plot_embedding`` -field of the documents in the collection. - -The following ``EmbeddedMovie`` class models the documents in the ``embedded_movies`` -collection: + ``LoggedStages`` is not thread-safe. Executing a query and accessing the + associated ``LoggedStages`` property from multiple threads might have + non-deterministic results. -.. code-block:: csharp +Additional Information +---------------------- - [BsonIgnoreExtraElements] - public class EmbeddedMovie - { - [BsonIgnoreIfDefault] - public string Title { get; set; } +MongoDB Server Manual +~~~~~~~~~~~~~~~~~~~~~ - public string Plot { get; set; } +To view a full list of expression operators, see +:manual:`Aggregation Operators `. - [BsonElement("plot_embedding")] - public double[] Embedding { get; set; } - } +To learn more about assembling an aggregation pipeline and view examples, see +:manual:`Aggregation Pipeline `. -The following example shows how to generate a ``$vectorSearch`` stage to search -the ``plot_embedding`` field using vector embeddings for the string ``"time travel"``: +To learn more about creating pipeline stages, see +:manual:`Aggregation Stages `. -.. code-block:: csharp +To learn about explaining MongoDB aggregation operations, see +:manual:`Explain Results ` and +:manual:`Query Plans `. 
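As a rough illustration of the explain workflow referenced above, the following sketch shows one possible way to ask the server to explain an aggregation from C#. It assumes a local deployment, the ``sample_restaurants.restaurants`` namespace, and a placeholder ``$match``/``$group`` pipeline; it builds the ``explain`` command by hand as a ``BsonDocument`` rather than relying on any particular driver helper, so adjust the names and pipeline contents to match your own data:

.. code-block:: csharp

   using System;
   using MongoDB.Bson;
   using MongoDB.Driver;

   // Placeholder connection string and namespace -- replace with your own values.
   var client = new MongoClient("mongodb://localhost:27017");
   var database = client.GetDatabase("sample_restaurants");

   // Builds the server's explain command around a simple $match/$group pipeline.
   // The pipeline is written as raw BsonDocuments so the command mirrors what
   // the server receives.
   var explainCommand = new BsonDocument
   {
       {
           "explain", new BsonDocument
           {
               { "aggregate", "restaurants" },
               { "pipeline", new BsonArray
                   {
                       new BsonDocument("$match", new BsonDocument("cuisine", "Bakery")),
                       new BsonDocument("$group", new BsonDocument
                       {
                           { "_id", "$borough" },
                           { "count", new BsonDocument("$sum", 1) }
                       })
                   }
               },
               { "cursor", new BsonDocument() }
           }
       },
       { "verbosity", "queryPlanner" }
   };

   // Runs the command and prints the query planner information.
   var explainResult = database.RunCommand<BsonDocument>(
       new BsonDocumentCommand<BsonDocument>(explainCommand));
   Console.WriteLine(explainResult.ToJson());

Changing ``verbosity`` to ``executionStats`` or ``allPlansExecution`` returns progressively more detail, as described in the :manual:`Explain Results ` page referenced above.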
- // Defines vector embeddings for the string "time travel" - var vector = new[] {-0.0016261312,-0.028070757,-0.011342932,-0.012775794,-0.0027440966,0.008683807,-0.02575152,-0.02020668,-0.010283281,-0.0041719596,0.021392956,0.028657231,-0.006634482,0.007490867,0.018593878,0.0038187427,0.029590257,-0.01451522,0.016061379,0.00008528442,-0.008943722,0.01627464,0.024311995,-0.025911469,0.00022596726,-0.008863748,0.008823762,-0.034921836,0.007910728,-0.01515501,0.035801545,-0.0035688248,-0.020299982,-0.03145631,-0.032256044,-0.028763862,-0.0071576433,-0.012769129,0.012322609,-0.006621153,0.010583182,0.024085402,-0.001623632,0.007864078,-0.021406285,0.002554159,0.012229307,-0.011762793,0.0051682983,0.0048484034,0.018087378,0.024325324,-0.037694257,-0.026537929,-0.008803768,-0.017767483,-0.012642504,-0.0062712682,0.0009771782,-0.010409906,0.017754154,-0.004671795,-0.030469967,0.008477209,-0.005218282,-0.0058480743,-0.020153364,-0.0032805866,0.004248601,0.0051449724,0.006791097,0.007650814,0.003458861,-0.0031223053,-0.01932697,-0.033615597,0.00745088,0.006321252,-0.0038154104,0.014555207,0.027697546,-0.02828402,0.0066711367,0.0077107945,0.01794076,0.011349596,-0.0052715978,0.014755142,-0.019753495,-0.011156326,0.011202978,0.022126047,0.00846388,0.030549942,-0.0041386373,0.018847128,-0.00033655585,0.024925126,-0.003555496,-0.019300312,0.010749794,0.0075308536,-0.018287312,-0.016567878,-0.012869096,-0.015528221,0.0078107617,-0.011156326,0.013522214,-0.020646535,-0.01211601,0.055928253,0.011596181,-0.017247654,0.0005939711,-0.026977783,-0.003942035,-0.009583511,-0.0055248477,-0.028737204,0.023179034,0.003995351,0.0219661,-0.008470545,0.023392297,0.010469886,-0.015874773,0.007890735,-0.009690142,-0.00024970944,0.012775794,0.0114762215,0.013422247,0.010429899,-0.03686786,-0.006717788,-0.027484283,0.011556195,-0.036068123,-0.013915418,-0.0016327957,0.0151016945,-0.020473259,0.004671795,-0.012555866,0.0209531,0.01982014,0.024485271,0.0105431955,-0.005178295,0.033162415,-0.013795458,0.007150979,0.010243294,0.005644808,0.017260984,-0.0045618312,0.0024725192,0.004305249,-0.008197301,0.0014203656,0.0018460588,0.005015015,-0.011142998,0.01439526,0.022965772,0.02552493,0.007757446,-0.0019726837,0.009503538,-0.032042783,0.008403899,-0.04609149,0.013808787,0.011749465,0.036388017,0.016314628,0.021939443,-0.0250051,-0.017354285,-0.012962398,0.00006107364,0.019113706,0.03081652,-0.018114036,-0.0084572155,0.009643491,-0.0034721901,0.0072642746,-0.0090636825,0.01642126,0.013428912,0.027724205,0.0071243206,-0.6858542,-0.031029783,-0.014595194,-0.011449563,0.017514233,0.01743426,0.009950057,0.0029706885,-0.015714826,-0.001806072,0.011856096,0.026444625,-0.0010663156,-0.006474535,0.0016161345,-0.020313311,0.0148351155,-0.0018393943,0.0057347785,0.018300641,-0.018647194,0.03345565,-0.008070676,0.0071443142,0.014301958,0.0044818576,0.003838736,-0.007350913,-0.024525259,-0.001142124,-0.018620536,0.017247654,0.007037683,0.010236629,0.06046009,0.0138887605,-0.012122675,0.037694257,0.0055081863,0.042492677,0.00021784494,-0.011656162,0.010276617,0.022325981,0.005984696,-0.009496873,0.013382261,-0.0010563189,0.0026507939,-0.041639622,0.008637156,0.026471283,-0.008403899,0.024858482,-0.00066686375,-0.0016252982,0.027590916,0.0051449724,0.0058647357,-0.008743787,-0.014968405,0.027724205,-0.011596181,0.0047650975,-0.015381602,0.0043718936,0.002159289,0.035908177,-0.008243952,-0.030443309,0.027564257,0.042625964,-0.0033688906,0.01843393,0.019087048,0.024578573,0.03268257,-0.015608194,-0.014128681,-0.0033538956,-0.0028757197,-0.0041
21976,-0.032389335,0.0034322033,0.058807302,0.010943064,-0.030523283,0.008903735,0.017500903,0.00871713,-0.0029406983,0.013995391,-0.03132302,-0.019660193,-0.00770413,-0.0038853872,0.0015894766,-0.0015294964,-0.006251275,-0.021099718,-0.010256623,-0.008863748,0.028550599,0.02020668,-0.0012962399,-0.003415542,-0.0022509254,0.0119360695,0.027590916,-0.046971202,-0.0015194997,-0.022405956,0.0016677842,-0.00018535563,-0.015421589,-0.031802863,0.03814744,0.0065411795,0.016567878,-0.015621523,0.022899127,-0.011076353,0.02841731,-0.002679118,-0.002342562,0.015341615,0.01804739,-0.020566562,-0.012989056,-0.002990682,0.01643459,0.00042527664,0.008243952,-0.013715484,-0.004835075,-0.009803439,0.03129636,-0.021432944,0.0012087687,-0.015741484,-0.0052016205,0.00080890034,-0.01755422,0.004811749,-0.017967418,-0.026684547,-0.014128681,0.0041386373,-0.013742141,-0.010056688,-0.013268964,-0.0110630235,-0.028337335,0.015981404,-0.00997005,-0.02424535,-0.013968734,-0.028310679,-0.027750863,-0.020699851,0.02235264,0.001057985,0.00081639783,-0.0099367285,0.013522214,-0.012016043,-0.00086471526,0.013568865,0.0019376953,-0.019020405,0.017460918,-0.023045745,0.008503866,0.0064678704,-0.011509543,0.018727167,-0.003372223,-0.0028690554,-0.0027024434,-0.011902748,-0.012182655,-0.015714826,-0.0098634185,0.00593138,0.018753825,0.0010146659,0.013029044,0.0003521757,-0.017620865,0.04102649,0.00552818,0.024485271,-0.009630162,-0.015608194,0.0006718621,-0.0008418062,0.012395918,0.0057980907,0.016221326,0.010616505,0.004838407,-0.012402583,0.019900113,-0.0034521967,0.000247002,-0.03153628,0.0011038032,-0.020819811,0.016234655,-0.00330058,-0.0032289368,0.00078973995,-0.021952773,-0.022459272,0.03118973,0.03673457,-0.021472929,0.0072109587,-0.015075036,0.004855068,-0.0008151483,0.0069643734,0.010023367,-0.010276617,-0.023019087,0.0068244194,-0.0012520878,-0.0015086699,0.022046074,-0.034148756,-0.0022192693,0.002427534,-0.0027124402,0.0060346797,0.015461575,0.0137554705,0.009230294,-0.009583511,0.032629255,0.015994733,-0.019167023,-0.009203636,0.03393549,-0.017274313,-0.012042701,-0.0009930064,0.026777849,-0.013582194,-0.0027590916,-0.017594207,-0.026804507,-0.0014236979,-0.022032745,0.0091236625,-0.0042419364,-0.00858384,-0.0033905501,-0.020739838,0.016821127,0.022539245,0.015381602,0.015141681,0.028817179,-0.019726837,-0.0051283115,-0.011489551,-0.013208984,-0.0047017853,-0.0072309524,0.01767418,0.0025658219,-0.010323267,0.012609182,-0.028097415,0.026871152,-0.010276617,0.021912785,0.0022542577,0.005124979,-0.0019710176,0.004518512,-0.040360045,0.010969722,-0.0031539614,-0.020366628,-0.025778178,-0.0110030435,-0.016221326,0.0036587953,0.016207997,0.003007343,-0.0032555948,0.0044052163,-0.022046074,-0.0008822095,-0.009363583,0.028230704,-0.024538586,0.0029840174,0.0016044717,-0.014181997,0.031349678,-0.014381931,-0.027750863,0.02613806,0.0004136138,-0.005748107,-0.01868718,-0.0010138329,0.0054348772,0.010703143,-0.003682121,0.0030856507,-0.004275259,-0.010403241,0.021113047,-0.022685863,-0.023032416,0.031429652,0.001792743,-0.005644808,-0.011842767,-0.04078657,-0.0026874484,0.06915057,-0.00056939584,-0.013995391,0.010703143,-0.013728813,-0.022939114,-0.015261642,-0.022485929,0.016807798,0.007964044,0.0144219175,0.016821127,0.0076241563,0.005461535,-0.013248971,0.015301628,0.0085171955,-0.004318578,0.011136333,-0.0059047225,-0.010249958,-0.018207338,0.024645219,0.021752838,0.0007614159,-0.013648839,0.01111634,-0.010503208,-0.0038487327,-0.008203966,-0.00397869,0.0029740208,0.008530525,0.005261601,0.01642126,-0.0038753906,-0.0
13222313,0.026537929,0.024671877,-0.043505676,0.014195326,0.024778508,0.0056914594,-0.025951454,0.017620865,-0.0021359634,0.008643821,0.021299653,0.0041686273,-0.009017031,0.04044002,0.024378639,-0.027777521,-0.014208655,0.0028623908,0.042119466,0.005801423,-0.028124074,-0.03129636,0.022139376,-0.022179363,-0.04067994,0.013688826,0.013328944,0.0046184794,-0.02828402,-0.0063412455,-0.0046184794,-0.011756129,-0.010383247,-0.0018543894,-0.0018593877,-0.00052024535,0.004815081,0.014781799,0.018007403,0.01306903,-0.020433271,0.009043689,0.033189073,-0.006844413,-0.019766824,-0.018767154,0.00533491,-0.0024575242,0.018727167,0.0058080875,-0.013835444,0.0040719924,0.004881726,0.012029372,0.005664801,0.03193615,0.0058047553,0.002695779,0.009290274,0.02361889,0.017834127,0.0049017193,-0.0036388019,0.010776452,-0.019793482,0.0067777685,-0.014208655,-0.024911797,0.002385881,0.0034988478,0.020899786,-0.0025858153,-0.011849431,0.033189073,-0.021312982,0.024965113,-0.014635181,0.014048708,-0.0035921505,-0.003347231,0.030869836,-0.0017161017,-0.0061346465,0.009203636,-0.025165047,0.0068510775,0.021499587,0.013782129,-0.0024475274,-0.0051149824,-0.024445284,0.006167969,0.0068844,-0.00076183246,0.030150073,-0.0055948244,-0.011162991,-0.02057989,-0.009703471,-0.020646535,0.008004031,0.0066378145,-0.019900113,-0.012169327,-0.01439526,0.0044252095,-0.004018677,0.014621852,-0.025085073,-0.013715484,-0.017980747,0.0071043274,0.011456228,-0.01010334,-0.0035321703,-0.03801415,-0.012036037,-0.0028990454,-0.05419549,-0.024058744,-0.024272008,0.015221654,0.027964126,0.03182952,-0.015354944,0.004855068,0.011522872,0.004771762,0.0027874154,0.023405626,0.0004242353,-0.03132302,0.007057676,0.008763781,-0.0027057757,0.023005757,-0.0071176565,-0.005238275,0.029110415,-0.010989714,0.013728813,-0.009630162,-0.029137073,-0.0049317093,-0.0008630492,-0.015248313,0.0043219104,-0.0055681667,-0.013175662,0.029723546,0.025098402,0.012849103,-0.0009996708,0.03118973,-0.0021709518,0.0260181,-0.020526575,0.028097415,-0.016141351,0.010509873,-0.022965772,0.002865723,0.0020493253,0.0020509914,-0.0041419696,-0.00039695262,0.017287642,0.0038987163,0.014795128,-0.014661839,-0.008950386,0.004431874,-0.009383577,0.0012604183,-0.023019087,0.0029273694,-0.033135757,0.009176978,-0.011023037,-0.002102641,0.02663123,-0.03849399,-0.0044152127,0.0004527676,-0.0026924468,0.02828402,0.017727496,0.035135098,0.02728435,-0.005348239,-0.001467017,-0.019766824,0.014715155,0.011982721,0.0045651635,0.023458943,-0.0010046692,-0.0031373003,-0.0006972704,0.0019043729,-0.018967088,-0.024311995,0.0011546199,0.007977373,-0.004755101,-0.010016702,-0.02780418,-0.004688456,0.013022379,-0.005484861,0.0017227661,-0.015394931,-0.028763862,-0.026684547,0.0030589928,-0.018513903,0.028363993,0.0044818576,-0.009270281,0.038920518,-0.016008062,0.0093902415,0.004815081,-0.021059733,0.01451522,-0.0051583014,0.023765508,-0.017874114,-0.016821127,-0.012522544,-0.0028390652,0.0040886537,0.020259995,-0.031216389,-0.014115352,-0.009176978,0.010303274,0.020313311,0.0064112223,-0.02235264,-0.022872468,0.0052449396,0.0005723116,0.0037321046,0.016807798,-0.018527232,-0.009303603,0.0024858483,-0.0012662497,-0.007110992,0.011976057,-0.007790768,-0.042999174,-0.006727785,-0.011829439,0.007024354,0.005278262,-0.017740825,-0.0041519664,0.0085905045,0.027750863,-0.038387362,0.024391968,0.00087721116,0.010509873,-0.00038508154,-0.006857742,0.0183273,-0.0037054466,0.015461575,0.0017394272,-0.0017944091,0.014181997,-0.0052682655,0.009023695,0.00719763,-0.013522214,0.0034422,0.014941746,-0.00167
11164,-0.025298337,-0.017634194,0.0058714002,-0.005321581,0.017834127,0.0110630235,-0.03369557,0.029190388,-0.008943722,0.009363583,-0.0034222065,-0.026111402,-0.007037683,-0.006561173,0.02473852,-0.007084334,-0.010110005,-0.008577175,0.0030439978,-0.022712521,0.0054582027,-0.0012620845,-0.0011954397,-0.015741484,0.0129557345,-0.00042111133,0.00846388,0.008930393,0.016487904,0.010469886,-0.007917393,-0.011762793,-0.0214596,0.000917198,0.021672864,0.010269952,-0.007737452,-0.010243294,-0.0067244526,-0.015488233,-0.021552904,0.017127695,0.011109675,0.038067464,0.00871713,-0.0025591573,0.021312982,-0.006237946,0.034628596,-0.0045251767,0.008357248,0.020686522,0.0010696478,0.0076708077,0.03772091,-0.018700508,-0.0020676525,-0.008923728,-0.023298996,0.018233996,-0.010256623,0.0017860786,0.009796774,-0.00897038,-0.01269582,-0.018527232,0.009190307,-0.02372552,-0.042119466,0.008097334,-0.0066778013,-0.021046404,0.0019593548,0.011083017,-0.0016028056,0.012662497,-0.000059095124,0.0071043274,-0.014675168,0.024831824,-0.053582355,0.038387362,0.0005698124,0.015954746,0.021552904,0.031589597,-0.009230294,-0.0006147976,0.002625802,-0.011749465,-0.034362018,-0.0067844326,-0.018793812,0.011442899,-0.008743787,0.017474247,-0.021619547,0.01831397,-0.009037024,-0.0057247817,-0.02728435,0.010363255,0.034415334,-0.024032086,-0.0020126705,-0.0045518344,-0.019353628,-0.018340627,-0.03129636,-0.0034038792,-0.006321252,-0.0016161345,0.033642255,-0.000056075285,-0.005005019,0.004571828,-0.0024075406,-0.00010215386,0.0098634185,0.1980148,-0.003825407,-0.025191706,0.035161756,0.005358236,0.025111731,0.023485601,0.0023342315,-0.011882754,0.018287312,-0.0068910643,0.003912045,0.009243623,-0.001355387,-0.028603915,-0.012802451,-0.030150073,-0.014795128,-0.028630573,-0.0013487226,0.002667455,0.00985009,-0.0033972147,-0.021486258,0.009503538,-0.017847456,0.013062365,-0.014341944,0.005078328,0.025165047,-0.015594865,-0.025924796,-0.0018177348,0.010996379,-0.02993681,0.007324255,0.014475234,-0.028577257,0.005494857,0.00011725306,-0.013315615,0.015941417,0.009376912,0.0025158382,0.008743787,0.023832154,-0.008084005,-0.014195326,-0.008823762,0.0033455652,-0.032362677,-0.021552904,-0.0056081535,0.023298996,-0.025444955,0.0097301295,0.009736794,0.015274971,-0.0012937407,-0.018087378,-0.0039387033,0.008637156,-0.011189649,-0.00023846315,-0.011582852,0.0066411467,-0.018220667,0.0060846633,0.0376676,-0.002709108,0.0072776037,0.0034188742,-0.010249958,-0.0007747449,-0.00795738,-0.022192692,0.03910712,0.032122757,0.023898797,0.0076241563,-0.007397564,-0.003655463,0.011442899,-0.014115352,-0.00505167,-0.031163072,0.030336678,-0.006857742,-0.022259338,0.004048667,0.02072651,0.0030156737,-0.0042119464,0.00041861215,-0.005731446,0.011103011,0.013822115,0.021512916,0.009216965,-0.006537847,-0.027057758,-0.04054665,0.010403241,-0.0056281467,-0.005701456,-0.002709108,-0.00745088,-0.0024841821,0.009356919,-0.022659205,0.004061996,-0.013175662,0.017074378,-0.006141311,-0.014541878,0.02993681,-0.00028448965,-0.025271678,0.011689484,-0.014528549,0.004398552,-0.017274313,0.0045751603,0.012455898,0.004121976,-0.025458284,-0.006744446,0.011822774,-0.015035049,-0.03257594,0.014675168,-0.0039187097,0.019726837,-0.0047251107,0.0022825818,0.011829439,0.005391558,-0.016781142,-0.0058747325,0.010309938,-0.013049036,0.01186276,-0.0011246296,0.0062112883,0.0028190718,-0.021739509,0.009883412,-0.0073175905,-0.012715813,-0.017181009,-0.016607866,-0.042492677,-0.0014478565,-0.01794076,0.012302616,-0.015194997,-0.04433207,-0.020606548,0.009696807,0.010303274
,-0.01694109,-0.004018677,0.019353628,-0.001991011,0.000058938927,0.010536531,-0.17274313,0.010143327,0.014235313,-0.024152048,0.025684876,-0.0012504216,0.036601283,-0.003698782,0.0007310093,0.004165295,-0.0029157067,0.017101036,-0.046891227,-0.017460918,0.022965772,0.020233337,-0.024072073,0.017220996,0.009370248,0.0010363255,0.0194336,-0.019606877,0.01818068,-0.020819811,0.007410893,0.0019326969,0.017887443,0.006651143,0.00067394477,-0.011889419,-0.025058415,-0.008543854,0.021579562,0.0047484366,0.014062037,0.0075508473,-0.009510202,-0.009143656,0.0046817916,0.013982063,-0.0027990784,0.011782787,0.014541878,-0.015701497,-0.029350337,0.021979429,0.01332228,-0.026244693,-0.0123492675,-0.003895384,0.0071576433,-0.035454992,-0.00046984528,0.0033522295,0.039347045,0.0005119148,0.00476843,-0.012995721,0.0024042083,-0.006931051,-0.014461905,-0.0127558,0.0034555288,-0.0074842023,-0.030256703,-0.007057676,-0.00807734,0.007804097,-0.006957709,0.017181009,-0.034575284,-0.008603834,-0.005008351,-0.015834786,0.02943031,0.016861115,-0.0050849924,0.014235313,0.0051449724,0.0025924798,-0.0025741523,0.04289254,-0.002104307,0.012969063,-0.008310596,0.00423194,0.0074975314,0.0018810473,-0.014248641,-0.024725191,0.0151016945,-0.017527562,0.0018727167,0.0002830318,0.015168339,0.0144219175,-0.004048667,-0.004358565,0.011836103,-0.010343261,-0.005911387,0.0022825818,0.0073175905,0.00403867,0.013188991,0.03334902,0.006111321,0.008597169,0.030123414,-0.015474904,0.0017877447,-0.024551915,0.013155668,0.023525586,-0.0255116,0.017220996,0.004358565,-0.00934359,0.0099967085,0.011162991,0.03092315,-0.021046404,-0.015514892,0.0011946067,-0.01816735,0.010876419,-0.10124666,-0.03550831,0.0056348112,0.013942076,0.005951374,0.020419942,-0.006857742,-0.020873128,-0.021259667,0.0137554705,0.0057880944,-0.029163731,-0.018767154,-0.021392956,0.030896494,-0.005494857,-0.0027307675,-0.006801094,-0.014821786,0.021392956,-0.0018110704,-0.0018843795,-0.012362596,-0.0072176233,-0.017194338,-0.018713837,-0.024272008,0.03801415,0.00015880188,0.0044951867,-0.028630573,-0.0014070367,-0.00916365,-0.026537929,-0.009576847,-0.013995391,-0.0077107945,0.0050016865,0.00578143,-0.04467862,0.008363913,0.010136662,-0.0006268769,-0.006591163,0.015341615,-0.027377652,-0.00093136,0.029243704,-0.020886457,-0.01041657,-0.02424535,0.005291591,-0.02980352,-0.009190307,0.019460259,-0.0041286405,0.004801752,0.0011787785,-0.001257086,-0.011216307,-0.013395589,0.00088137644,-0.0051616337,0.03876057,-0.0033455652,0.00075850025,-0.006951045,-0.0062112883,0.018140694,-0.006351242,-0.008263946,0.018154023,-0.012189319,0.0075508473,-0.044358727,-0.0040153447,0.0093302615,-0.010636497,0.032789204,-0.005264933,-0.014235313,-0.018393943,0.007297597,-0.016114693,0.015021721,0.020033404,0.0137688,0.0011046362,0.010616505,-0.0039453674,0.012109346,0.021099718,-0.0072842683,-0.019153694,-0.003768759,0.039320387,-0.006747778,-0.0016852784,0.018154023,0.0010963057,-0.015035049,-0.021033075,-0.04345236,0.017287642,0.016341286,-0.008610498,0.00236922,0.009290274,0.028950468,-0.014475234,-0.0035654926,0.015434918,-0.03372223,0.004501851,-0.012929076,-0.008483873,-0.0044685286,-0.0102233,0.01615468,0.0022792495,0.010876419,-0.0059647025,0.01895376,-0.0069976957,-0.0042952523,0.017207667,-0.00036133936,0.0085905045,0.008084005,0.03129636,-0.016994404,-0.014915089,0.020100048,-0.012009379,-0.006684466,0.01306903,0.00015765642,-0.00530492,0.0005277429,0.015421589,0.015528221,0.032202728,-0.003485519,-0.0014286962,0.033908837,0.001367883,0.010509873,0.025271678,-0.020993087,0.0
19846799,0.006897729,-0.010216636,-0.00725761,0.01818068,-0.028443968,-0.011242964,-0.014435247,-0.013688826,0.006101324,-0.0022509254,0.013848773,-0.0019077052,0.017181009,0.03422873,0.005324913,-0.0035188415,0.014128681,-0.004898387,0.005038341,0.0012320944,-0.005561502,-0.017847456,0.0008538855,-0.0047884234,0.011849431,0.015421589,-0.013942076,0.0029790192,-0.013702155,0.0001199605,-0.024431955,0.019926772,0.022179363,-0.016487904,-0.03964028,0.0050849924,0.017487574,0.022792496,0.0012504216,0.004048667,-0.00997005,0.0076041627,-0.014328616,-0.020259995,0.0005598157,-0.010469886,0.0016852784,0.01716768,-0.008990373,-0.001987679,0.026417969,0.023792166,0.0046917885,-0.0071909656,-0.00032051947,-0.023259008,-0.009170313,0.02071318,-0.03156294,-0.030869836,-0.006324584,0.013795458,-0.00047151142,0.016874444,0.00947688,0.00985009,-0.029883493,0.024205362,-0.013522214,-0.015075036,-0.030603256,0.029270362,0.010503208,0.021539574,0.01743426,-0.023898797,0.022019416,-0.0068777353,0.027857494,-0.021259667,0.0025758184,0.006197959,0.006447877,-0.00025200035,-0.004941706,-0.021246338,-0.005504854,-0.008390571,-0.0097301295,0.027244363,-0.04446536,0.05216949,0.010243294,-0.016008062,0.0122493,-0.0199401,0.009077012,0.019753495,0.006431216,-0.037960835,-0.027377652,0.016381273,-0.0038620618,0.022512587,-0.010996379,-0.0015211658,-0.0102233,0.007071005,0.008230623,-0.009490209,-0.010083347,0.024431955,0.002427534,0.02828402,0.0035721571,-0.022192692,-0.011882754,0.010056688,0.0011904413,-0.01426197,-0.017500903,-0.00010985966,0.005591492,-0.0077707744,-0.012049366,0.011869425,0.00858384,-0.024698535,-0.030283362,0.020140035,0.011949399,-0.013968734,0.042732596,-0.011649498,-0.011982721,-0.016967745,-0.0060913274,-0.007130985,-0.013109017,-0.009710136}; - - // Specifies that the vector search will consider the 150 nearest neighbors - // in the specified index - var options = new VectorSearchOptions() - { - IndexName = "vector_index", - NumberOfCandidates = 150 - }; - - // Builds aggregation pipeline and specifies that the $vectorSearch stage - // returns 10 results - var results = queryableCollection - .VectorSearch(m => m.Embedding, vector, 10, options) - .Select(m => new { m.Title, m.Plot }); - -The results of the preceding example contain the following documents: - -.. code-block:: json - - { "_id" : ObjectId("573a13a0f29313caabd04a4f"), "plot" : "A reporter, learning of time travelers visiting 20th century disasters, tries to change the history they know by averting upcoming disasters.", "title" : "Thrill Seekers" } - { "_id" : ObjectId("573a13d8f29313caabda6557"), "plot" : "At the age of 21, Tim discovers he can travel in time and change what happens and has happened in his own life. His decision to make his world a better place by getting a girlfriend turns out not to be as easy as you might think.", "title" : "About Time" } - { "_id" : ObjectId("573a13a5f29313caabd13b4b"), "plot" : "Hoping to alter the events of the past, a 19th century inventor instead travels 800,000 years into the future, where he finds humankind divided into two warring races.", "title" : "The Time Machine" } - { "_id" : ObjectId("573a13aef29313caabd2e2d7"), "plot" : "After using his mother's newly built time machine, Dolf gets stuck involuntary in the year 1212. 
He ends up in a children's crusade where he confronts his new friends with modern techniques...", "title" : "Crusade in Jeans" } - { "_id" : ObjectId("573a1399f29313caabceec0e"), "plot" : "An officer for a security agency that regulates time travel, must fend for his life against a shady politician who has a tie to his past.", "title" : "Timecop" } - { "_id" : ObjectId("573a1399f29313caabcee36f"), "plot" : "A time-travel experiment in which a robot probe is sent from the year 2073 to the year 1973 goes terribly wrong thrusting one of the project scientists, a man named Nicholas Sinclair into a...", "title" : "A.P.E.X." } - { "_id" : ObjectId("573a13c6f29313caabd715d3"), "plot" : "Agent J travels in time to M.I.B.'s early days in 1969 to stop an alien from assassinating his friend Agent K and changing history.", "title" : "Men in Black 3" } - { "_id" : ObjectId("573a13d4f29313caabd98c13"), "plot" : "Bound by a shared destiny, a teen bursting with scientific curiosity and a former boy-genius inventor embark on a mission to unearth the secrets of a place somewhere in time and space that exists in their collective memory.", "title" : "Tomorrowland" } - { "_id" : ObjectId("573a13b6f29313caabd477fa"), "plot" : "With the help of his uncle, a man travels to the future to try and bring his girlfriend back to life.", "title" : "Love Story 2050" } - { "_id" : ObjectId("573a13e5f29313caabdc40c9"), "plot" : "A dimension-traveling wizard gets stuck in the 21st century because cell-phone radiation interferes with his magic. With his home world on the brink of war, he seeks help from a jaded ...", "title" : "The Portal" } - -For more information about Atlas Vector Search, Atlas Vector Search indexes, and how -to incorporate them into your application, see :atlas:`Atlas Vector Search Overview ` -in the Atlas manual. For more examples about running Atlas Vector Search queries using the -{+driver-short+}, see :atlas:`Run Vector Search Queries ` -in the Atlas manual and select :guilabel:`C#` from the language dropdown. - -Bitwise Operators +API Documentation ~~~~~~~~~~~~~~~~~ -This section describes the :wikipedia:`bitwise operators ` -supported by the {+driver-short+} that you can use in an aggregation pipeline. -You can use multiple bitwise operators in the same -stage. The following guidelines apply when using bitwise operators: - -- All operands must be of type ``int`` or ``long``. - -- ``$bitAnd``, ``$bitOr``, and ``$bitXor`` take two or more operands. ``$bitNot`` takes one operand. - -- Bitwise operations are evaluated from left to right. - -The examples in this section use the following documents in a collection called -``ingredients``: - -.. code-block:: json - - { "_id" : 1, "name" : "watermelon", "is_available" : 1, "is_cheap" : 1 }, - { "_id" : 2, "name" : "onions", "is_available" : 1, "is_cheap" : 0 }, - { "_id" : 3, "name" : "eggs", "is_available" : 0, "is_cheap" : 0 }, - { "_id" : 4, "name" : "potatoes", "is_available" : 1, "is_cheap" : 1 }, - { "_id" : 5, "name" : "pasta", "is_available" : 0, "is_cheap" : 1 }, - { "_id" : 6, "name" : "cheese", "is_available" : 1 } - -The ``"is_available"`` field represents if an ingredient is available. If this -field has a value of ``0``, the ingredient is not available. If it has a value -of ``1``, the ingredient is available. - -The ``"is_cheap"`` field represents if an ingredient is cheap. If this field has -a value of ``0``, the ingredient is not cheap. If it has a value of ``1``, the -ingredient is cheap. 
- -The following ``Ingredient`` class models the documents in the ``ingredients`` -collection: - -.. literalinclude:: /includes/fundamentals/code-examples/linq.cs - :language: csharp - :dedent: - :start-after: start-ingredient-model - :end-before: end-ingredient-model - -.. note:: Missing or Undefined Operands - - If the operands you pass to any bitwise operator are of type `nullable `__ - ``int`` or ``long`` and contain a missing or undefined value, the entire expression - evaluates to ``null``. If the operands are of type non-nullable ``int`` or - ``long`` and contain a missing or undefined value, the {+driver-short+} will - throw an error. - -$bitAnd -+++++++ - -The ``$bitAnd`` aggregation operator performs a bitwise AND operation on the given -arguments. You can use the ``$bitAnd`` operator by connecting two or more -clauses with a ``&`` character. - -The following example shows how to create a ``$bitAnd`` stage by using LINQ. The -code retrieves the document in which the ``Name`` field has the -value ``"watermelon"``. It then performs a bitwise AND operation on the values of the -``IsAvailable`` and ``IsCheap`` fields in this document. - -.. literalinclude:: /includes/fundamentals/code-examples/linq.cs - :language: csharp - :dedent: - :start-after: start-bitAnd-example - :end-before: end-bitAnd-example - -The preceding code returns ``1``, the result of the AND operation on the values -of the ``IsAvailable`` field (``1``) and the ``IsCheap`` field (``1``). - -The following example performs the same bitwise AND operation on all -documents in the collection: - -.. io-code-block:: - :copyable: true - - .. input:: /includes/fundamentals/code-examples/linq.cs - :language: csharp - :dedent: - :start-after: start-bitAnd-collection-example - :end-before: end-bitAnd-collection-example - - .. output:: - :language: json - :visible: false - - 1 - 0 - 0 - 1 - 0 - null - -The ``null`` result comes from the document where the ``Name`` field -has the value of ``"cheese"``. This document is missing an ``IsCheap`` field, so -the expression evaluates to ``null``. - -$bitOr -++++++ - -The ``$bitOr`` aggregation operator performs a bitwise OR operation on the given -arguments. You can use the ``$bitOr`` operator by connecting two or more -clauses with a ``|`` character. - -The following example shows how to create a ``$bitOr`` stage by using LINQ. The -code retrieves the document in which the ``Name`` field has the -value ``"onions"``. It then performs a bitwise OR operation on the values of the -``IsAvailable`` and ``IsCheap`` fields in this document. - -.. literalinclude:: /includes/fundamentals/code-examples/linq.cs - :language: csharp - :dedent: - :start-after: start-bitOr-example - :end-before: end-bitOr-example - -The preceding code returns ``1``, the result of the OR operation on the values -of the ``IsAvailable`` field (``1``) and the ``IsCheap`` field (``0``). - -$bitNot -+++++++ - -The ``$bitNot`` aggregation operator performs a bitwise NOT operation on the given -argument. You can use the ``$bitNot`` operator by preceding an -operand with a ``~`` character. ``$bitNot`` only takes one argument. The -following example shows how to create a ``$bitNot`` stage by using LINQ: - -.. io-code-block:: - :copyable: true - - .. input:: /includes/fundamentals/code-examples/linq.cs - :language: csharp - :dedent: - :start-after: start-bitNot-example - :end-before: end-bitNot-example - - .. 
output:: - :language: json - :visible: false - - -2 - -1 - -1 - -2 - -2 - null - -$bitXor -+++++++ - -The ``$bitXor`` aggregation operator performs a bitwise XOR operation on the given -arguments. You can use the ``$bitXor`` operator by connecting two or more -clauses with a ``^`` character. - -The following example shows how to create a ``$bitXor`` stage by using LINQ. The -code retrieves the documents in which the ``Name`` field has -the value ``"watermelon"`` or ``"onions"``. It then performs a bitwise XOR -operation on the values of the ``IsAvailable`` and ``IsCheap`` fields in these -documents. - -.. literalinclude:: /includes/fundamentals/code-examples/linq.cs - :language: csharp - :dedent: - :start-after: start-bitXor-example - :end-before: end-bitXor-example - -The result contains the following values: +For more information about the aggregation operations discussed in this guide, see the +following API documentation: -.. code-block:: json - - 0 - 1 +- `Aggregate() <{+new-api-root+}/MongoDB.Driver/MongoDB.Driver.IMongoCollection-1.Aggregate.html>`__ +- `AggregateOptions <{+new-api-root+}/MongoDB.Driver/MongoDB.Driver.AggregateOptions.html>`__ +- `Group() <{+new-api-root+}/MongoDB.Driver/MongoDB.Driver.PipelineStageDefinitionBuilder.Group.html>`__ +- `Match() <{+new-api-root+}/MongoDB.Driver/MongoDB.Driver.PipelineStageDefinitionBuilder.Match.html>`__ +- `Where() <{+new-api-root+}/MongoDB.Driver/MongoDB.Driver.Linq.MongoQueryable.Where.html>`__ +- `GroupBy() <{+new-api-root+}/MongoDB.Driver/MongoDB.Driver.Linq.MongoQueryable.GroupBy.html>`__ +- `Select() <{+new-api-root+}/MongoDB.Driver/MongoDB.Driver.Linq.MongoQueryable.Select.html>`__ +- `PipelineDefinitionBuilder <{+new-api-root+}/MongoDB.Driver/MongoDB.Driver.PipelineDefinitionBuilder.html>`__ +.. TODO: integrate into existing page Unsupported Aggregation Stages ------------------------------ @@ -1490,174 +367,9 @@ aggregation stages: To learn how to create an aggregation pipeline with the ``$out`` stage by using Builders, see the :ref:`` section. -Supported Methods ------------------ - -The following are some methods supported by the {+driver-long+} implementation of LINQ: - -.. 
list-table:: - :header-rows: 1 - :widths: 40 60 - - * - Method Name - - Description - - * - ``Any`` - - Determines if any documents match the specified criteria - - * - ``Average`` - - Calculates the average of the specified fields - - * - ``Count`` - - Returns an ``Int32`` that represents the number of documents that match the specified criteria - - * - ``LongCount`` - - Returns an ``Int64`` that represents the number of documents that match the specified criteria - - * - ``DateFromString`` - - Converts a ``string`` to a ``DateTime`` object - - * - ``Distinct`` - - Returns distinct documents that match the specified criteria - - * - ``DistinctMany`` - - Returns distinct documents from an array that match the specified criteria - - * - ``Exists`` - - Tests whether a field exists - - * - ``First`` - - Returns the first matching document, and throws an exception if none are found - - * - ``FirstOrDefault`` - - Returns the first matching document, or ``null`` if none are found - - * - ``GroupBy`` - - Groups documents based on specified criteria - - * - ``GroupJoin`` - - Performs a left outer join to another collection in the same database - - * - ``IsMissing`` - - Returns ``true`` if a field is missing and false otherwies - - * - ``IsNullOrMissing`` - - Returns ``true`` if a field is null or missing and false otherwise - - * - ``Max`` - - Returns the document with the maximum specified value - - * - ``OfType`` - - Returns documents that match the specified type - - * - ``OrderBy``, ``OrderByDescending`` - - Returns results in a specified sort order - - * - ``ThenBy``, ``ThenByDescending`` - - Allows a secondary sort to be specified - - * - ``Select`` - - Selects documents based on specified criteria - * - ``SelectMany`` - - Projects each element of a sequence and combines the resulting sequences into one document - * - ``Single`` - - Returns the only matching document, and throws an exception if there is not exactly one document - - * - ``SingleOrDefault`` - - Returns a single matching document or ``null`` if no documents match - * - ``Skip`` - - Skips over a specified number of documents and returns the rest of the results - - * - ``Sum`` - - Returns the sum of the values in a specified field - - * - ``Take`` - - Specifies the number of results to return - - * - ``Where`` - - Returns all documents that match your specified criteria - -View Translated Queries ------------------------ - -When you run a LINQ query, the {+driver-short+} automatically translates your -query into an aggregation pipeline written with the {+query-api+}. You can view -the translated query by using the ``ToString()`` method or the -``LoggedStages`` property. - -To see the translated query for **non-scalar operations**, use the ``ToString()`` -method. Non-scalar operations are operations that return a query object, such -as: - -- ``Where`` -- ``Select`` -- ``SelectMany`` -- ``GroupJoin`` - -The following example calls the ``ToString()`` method on a LINQ query and prints -the translated query: - -.. io-code-block:: - - .. input:: - :language: csharp - - var queryableCollection = _restaurantsCollection.AsQueryable(); - var query = queryableCollection - .Where(r => r.Name == "The Movable Feast"); - - var queryTranslated = query.ToString(); - Console.WriteLine(queryTranslated); - - .. output:: - - sample_restaurants.restaurants.Aggregate([{ "$match" : { "name" : "The Movable Feast" } }]) - -To get the translated query for **scalar operations** use the ``LoggedStages`` -property. 
Scalar operations are operations that return a scalar result rather than a -query object, such as: - -- ``First`` -- ``Sum`` -- ``Count`` -- ``Min`` -- ``Max`` - -To get a translated query with the ``LoggedStages`` property, you must save -the translated query directly after it is executed, and before executing any -other queries with the same queryable object. - -The following example uses the ``LoggedStages`` property on a LINQ query that -uses a scalar operation, then prints the translated query: - -.. io-code-block:: - - .. input:: - :language: csharp - :emphasize-lines: 6 - - - var queryableCollection = _restaurantsCollection.AsQueryable(); - var query = queryableCollection - .Where(r => r.Name == "The Movable Feast"); - - var result = query.FirstOrDefault(); - var queryTranslated = query.LoggedStages; - - Console.WriteLine(queryTranslated.ToJson()); - - .. output:: - - [{ "$match" : { "name" : "The Movable Feast" } }, { "$limit" : NumberLong(1) }] - -.. important:: - - ``LoggedStages`` is not thread-safe. Executing a query and accessing the - associated ``LoggedStages`` property from multiple threads might have - non-deterministic results. Troubleshooting --------------- diff --git a/source/aggregation/changeStream.txt b/source/aggregation/changeStream.txt deleted file mode 100644 index 85105f7d..00000000 --- a/source/aggregation/changeStream.txt +++ /dev/null @@ -1,18 +0,0 @@ -.. _csharp-aggregation-changestream: - -============ -ChangeStream -============ - -.. facet:: - :name: genre - :values: reference - -.. meta:: - :keywords: code example, transform, pipeline - -.. contents:: On this page - :local: - :backlinks: none - :depth: 2 - :class: singlecol \ No newline at end of file diff --git a/source/aggregation/group.txt b/source/aggregation/group.txt deleted file mode 100644 index 399803ee..00000000 --- a/source/aggregation/group.txt +++ /dev/null @@ -1,45 +0,0 @@ -.. _csharp-aggregation-group: - -===== -Group -===== - -.. facet:: - :name: genre - :values: reference - -.. meta:: - :keywords: code example, transform, pipeline - -.. contents:: On this page - :local: - :backlinks: none - :depth: 2 - :class: singlecol - - -Group ------ - -Use the ``group()`` method to create a :manual:`$group ` -pipeline stage to group documents by a specified expression and output a document -for each distinct grouping. - -.. tip:: - - The driver includes the `Accumulators <{+core-api+}/client/model/Accumulators.html>`__ - class with static factory methods for each of the supported accumulators. - -The following example creates a pipeline stage that groups documents by the value -of the ``customerId`` field. Each group accumulates the sum and average -of the values of the ``quantity`` field into the ``totalQuantity`` and -``averageQuantity`` fields. - -.. literalinclude:: /includes/aggregation/Builders.cs - :start-after: // begin group - :end-before: // end group - :language: csharp - :dedent: - -Learn more about accumulator operators from the Server manual section -on :manual:`Accumulators `. \ No newline at end of file diff --git a/source/aggregation/limit.txt b/source/aggregation/limit.txt deleted file mode 100644 index ecc86ecd..00000000 --- a/source/aggregation/limit.txt +++ /dev/null @@ -1,33 +0,0 @@ -.. _csharp-aggregation-limit: - -===== -Limit -===== - -.. facet:: - :name: genre - :values: reference - -.. meta:: - :keywords: code example, transform, pipeline - -.. 
contents:: On this page - :local: - :backlinks: none - :depth: 2 - :class: singlecol - - -Limit ------ - -Use the :manual:`$limit ` pipeline stage -to limit the number of documents passed to the next stage. - -The following example creates a pipeline stage that limits the number of documents to ``10``: - -.. literalinclude:: /includes/aggregation/Builders.cs - :start-after: // begin limit - :end-before: // end limit - :language: csharp - :dedent: \ No newline at end of file diff --git a/source/aggregation/lookup.txt b/source/aggregation/lookup.txt deleted file mode 100644 index 6db7b6e4..00000000 --- a/source/aggregation/lookup.txt +++ /dev/null @@ -1,53 +0,0 @@ -.. _csharp-aggregation-lookup: - -====== -Lookup -====== - -.. facet:: - :name: genre - :values: reference - -.. meta:: - :keywords: code example, transform, pipeline - -.. contents:: On this page - :local: - :backlinks: none - :depth: 2 - :class: singlecol - - -Lookup ------- - -Use the ``lookup()`` method to create a :manual:`$lookup ` -pipeline stage to perform joins and uncorrelated subqueries between two collections. - -Left Outer Join -~~~~~~~~~~~~~~~ - -The following example creates a pipeline stage that performs a left outer -join between the ``movies`` and ``comments`` collections: - -- It joins the ``_id`` field from ``movies`` to the ``movie_id`` field in ``comments`` -- It outputs the results in the ``joined_comments`` field: - -.. literalinclude:: /includes/aggregation/Builders.cs - :start-after: // begin basic lookup - :end-before: // end basic lookup - :language: csharp - :dedent: - -Full Join and Uncorrelated Subqueries -~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ - -The following example creates a pipeline stage that joins two collections, ``orders`` -and ``warehouses``, by the item and whether the available quantity is enough -to fulfill the ordered quantity: - -.. literalinclude:: /includes/aggregation/Builders.cs - :start-after: // begin advanced lookup - :end-before: // end advanced lookup - :language: csharp - :dedent: diff --git a/source/aggregation/match.txt b/source/aggregation/match.txt deleted file mode 100644 index 64f6a516..00000000 --- a/source/aggregation/match.txt +++ /dev/null @@ -1,39 +0,0 @@ -.. _csharp-aggregation-match: - -===== -Match -===== - -.. facet:: - :name: genre - :values: reference - -.. meta:: - :keywords: code example, transform, pipeline - -.. contents:: On this page - :local: - :backlinks: none - :depth: 2 - :class: singlecol - -Match ------ - -Use the ``match()`` method to create a :manual:`$match ` -pipeline stage that matches incoming documents against the specified -query filter, filtering out documents that do not match. - -.. tip:: - - The filter can be an instance of any class that implements ``Bson``, but it's - convenient to combine with use of the :ref:`Filters ` class. - -The following example creates a pipeline stage that matches all documents where the -``title`` field is equal to "The Shawshank Redemption": - -.. literalinclude:: /includes/aggregation/Builders.cs - :start-after: begin match - :end-before: end match - :language: csharp - :dedent: diff --git a/source/aggregation/operators.txt b/source/aggregation/operators.txt new file mode 100644 index 00000000..a39eaf6d --- /dev/null +++ b/source/aggregation/operators.txt @@ -0,0 +1,263 @@ +Supported Methods +----------------- + +The following are some methods supported by the {+driver-long+} implementation of LINQ: + +.. 
list-table::
+   :header-rows: 1
+   :widths: 40 60
+
+   * - Method Name
+     - Description
+
+   * - ``Any``
+     - Determines if any documents match the specified criteria
+
+   * - ``Average``
+     - Calculates the average of the specified fields
+
+   * - ``Count``
+     - Returns an ``Int32`` that represents the number of documents that match the specified criteria
+
+   * - ``LongCount``
+     - Returns an ``Int64`` that represents the number of documents that match the specified criteria
+
+   * - ``DateFromString``
+     - Converts a ``string`` to a ``DateTime`` object
+
+   * - ``Distinct``
+     - Returns distinct documents that match the specified criteria
+
+   * - ``DistinctMany``
+     - Returns distinct documents from an array that match the specified criteria
+
+   * - ``Exists``
+     - Tests whether a field exists
+
+   * - ``First``
+     - Returns the first matching document, and throws an exception if none are found
+
+   * - ``FirstOrDefault``
+     - Returns the first matching document, or ``null`` if none are found
+
+   * - ``GroupBy``
+     - Groups documents based on specified criteria
+
+   * - ``GroupJoin``
+     - Performs a left outer join to another collection in the same database
+
+   * - ``IsMissing``
+     - Returns ``true`` if a field is missing and ``false`` otherwise
+
+   * - ``IsNullOrMissing``
+     - Returns ``true`` if a field is null or missing and ``false`` otherwise
+
+   * - ``Max``
+     - Returns the document with the maximum specified value
+
+   * - ``OfType``
+     - Returns documents that match the specified type
+
+   * - ``OrderBy``, ``OrderByDescending``
+     - Returns results in a specified sort order
+
+   * - ``ThenBy``, ``ThenByDescending``
+     - Allows a secondary sort to be specified
+
+   * - ``Select``
+     - Selects documents based on specified criteria
+
+   * - ``SelectMany``
+     - Projects each element of a sequence and combines the resulting sequences into one document
+
+   * - ``Single``
+     - Returns the only matching document, and throws an exception if there is not exactly one document
+
+   * - ``SingleOrDefault``
+     - Returns a single matching document or ``null`` if no documents match
+
+   * - ``Skip``
+     - Skips over a specified number of documents and returns the rest of the results
+
+   * - ``Sum``
+     - Returns the sum of the values in a specified field
+
+   * - ``Take``
+     - Specifies the number of results to return
+
+   * - ``Where``
+     - Returns all documents that match your specified criteria
+
+Bitwise Operators
+~~~~~~~~~~~~~~~~~
+
+This section describes the :wikipedia:`bitwise operators `
+supported by the {+driver-short+} that you can use in an aggregation pipeline.
+You can use multiple bitwise operators in the same
+stage. The following guidelines apply when using bitwise operators:
+
+- All operands must be of type ``int`` or ``long``.
+
+- ``$bitAnd``, ``$bitOr``, and ``$bitXor`` take two or more operands. ``$bitNot`` takes one operand.
+
+- Bitwise operations are evaluated from left to right.
+
+The examples in this section use the following documents in a collection called
+``ingredients``:
+
+.. code-block:: json
+
+   { "_id" : 1, "name" : "watermelon", "is_available" : 1, "is_cheap" : 1 },
+   { "_id" : 2, "name" : "onions", "is_available" : 1, "is_cheap" : 0 },
+   { "_id" : 3, "name" : "eggs", "is_available" : 0, "is_cheap" : 0 },
+   { "_id" : 4, "name" : "potatoes", "is_available" : 1, "is_cheap" : 1 },
+   { "_id" : 5, "name" : "pasta", "is_available" : 0, "is_cheap" : 1 },
+   { "_id" : 6, "name" : "cheese", "is_available" : 1 }
+
+The ``"is_available"`` field represents if an ingredient is available.
If this +field has a value of ``0``, the ingredient is not available. If it has a value +of ``1``, the ingredient is available. + +The ``"is_cheap"`` field represents if an ingredient is cheap. If this field has +a value of ``0``, the ingredient is not cheap. If it has a value of ``1``, the +ingredient is cheap. + +The following ``Ingredient`` class models the documents in the ``ingredients`` +collection: + +.. literalinclude:: /includes/fundamentals/code-examples/linq.cs + :language: csharp + :dedent: + :start-after: start-ingredient-model + :end-before: end-ingredient-model + +.. note:: Missing or Undefined Operands + + If the operands you pass to any bitwise operator are of type `nullable `__ + ``int`` or ``long`` and contain a missing or undefined value, the entire expression + evaluates to ``null``. If the operands are of type non-nullable ``int`` or + ``long`` and contain a missing or undefined value, the {+driver-short+} will + throw an error. + +$bitAnd ++++++++ + +The ``$bitAnd`` aggregation operator performs a bitwise AND operation on the given +arguments. You can use the ``$bitAnd`` operator by connecting two or more +clauses with a ``&`` character. + +The following example shows how to create a ``$bitAnd`` stage by using LINQ. The +code retrieves the document in which the ``Name`` field has the +value ``"watermelon"``. It then performs a bitwise AND operation on the values of the +``IsAvailable`` and ``IsCheap`` fields in this document. + +.. literalinclude:: /includes/fundamentals/code-examples/linq.cs + :language: csharp + :dedent: + :start-after: start-bitAnd-example + :end-before: end-bitAnd-example + +The preceding code returns ``1``, the result of the AND operation on the values +of the ``IsAvailable`` field (``1``) and the ``IsCheap`` field (``1``). + +The following example performs the same bitwise AND operation on all +documents in the collection: + +.. io-code-block:: + :copyable: true + + .. input:: /includes/fundamentals/code-examples/linq.cs + :language: csharp + :dedent: + :start-after: start-bitAnd-collection-example + :end-before: end-bitAnd-collection-example + + .. output:: + :language: json + :visible: false + + 1 + 0 + 0 + 1 + 0 + null + +The ``null`` result comes from the document where the ``Name`` field +has the value of ``"cheese"``. This document is missing an ``IsCheap`` field, so +the expression evaluates to ``null``. + +$bitOr +++++++ + +The ``$bitOr`` aggregation operator performs a bitwise OR operation on the given +arguments. You can use the ``$bitOr`` operator by connecting two or more +clauses with a ``|`` character. + +The following example shows how to create a ``$bitOr`` stage by using LINQ. The +code retrieves the document in which the ``Name`` field has the +value ``"onions"``. It then performs a bitwise OR operation on the values of the +``IsAvailable`` and ``IsCheap`` fields in this document. + +.. literalinclude:: /includes/fundamentals/code-examples/linq.cs + :language: csharp + :dedent: + :start-after: start-bitOr-example + :end-before: end-bitOr-example + +The preceding code returns ``1``, the result of the OR operation on the values +of the ``IsAvailable`` field (``1``) and the ``IsCheap`` field (``0``). + +$bitNot ++++++++ + +The ``$bitNot`` aggregation operator performs a bitwise NOT operation on the given +argument. You can use the ``$bitNot`` operator by preceding an +operand with a ``~`` character. ``$bitNot`` only takes one argument. The +following example shows how to create a ``$bitNot`` stage by using LINQ: + +.. 
io-code-block:: + :copyable: true + + .. input:: /includes/fundamentals/code-examples/linq.cs + :language: csharp + :dedent: + :start-after: start-bitNot-example + :end-before: end-bitNot-example + + .. output:: + :language: json + :visible: false + + -2 + -1 + -1 + -2 + -2 + null + +$bitXor ++++++++ + +The ``$bitXor`` aggregation operator performs a bitwise XOR operation on the given +arguments. You can use the ``$bitXor`` operator by connecting two or more +clauses with a ``^`` character. + +The following example shows how to create a ``$bitXor`` stage by using LINQ. The +code retrieves the documents in which the ``Name`` field has +the value ``"watermelon"`` or ``"onions"``. It then performs a bitwise XOR +operation on the values of the ``IsAvailable`` and ``IsCheap`` fields in these +documents. + +.. literalinclude:: /includes/fundamentals/code-examples/linq.cs + :language: csharp + :dedent: + :start-after: start-bitXor-example + :end-before: end-bitXor-example + +The result contains the following values: + +.. code-block:: json + + 0 + 1 \ No newline at end of file diff --git a/source/aggregation/out.txt b/source/aggregation/out.txt deleted file mode 100644 index c12c03c9..00000000 --- a/source/aggregation/out.txt +++ /dev/null @@ -1,38 +0,0 @@ -.. _csharp-aggregation-out: - -=== -Out -=== - -.. facet:: - :name: genre - :values: reference - -.. meta:: - :keywords: code example, transform, pipeline - -.. contents:: On this page - :local: - :backlinks: none - :depth: 2 - :class: singlecol - -Out ---- - -Use the ``out()`` method to create an :manual:`$out ` -pipeline stage that writes all documents to the specified collection in -the same database. - -.. important:: - - The ``$out`` stage must be the last stage in any aggregation pipeline. - -The following example writes the results of the pipeline to the ``authors`` -collection: - -.. literalinclude:: /includes/aggregation/Builders.cs - :start-after: // begin out - :end-before: // end out - :language: csharp - :dedent: \ No newline at end of file diff --git a/source/aggregation/sample.txt b/source/aggregation/sample.txt deleted file mode 100644 index 83de28cb..00000000 --- a/source/aggregation/sample.txt +++ /dev/null @@ -1,33 +0,0 @@ -.. _csharp-aggregation-sample: - -====== -Sample -====== - -.. facet:: - :name: genre - :values: reference - -.. meta:: - :keywords: code example, transform, pipeline - -.. contents:: On this page - :local: - :backlinks: none - :depth: 2 - :class: singlecol - - -Sample ------- - -Use the ``sample()`` method to create a :manual:`$sample ` -pipeline stage to randomly select documents from input. - -The following example creates a pipeline stage that randomly selects 5 documents: - -.. literalinclude:: /includes/aggregation/Builders.cs - :start-after: // begin sample - :end-before: // end sample - :language: csharp - :dedent: \ No newline at end of file diff --git a/source/aggregation/skip.txt b/source/aggregation/skip.txt deleted file mode 100644 index 4d00f565..00000000 --- a/source/aggregation/skip.txt +++ /dev/null @@ -1,34 +0,0 @@ -.. _csharp-aggregation-skip: - -==== -Skip -==== - -.. facet:: - :name: genre - :values: reference - -.. meta:: - :keywords: code example, transform, pipeline - -.. contents:: On this page - :local: - :backlinks: none - :depth: 2 - :class: singlecol - - -Skip ----- - -Use the ``skip()`` method to create a :manual:`$skip ` -pipeline stage to skip over the specified number of documents before -passing documents into the next stage. 
- -The following example creates a pipeline stage that skips the first ``5`` documents: - -.. literalinclude:: /includes/aggregation/Builders.cs - :start-after: // begin skip - :end-before: // end skip - :language: csharp - :dedent: \ No newline at end of file diff --git a/source/aggregation/sort.txt b/source/aggregation/sort.txt deleted file mode 100644 index b8841b67..00000000 --- a/source/aggregation/sort.txt +++ /dev/null @@ -1,40 +0,0 @@ -.. _csharp-aggregation-sort: - -==== -Sort -==== - -.. facet:: - :name: genre - :values: reference - -.. meta:: - :keywords: code example, transform, pipeline - -.. contents:: On this page - :local: - :backlinks: none - :depth: 2 - :class: singlecol - - -Sort ----- - -Use the ``sort()`` method to create a :manual:`$sort ` -pipeline stage to sort by the specified criteria. - -.. tip:: - - Though the sort criteria can be an instance of any class that - implements ``Bson``, it's convenient to combine with use of :ref:`Sorts `. - -The following example creates a pipeline stage that sorts in descending order according -to the value of the ``year`` field and then in ascending order according to the -value of the ``title`` field: - -.. literalinclude:: /includes/aggregation/Builders.cs - :start-after: // begin sortStage - :end-before: // end sortStage - :language: csharp - :dedent: \ No newline at end of file diff --git a/source/aggregation/stages.txt b/source/aggregation/stages.txt new file mode 100644 index 00000000..7ca77406 --- /dev/null +++ b/source/aggregation/stages.txt @@ -0,0 +1,247 @@ + + +Aggregation Stage Methods +------------------------- + +The following table lists the builders methods in the {+driver-short+} that correspond +to stages in the aggregation pipeline. Because each of these methods returns a +``PipelineDefinition`` object, you can chain method calls together. +For more information about a method, click the +method name. + +.. list-table:: + :header-rows: 1 + :widths: 20 80 + + * - Stage + - Description + + * - :ref:`Bucket() ` + + - Categorizes incoming documents into groups, called buckets, + based on a specified expression and bucket boundaries. + + * - :ref:`BucketAuto() ` + + - Categorizes incoming documents into a specific number of + groups, called buckets, based on a specified expression. + Bucket boundaries are automatically determined in an attempt + to evenly distribute the documents into the specified number + of buckets. + + * - :ref:`ChangeStream() ` + + - Returns a change stream cursor for the + collection. This stage can occur only once in an aggregation + pipeline and it must occur as the first stage. + + * - :ref:`ChangeStreamSplitLargeEvent() ` + + - Splits large change stream events that exceed 16 MB into smaller fragments returned + in a change stream cursor. + + You can use $changeStreamSplitLargeEvent only in a $changeStream pipeline, and + it must be the final stage in the pipeline. + + * - :ref:`Count() ` + + - Returns a count of the number of documents at this stage of + the aggregation pipeline. + + * - :ref:`Densify() ` + + - Creates new documents in a sequence of documents where certain values in a field are missing. + + * - :ref:`Documents() ` + + - Returns literal documents from input expressions. + + * - :ref:`Facet() ` + + - Processes multiple aggregation pipelines + within a single stage on the same set + of input documents. Enables the creation of multi-faceted + aggregations capable of characterizing data across multiple + dimensions, or facets, in a single stage. 
+ + * - :ref:`GraphLookup() ` + + - Performs a recursive search on a collection. To each output + document, adds a new array field that contains the traversal + results of the recursive search for that document. + + * - :ref:`Group() ` + + - Groups input documents by a specified identifier expression + and applies the accumulator expressions, if specified, to + each group. Consumes all input documents and outputs one + document per each distinct group. The output documents + contain only the identifier field and, if specified, accumulated + fields. + + * - :ref:`Limit() ` + + - Passes the first *n* documents unmodified to the pipeline, + where *n* is the specified limit. For each input document, + outputs either one document (for the first *n* documents) or + zero documents (after the first *n* documents). + + * - :ref:`Lookup() ` + + - Performs a left outer join to another collection in the + *same* database to filter in documents from the "joined" + collection for processing. + + * - :ref:`Match() ` + + - Filters the document stream to allow only matching documents + to pass unmodified into the next pipeline stage. + For each input document, outputs either one document (a match) or zero + documents (no match). + + * - :ref:`Merge() ` + + - Writes the resulting documents of the aggregation pipeline to + a collection. The stage can incorporate (insert new + documents, merge documents, replace documents, keep existing + documents, fail the operation, process documents with a + custom update pipeline) the results into an output + collection. To use this stage, it must be + the last stage in the pipeline. + + * - :ref:`Out() ` + + - Writes the resulting documents of the aggregation pipeline to + a collection. To use this stage, it must be + the last stage in the pipeline. + + * - :ref:`Project() ` + + - Reshapes each document in the stream, such as by adding new + fields or removing existing fields. For each input document, + outputs one document. + + * - :ref:`RankFusion() ` + + - Uses a rank fusion algorithm to combine results from a Vector Search + query and an Atlas Search query. + + * - :ref:`ReplaceRoot() ` + + - Replaces a document with the specified embedded document. The + operation replaces all existing fields in the input document, + including the ``_id`` field. Specify a document embedded in + the input document to promote the embedded document to the + top level. + + The ``$replaceWith`` stage is an alias for the ``$replaceRoot`` stage. + + * - :ref:`ReplaceWith() ` + + - Replaces a document with the specified embedded document. + The operation replaces all existing fields in the input document, including + the ``_id`` field. Specify a document embedded in the input document to promote + the embedded document to the top level. + + The ``$replaceWith`` stage is an alias for the ``$replaceRoot`` stage. + + * - :ref:`Sample() ` + + - Randomly selects the specified number of documents from its + input. + + * - :ref:`Search() ` + + - Performs a full-text search of the field or fields in an + :atlas:`Atlas ` + collection. + + This stage is available only for MongoDB Atlas clusters, and is not + available for self-managed deployments. To learn more, see + :atlas:`Atlas Search Aggregation Pipeline Stages + ` in the Atlas documentation. + + * - :ref:`SearchMeta() ` + + - Returns different types of metadata result documents for the + :atlas:`Atlas Search ` query against an + :atlas:`Atlas ` + collection. 
+ + This stage is available only for MongoDB Atlas clusters, + and is not available for self-managed deployments. To learn + more, see :atlas:`Atlas Search Aggregation Pipeline Stages + ` in the Atlas documentation. + + * - :ref:`Set() ` + + - Adds new fields to documents. Like the ``Project()`` method, + this method reshapes each + document in the stream by adding new fields to + output documents that contain both the existing fields + from the input documents and the newly added fields. + + * - :ref:`SetWindowFields() ` + + - Groups documents into windows and applies one or more + operators to the documents in each window. + + .. versionadded:: 5.0 + + * - :ref:`Skip() ` + + - Skips the first *n* documents, where *n* is the specified skip + number, and passes the remaining documents unmodified to the + pipeline. For each input document, outputs either zero + documents (for the first *n* documents) or one document (if + after the first *n* documents). + + * - :ref:`Sort() ` + + - Reorders the document stream by a specified sort key. The documents remain unmodified. + For each input document, outputs one document. + + * - :ref:`SortByCount() ` + + - Groups incoming documents based on the value of a specified + expression, then computes the count of documents in each + distinct group. + + * - :ref:`UnionWith() ` + + - Combines pipeline results from two collections into a single + result set. + + * - :ref:`Unwind() ` + + - Deconstructs an array field from the input documents to + output a document for *each* element. Each output document + replaces the array with an element value. For each input + document, outputs *n* Documents, where *n* is the number of + array elements. *n* can be zero for an empty array. + + * - :ref:`VectorSearch() ` + + - Performs an :abbr:`ANN (Approximate Nearest Neighbor)` or + :abbr:`ENN (Exact Nearest Neighbor)` search on a + vector in the specified field of an + :atlas:`Atlas ` collection. + +You can add stages to your pipeline that don't have corresponding type-safe +methods in the ``PipelineDefinitionBuilder`` interface by providing your query +as a ``BsonDocument`` to the `AppendStage() method +<{+new-api-root+}/MongoDB.Driver/MongoDB.Driver.PipelineDefinitionBuilder.AppendStage.html>`__. + +.. code-block:: csharp + + var pipeline = new EmptyPipelineDefinition().AppendStage("{ $set: { field1: '$field2' } }"); + +.. note:: + + When using a ``BsonDocument`` to define your pipeline stage, the driver does + not take into account any ``BsonClassMap``, serialization attributes or + serialization conventions. The field names used in the ``BsonDocument`` must + match those stored on the server. + + For more information on providing a query as a ``BsonDocument``, see our + :ref:`FAQ page `. \ No newline at end of file diff --git a/source/aggregation/bucket.txt b/source/aggregation/stages/bucket.txt similarity index 100% rename from source/aggregation/bucket.txt rename to source/aggregation/stages/bucket.txt diff --git a/source/aggregation/bucketAuto.txt b/source/aggregation/stages/bucketAuto.txt similarity index 100% rename from source/aggregation/bucketAuto.txt rename to source/aggregation/stages/bucketAuto.txt diff --git a/source/aggregation/stages/changeStream.txt b/source/aggregation/stages/changeStream.txt new file mode 100644 index 00000000..a62a9842 --- /dev/null +++ b/source/aggregation/stages/changeStream.txt @@ -0,0 +1,20 @@ +.. _csharp-aggregation-changestream: + +============ +ChangeStream +============ + +.. 
facet::
+   :name: genre
+   :values: reference
+
+.. meta::
+   :keywords: code example, transform, pipeline
+
+.. contents:: On this page
+   :local:
+   :backlinks: none
+   :depth: 2
+   :class: singlecol
+
+Appends a ``$changeStream`` stage to the pipeline. Normally, you should use the
+``Watch()`` method of ``IMongoCollection`` instead. Use this method only if subsequent
+stages project away the resume token (the ``_id`` field) or if you don't want the
+resulting cursor to resume automatically.
+
+Returns a change stream cursor on a collection, a database, or an entire cluster. This
+stage must be the first stage in an aggregation pipeline.
\ No newline at end of file
diff --git a/source/aggregation/changeStreamSplitLargeEvent copy.txt b/source/aggregation/stages/changeStreamSplitLargeEvent copy.txt
similarity index 100%
rename from source/aggregation/changeStreamSplitLargeEvent copy.txt
rename to source/aggregation/stages/changeStreamSplitLargeEvent copy.txt
diff --git a/source/aggregation/count.txt b/source/aggregation/stages/count.txt
similarity index 100%
rename from source/aggregation/count.txt
rename to source/aggregation/stages/count.txt
diff --git a/source/aggregation/densify.txt b/source/aggregation/stages/densify.txt
similarity index 90%
rename from source/aggregation/densify.txt
rename to source/aggregation/stages/densify.txt
index 6fc31b2f..39508fc9 100644
--- a/source/aggregation/densify.txt
+++ b/source/aggregation/stages/densify.txt
@@ -24,7 +24,15 @@ Densify
 
 Use the ``densify()`` method to create a :manual:`$densify `
 pipeline stage that generates a sequence of documents to span a specified interval.
+This stage creates new documents in a sequence of documents where certain values in a field are missing.
+You can use ``$densify`` to:
+
+- Fill gaps in time series data.
+
+- Add missing values between groups of data.
+
+- Populate your data with a specified range of values.
 
 .. tip::
 
    You can use the ``$densify()`` aggregation stage only when running
diff --git a/source/aggregation/documents.txt b/source/aggregation/stages/documents.txt
similarity index 100%
rename from source/aggregation/documents.txt
rename to source/aggregation/stages/documents.txt
diff --git a/source/aggregation/facet.txt b/source/aggregation/stages/facet.txt
similarity index 100%
rename from source/aggregation/facet.txt
rename to source/aggregation/stages/facet.txt
diff --git a/source/aggregation/graphLookup.txt b/source/aggregation/stages/graphLookup.txt
similarity index 91%
rename from source/aggregation/graphLookup.txt
rename to source/aggregation/stages/graphLookup.txt
index d69e1780..c461e40d 100644
--- a/source/aggregation/graphLookup.txt
+++ b/source/aggregation/stages/graphLookup.txt
@@ -55,3 +55,6 @@ example, only links with "golf" in their ``hobbies`` field will be included.
    :end-before: // end graphLookupMatch
    :language: csharp
    :dedent:
+
+- The :manual:`$graphLookup ` stage has
+  a strict memory limit of 100 megabytes and ignores the ``AllowDiskUse`` property.
\ No newline at end of file
diff --git a/source/aggregation/stages/group.txt b/source/aggregation/stages/group.txt
new file mode 100644
index 00000000..b4d9804a
--- /dev/null
+++ b/source/aggregation/stages/group.txt
@@ -0,0 +1,97 @@
+.. _csharp-aggregation-group:
+
+=====
+Group
+=====
+
+.. facet::
+   :name: genre
+   :values: reference
+
+.. meta::
+   :keywords: code example, transform, pipeline
+
+.. 
contents:: On this page + :local: + :backlinks: none + :depth: 2 + :class: singlecol + + +Group +----- + +Use the ``group()`` method to create a :manual:`$group ` +pipeline stage to group documents by a specified expression and output a document +for each distinct grouping. + +.. tip:: + + The driver includes the `Accumulators <{+core-api+}/client/model/Accumulators.html>`__ + class with static factory methods for each of the supported accumulators. + +The following example creates a pipeline stage that groups documents by the value +of the ``customerId`` field. Each group accumulates the sum and average +of the values of the ``quantity`` field into the ``totalQuantity`` and +``averageQuantity`` fields. + +.. literalinclude:: /includes/aggregation/Builders.cs + :start-after: // begin group + :end-before: // end group + :language: csharp + :dedent: + +Learn more about accumulator operators from the Server manual section +on :manual:`Accumulators `. + + +$group +~~~~~~ + +The ``$group`` aggregation stage separates documents into groups according to +the criteria you specify. + +Select the :guilabel:`Method Syntax` or :guilabel:`Query Syntax` tab to see how +to generate an ``$group`` stage using LINQ: + +.. tabs:: + + .. tab:: Method Syntax + :tabid: method-syntax + + .. code-block:: csharp + :emphasize-lines: 2 + + var query = queryableCollection + .GroupBy(r => r.Cuisine) + .Select(g => new { Cuisine = g.Key, Count = g.Count() }); + + .. tab:: Query Syntax + :tabid: query-syntax + + .. code-block:: csharp + :emphasize-lines: 2 + + var query = from r in queryableCollection + group r by r.Cuisine into g + select new {Cuisine = g.Key, Count = g.Count()}; + +The preceding example groups each document by the value in its ``Cuisine`` field, +then counts how many documents have each ``Cuisine`` value. The result contains +the following documents: + +.. code-block:: json + + // Results Truncated + + { "cuisine" : "Caribbean", "count" : 657 } + { "cuisine" : "Café/Coffee/Tea", "count" : 1214 } + { "cuisine" : "Iranian", "count" : 2 } + { "cuisine" : "Nuts/Confectionary", "count" : 6 } + { "cuisine" : "Middle Eastern", "count" : 168 } + ... + +.. note:: Result Order + + The preceding queries don't always return results in the same order. Running + this example may return the results in a different order than shown above. \ No newline at end of file diff --git a/source/aggregation/stages/limit.txt b/source/aggregation/stages/limit.txt new file mode 100644 index 00000000..b737744d --- /dev/null +++ b/source/aggregation/stages/limit.txt @@ -0,0 +1,57 @@ +.. _csharp-aggregation-limit: + +===== +Limit +===== + +.. facet:: + :name: genre + :values: reference + +.. meta:: + :keywords: code example, transform, pipeline + +.. contents:: On this page + :local: + :backlinks: none + :depth: 2 + :class: singlecol + + +Limit +----- + +Use the :manual:`$limit ` pipeline stage +to limit the number of documents passed to the next stage. + +The following example creates a pipeline stage that limits the number of documents to ``10``: + +.. literalinclude:: /includes/aggregation/Builders.cs + :start-after: // begin limit + :end-before: // end limit + :language: csharp + :dedent: + +$limit +~~~~~~ + +The ``$limit`` aggregation stage limits the number of documents returned by the +query. The following example shows how to generate a ``$limit`` stage using LINQ: + +.. 
code-block:: csharp + :emphasize-lines: 4 + + var query = queryableCollection + .Where(r => r.Cuisine == "Italian") + .Select(r => new {r.Name, r.Cuisine}) + .Take(5); + +The result of the preceding example contains the following documents: + +.. code-block:: json + + { "name" : "Philadelhia Grille Express", "cuisine" : "Italian" } + { "name" : "Isle Of Capri Resturant", "cuisine" : "Italian" } + { "name" : "Marchis Restaurant", "cuisine" : "Italian" } + { "name" : "Crystal Room", "cuisine" : "Italian" } + { "name" : "Forlinis Restaurant", "cuisine" : "Italian" } \ No newline at end of file diff --git a/source/aggregation/stages/lookup.txt b/source/aggregation/stages/lookup.txt new file mode 100644 index 00000000..f9be046b --- /dev/null +++ b/source/aggregation/stages/lookup.txt @@ -0,0 +1,146 @@ +.. _csharp-aggregation-lookup: + +====== +Lookup +====== + +.. facet:: + :name: genre + :values: reference + +.. meta:: + :keywords: code example, transform, pipeline + +.. contents:: On this page + :local: + :backlinks: none + :depth: 2 + :class: singlecol + + +Lookup +------ + +Use the ``lookup()`` method to create a :manual:`$lookup ` +pipeline stage to perform joins and uncorrelated subqueries between two collections. + +Left Outer Join +~~~~~~~~~~~~~~~ + +The following example creates a pipeline stage that performs a left outer +join between the ``movies`` and ``comments`` collections: + +- It joins the ``_id`` field from ``movies`` to the ``movie_id`` field in ``comments`` +- It outputs the results in the ``joined_comments`` field: + +.. literalinclude:: /includes/aggregation/Builders.cs + :start-after: // begin basic lookup + :end-before: // end basic lookup + :language: csharp + :dedent: + +Full Join and Uncorrelated Subqueries +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ + +The following example creates a pipeline stage that joins two collections, ``orders`` +and ``warehouses``, by the item and whether the available quantity is enough +to fulfill the ordered quantity: + +.. literalinclude:: /includes/aggregation/Builders.cs + :start-after: // begin advanced lookup + :end-before: // end advanced lookup + :language: csharp + :dedent: + +$lookup +~~~~~~~ + +The ``$lookup`` aggregation stage joins documents from one collection to documents +from another collection in the same database. The ``$lookup`` stage adds a new +array field to each input document. The new array field contains the matching +documents from the "joined" collection. + +.. note:: + + To perform a lookup, you must make both collections queryable by using the + ``AsQueryable()`` method. + + To learn how to make a collection queryable, see :ref:`csharp-linq-queryable`. + +Consider a second collection in the ``sample_restaurants`` database called +``reviews`` that has restaurant reviews. You can join documents from that collection +to documents with the same ``name`` value in the ``restaurants`` collection using +the ``$lookup`` stage. + +The following ``Review`` class models the documents in the ``reviews`` collection: + +.. literalinclude:: /includes/fundamentals/code-examples/linq.cs + :language: csharp + :dedent: + :start-after: start-review-model + :end-before: end-review-model + +Select the :guilabel:`Method Syntax` or :guilabel:`Query Syntax` tab to see how +to generate a ``$lookup`` stage by using LINQ: + +.. tabs:: + + .. tab:: Method Syntax + :tabid: method-syntax + + .. 
code-block:: csharp + + var query = queryableCollection + .GroupJoin(reviewCollection, + restaurant => restaurant.Name, + review => review.RestaurantName, + (restaurant, reviews) => + new { Restaurant = restaurant, Reviews = reviews } + ); + + .. tab:: Query Syntax + :tabid: query-syntax + + .. code-block:: csharp + + var query = from restaurant in queryableCollection + join rv in reviewCollection on restaurant.Name equals rv.RestaurantName into reviews + select new { restaurant, reviews }; + +The preceding example returns all documents from the ``restaurants`` collection. Each +restaurant document has an added field called ``reviews``, which contains all +reviews for that restaurant. A review matches a restaurant if the value of the +``name`` field in the review document matches the ``name`` field of the restaurant +document. + +The following shows a subset of the returned results: + +.. code-block:: json + + // Results Truncated + + { + "restaurant": { + "_id": ObjectId("..."), + "name": "The Movable Feast", + "restaurant_id": "40361606", + "cuisine": "American", + "address": { ... }, + "borough": "Brooklyn", + "grades": [ ... ] + }, + "reviews": [ + { + "_id": ObjectId("..."), + "restaurant_name": "The Movable Feast", + "reviewer": "Lazlo Cravensworth", + "review_text": "Great restaurant! 12/10 stars!" + }, + { + "_id": ObjectId("..."), + "restaurant_name": "The Movable Feast", + "reviewer": "Michael Scarn", + "review_text": "It really was a feast" + } + ] + } \ No newline at end of file diff --git a/source/aggregation/stages/match.txt b/source/aggregation/stages/match.txt new file mode 100644 index 00000000..bf6755f4 --- /dev/null +++ b/source/aggregation/stages/match.txt @@ -0,0 +1,78 @@ +.. _csharp-aggregation-match: + +===== +Match +===== + +.. facet:: + :name: genre + :values: reference + +.. meta:: + :keywords: code example, transform, pipeline + +.. contents:: On this page + :local: + :backlinks: none + :depth: 2 + :class: singlecol + +Match +----- + +Use the ``match()`` method to create a :manual:`$match ` +pipeline stage that matches incoming documents against the specified +query filter, filtering out documents that do not match. + +.. tip:: + + The filter can be an instance of any class that implements ``Bson``, but it's + convenient to combine with use of the :ref:`Filters ` class. + +The following example creates a pipeline stage that matches all documents where the +``title`` field is equal to "The Shawshank Redemption": + +.. literalinclude:: /includes/aggregation/Builders.cs + :start-after: begin match + :end-before: end match + :language: csharp + :dedent: + + +$match +~~~~~~ + +The ``$match`` aggregation stage returns the documents that match a specified +criteria. + +Select the :guilabel:`Method Syntax` or :guilabel:`Query Syntax` tab to see how +to generate a ``$match`` stage using LINQ: + +.. tabs:: + + .. tab:: Method Syntax + :tabid: method-syntax + + .. code-block:: csharp + :emphasize-lines: 2 + + var query = queryableCollection + .Where(r => r.Name == "The Movable Feast"); + + .. tab:: Query Syntax + :tabid: query-syntax + + .. code-block:: csharp + :emphasize-lines: 2 + + var query = from r in queryableCollection + where r.Name == "The Movable Feast" + select r; + +The result of the preceding example contains the following document: + +.. code-block:: json + + // Results Truncated + + { "_id" : ObjectId(...), "name" : "The Movable Feast", "restaurant_id" : "40361606", "cuisine" : "American", "address" : {...}, "borough" : "Brooklyn", "grades" : [...] 
} diff --git a/source/aggregation/merge.txt b/source/aggregation/stages/merge.txt similarity index 100% rename from source/aggregation/merge.txt rename to source/aggregation/stages/merge.txt diff --git a/source/aggregation/stages/out.txt b/source/aggregation/stages/out.txt new file mode 100644 index 00000000..908d8d5a --- /dev/null +++ b/source/aggregation/stages/out.txt @@ -0,0 +1,89 @@ +.. _csharp-aggregation-out: + +=== +Out +=== + +.. facet:: + :name: genre + :values: reference + +.. meta:: + :keywords: code example, transform, pipeline + +.. contents:: On this page + :local: + :backlinks: none + :depth: 2 + :class: singlecol + +Out +--- + +Use the ``Out()`` method to create an :manual:`$out ` +pipeline stage that writes all documents to the specified collection in +the same database. + +.. important:: + + The ``$out`` stage must be the last stage in any aggregation pipeline. + +The following example writes the results of the pipeline to the ``authors`` +collection: + +.. literalinclude:: /includes/aggregation/Builders.cs + :start-after: // begin out + :end-before: // end out + :language: csharp + :dedent: + +.. _csharp-builders-out: + +Write Pipeline Results to a Collection +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ + +You can write the documents returned from an aggregation pipeline to a +collection by creating an ``$out`` stage at the end of your aggregation +pipeline. To create an ``$out`` stage, call the ``Out()`` method on a +``PipelineStageDefinitionBuilder``. The ``Out()`` method requires the collection, or the name of +the collection, that you want to write the documents to. + +The following example builds an aggregation pipeline that matches all documents +with a ``season`` field value of ``"spring"`` and outputs them to +a ``springFlowers`` collection: + +.. code-block:: csharp + + var outputCollection = database.GetCollection("springFlowers"); + var matchFilter = Builders.Filter.AnyEq(f => f.Season, "spring"); + + // Creates an aggregation pipeline and outputs resulting documents to a new collection. + var pipeline = new EmptyPipelineDefinition() + .Match(matchFilter) + .Out(outputCollection); + +You can write the results of an aggregation pipeline to a time series collection +by specifying a ``TimeSeriesOptions`` object and passing it as the second +parameter to the ``Out()`` method. + +Imagine that the documents in the ``plants.flowers`` collection contain a ``datePlanted`` field that +holds BSON date values. You can store the documents in this collection in a time +series collection by using the ``datePlanted`` field as the time field. + +The following example creates a ``TimeSeriesOptions`` object and specifies +``datePlanted`` as the ``timeField``. It then builds an aggregation pipeline that matches all documents +with a ``season`` field value of ``"spring"`` and outputs them to a +time series collection called ``springFlowerTimes``. + +.. code-block:: csharp + + var timeSeriesOptions = new TimeSeriesOptions("datePlanted"); + var collectionName = "springFlowerTimes"; + var matchFilter = Builders.Filter.AnyEq(f => f.Season, "spring"); + + // Creates an aggregation pipeline and outputs resulting documents to a time series collection. + var pipeline = new EmptyPipelineDefinition() + .Match(matchFilter) + .Out(collectionName, timeSeriesOptions); + +To learn more about time series collections, see :ref:`csharp-time-series`. 
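+The preceding two examples build a pipeline but don't run it. The following
+sketch shows one way you might run a pipeline that ends with an ``$out`` stage
+and then read the written documents back from the output collection. It assumes
+a hypothetical ``Flower`` document class whose ``Season`` property is a list of
+strings, and a ``database`` variable that references your database:
+
+.. code-block:: csharp
+
+   // The "flowers" collection name and the Flower class are placeholders for
+   // your own collection and document type
+   var flowerCollection = database.GetCollection<Flower>("flowers");
+
+   // Matches documents whose Season array contains "spring"
+   var matchFilter = Builders<Flower>.Filter.AnyEq(f => f.Season, "spring");
+
+   // Builds a pipeline that writes the matching documents to the
+   // springFlowers collection
+   var pipeline = new EmptyPipelineDefinition<Flower>()
+       .Match(matchFilter)
+       .Out(database.GetCollection<Flower>("springFlowers"));
+
+   // Runs the pipeline; the $out stage writes the documents on the server,
+   // so the returned cursor contains no documents
+   flowerCollection.Aggregate(pipeline).ToList();
+
+   // Reads the documents that the pipeline wrote
+   var springFlowers = database.GetCollection<Flower>("springFlowers")
+       .Find(Builders<Flower>.Filter.Empty)
+       .ToList();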
\ No newline at end of file diff --git a/source/aggregation/project.txt b/source/aggregation/stages/project.txt similarity index 55% rename from source/aggregation/project.txt rename to source/aggregation/stages/project.txt index 0c47b837..d503b6c4 100644 --- a/source/aggregation/project.txt +++ b/source/aggregation/stages/project.txt @@ -52,4 +52,44 @@ into a new field called ``rating``, effectively renaming the field. :start-after: begin computed :end-before: end computed :language: csharp - :dedent: \ No newline at end of file + :dedent: + +$project +~~~~~~~~ + +The ``$project`` aggregation stage returns a document containing only the specified +fields. + +Select the :guilabel:`Method Syntax` or :guilabel:`Query Syntax` tab to see how +to generate a ``$project`` stage using LINQ: + +.. tabs:: + + .. tab:: Method Syntax + :tabid: method-syntax + + .. code-block:: csharp + :emphasize-lines: 2 + + var query = queryableCollection + .Select(r => new { r.Name, r.Address }); + + .. tab:: Query Syntax + :tabid: query-syntax + + .. code-block:: csharp + :emphasize-lines: 2 + + var query = from r in queryableCollection + select new { r.Name, r.Address }; + +The result of the preceding example contains the following document: + +.. code-block:: json + + { "name" : "The Movable Feast", "address" : { "building" : "284", "coord" : [-73.982923900000003, 40.6580753], "street" : "Prospect Park West", "zipcode" : "11215" } } + +.. note:: Excluding the ``_id`` Field + + If you don't include the ``_id`` field in your LINQ projection, the {+driver-short+} + automatically excludes it from the results. diff --git a/source/aggregation/rankFusion.txt b/source/aggregation/stages/rankFusion.txt similarity index 100% rename from source/aggregation/rankFusion.txt rename to source/aggregation/stages/rankFusion.txt diff --git a/source/aggregation/replaceRoot.txt b/source/aggregation/stages/replaceRoot.txt similarity index 100% rename from source/aggregation/replaceRoot.txt rename to source/aggregation/stages/replaceRoot.txt diff --git a/source/aggregation/replaceWith.txt b/source/aggregation/stages/replaceWith.txt similarity index 100% rename from source/aggregation/replaceWith.txt rename to source/aggregation/stages/replaceWith.txt diff --git a/source/aggregation/stages/sample.txt b/source/aggregation/stages/sample.txt new file mode 100644 index 00000000..c5c90a77 --- /dev/null +++ b/source/aggregation/stages/sample.txt @@ -0,0 +1,60 @@ +.. _csharp-aggregation-sample: + +====== +Sample +====== + +.. facet:: + :name: genre + :values: reference + +.. meta:: + :keywords: code example, transform, pipeline + +.. contents:: On this page + :local: + :backlinks: none + :depth: 2 + :class: singlecol + + +Sample +------ + +Use the ``sample()`` method to create a :manual:`$sample ` +pipeline stage to randomly select documents from input. + +The following example creates a pipeline stage that randomly selects ``5`` documents: + +.. literalinclude:: /includes/aggregation/Builders.cs + :start-after: // begin sample + :end-before: // end sample + :language: csharp + :dedent: + + +$sample +~~~~~~~ + +The ``$sample`` aggregation stage returns a random sample of documents from a +collection. The following example shows how to generate a ``$sample`` stage by using +LINQ: + +.. code-block:: csharp + :emphasize-lines: 2 + + var query = queryableCollection + .Sample(4) + .ToList(); + +The result of the preceding example contains the following documents: + +.. 
code-block:: json + + // Results Truncated + + { "name" : "Von Dolhens", "cuisine" : "Ice Cream, Gelato, Yogurt, Ices" } + { "name" : "New York Mercantile Exchange", "cuisine" : "American" } + { "name" : "Michaelangelo's Restaurant", "cuisine" : "Italian" } + { "name" : "Charlie Palmer Steak", "cuisine" : "American" } \ No newline at end of file diff --git a/source/aggregation/search.txt b/source/aggregation/stages/search.txt similarity index 100% rename from source/aggregation/search.txt rename to source/aggregation/stages/search.txt diff --git a/source/aggregation/searchMeta.txt b/source/aggregation/stages/searchMeta.txt similarity index 100% rename from source/aggregation/searchMeta.txt rename to source/aggregation/stages/searchMeta.txt diff --git a/source/aggregation/set.txt b/source/aggregation/stages/set.txt similarity index 100% rename from source/aggregation/set.txt rename to source/aggregation/stages/set.txt diff --git a/source/aggregation/setWindowFields.txt b/source/aggregation/stages/setWindowFields.txt similarity index 100% rename from source/aggregation/setWindowFields.txt rename to source/aggregation/stages/setWindowFields.txt diff --git a/source/aggregation/stages/skip.txt b/source/aggregation/stages/skip.txt new file mode 100644 index 00000000..7948594f --- /dev/null +++ b/source/aggregation/stages/skip.txt @@ -0,0 +1,61 @@ +.. _csharp-aggregation-skip: + +==== +Skip +==== + +.. facet:: + :name: genre + :values: reference + +.. meta:: + :keywords: code example, transform, pipeline + +.. contents:: On this page + :local: + :backlinks: none + :depth: 2 + :class: singlecol + + +Skip +---- + +Use the ``skip()`` method to create a :manual:`$skip ` +pipeline stage to skip over the specified number of documents before +passing documents into the next stage. + +The following example creates a pipeline stage that skips the first ``5`` documents: + +.. literalinclude:: /includes/aggregation/Builders.cs + :start-after: // begin skip + :end-before: // end skip + :language: csharp + :dedent: + + +$skip +~~~~~ + +The ``$skip`` aggregation stage skips over a specified number of documents returned +by a query, then returns the rest of the results. The following example shows how to generate +a ``$skip`` stage using LINQ: + +.. code-block:: csharp + :emphasize-lines: 4 + + var query = queryableCollection + .Where(r => r.Cuisine == "Italian") + .Select(r => new {r.Name, r.Cuisine}) + .Skip(2); + +The preceding example skips the first two restaurants that match the criteria, and +returns the rest. The result contains the following documents: + +.. code-block:: json + + // Results Truncated + + { "name" : "Marchis Restaurant", "cuisine" : "Italian" } + { "name" : "Crystal Room", "cuisine" : "Italian" } + { "name" : "Forlinis Restaurant", "cuisine" : "Italian" } \ No newline at end of file diff --git a/source/aggregation/stages/sort.txt b/source/aggregation/stages/sort.txt new file mode 100644 index 00000000..3aea6d45 --- /dev/null +++ b/source/aggregation/stages/sort.txt @@ -0,0 +1,87 @@ +.. _csharp-aggregation-sort: + +==== +Sort +==== + +.. facet:: + :name: genre + :values: reference + +.. meta:: + :keywords: code example, transform, pipeline + +.. contents:: On this page + :local: + :backlinks: none + :depth: 2 + :class: singlecol + + +Sort +---- + +Use the ``sort()`` method to create a :manual:`$sort ` +pipeline stage to sort by the specified criteria. + +.. 
tip:: + + Though the sort criteria can be an instance of any class that + implements ``Bson``, it's convenient to combine with use of :ref:`Sorts `. + +The following example creates a pipeline stage that sorts in descending order according +to the value of the ``year`` field and then in ascending order according to the +value of the ``title`` field: + +.. literalinclude:: /includes/aggregation/Builders.cs + :start-after: // begin sortStage + :end-before: // end sortStage + :language: csharp + :dedent: + + +$sort +~~~~~ + +The ``$sort`` aggregation stage returns the results of your query in the order +that you specify. + +Select the :guilabel:`Method Syntax` or :guilabel:`Query Syntax` tab to see how +to generate an ``$sort`` stage using LINQ: + +.. tabs:: + + .. tab:: Method Syntax + :tabid: method-syntax + + .. code-block:: csharp + :emphasize-lines: 2 + + var query = queryableCollection + .OrderBy(r => r.Name) + .ThenByDescending(r => r.RestaurantId); + + .. tab:: Query Syntax + :tabid: query-syntax + + .. code-block:: csharp + :emphasize-lines: 2 + + var query = from r in queryableCollection + orderby r.Name, r.RestaurantId descending + select r; + +The preceding example returns the query results sorted alphabetically by the +``Name`` field, with a secondary descending sort on the ``RestaurantId`` field. +The following is a subset of the documents contained in the returned results: + +.. code-block:: json + + // Results Truncated + + ... + { "_id" : ObjectId(...), "name" : "Aba Turkish Restaurant", "restaurant_id" : "41548686", "cuisine" : "Turkish", "address" : {...}, "borough" : "Manhattan", "grades" : [...] } + { "_id" : ObjectId(...), "name" : "Abace Sushi", "restaurant_id" : "50006214", "cuisine" : "Japanese", "address" : { ... }, "borough" : "Manhattan", "grades" : [...] } + { "_id" : ObjectId(...), "name" : "Abacky Potluck", "restaurant_id" : "50011222", "cuisine" : "Asian", "address" : { ... }, "borough" : "Manhattan", "grades" : [...] } + { "_id" : ObjectId(...), "name" : "Abaleh", "restaurant_id" : "50009096", "cuisine" : "Mediterranean", "address" : { ... }, "borough" : "Manhattan", "grades" : [...] } + ... \ No newline at end of file diff --git a/source/aggregation/sortByCount.txt b/source/aggregation/stages/sortByCount.txt similarity index 100% rename from source/aggregation/sortByCount.txt rename to source/aggregation/stages/sortByCount.txt diff --git a/source/aggregation/unionWith.txt b/source/aggregation/stages/unionWith.txt similarity index 100% rename from source/aggregation/unionWith.txt rename to source/aggregation/stages/unionWith.txt diff --git a/source/aggregation/stages/unwind.txt b/source/aggregation/stages/unwind.txt new file mode 100644 index 00000000..fc580589 --- /dev/null +++ b/source/aggregation/stages/unwind.txt @@ -0,0 +1,186 @@ +.. _csharp-aggregation-unwind: + +====== +Unwind +====== + +.. facet:: + :name: genre + :values: reference + +.. meta:: + :keywords: code example, transform, pipeline + +.. contents:: On this page + :local: + :backlinks: none + :depth: 2 + :class: singlecol + + +Unwind +------ + +Use the ``unwind()`` method to create an :manual:`$unwind ` +pipeline stage to deconstruct an array field from input documents, creating +an output document for each array element. + +The following example creates a document for each element in the ``sizes`` array: + +.. 
literalinclude:: /includes/aggregation/Builders.cs + :start-after: // begin unwindStage + :end-before: // end unwindStage + :language: csharp + :dedent: + +To preserve documents that have missing or ``null`` +values for the array field, or where array is empty: + + +.. literalinclude:: /includes/aggregation/Builders.cs + :start-after: // begin unwindPreserve + :end-before: // end unwindPreserve + :language: csharp + :dedent: + +To include the array index, in this example in a field called ``"position"``: + +.. literalinclude:: /includes/aggregation/Builders.cs + :start-after: // begin unwindIndex + :end-before: // end unwindIndex + :language: csharp + :dedent: + +$unwind +~~~~~~~ + +The ``$unwind`` aggregation stage deconstructs a specified array field and returns +a document for each element in that array. + +Select the :guilabel:`Method Syntax` or :guilabel:`Query Syntax` tab to see how +to generate an ``$unwind`` stage using LINQ: + +.. tabs:: + + .. tab:: Method Syntax + :tabid: method-syntax + + .. code-block:: csharp + :emphasize-lines: 3 + + var query = queryableCollection + .Where(r => r.Name == "The Movable Feast") + .SelectMany(r => r.Grades); + + .. tab:: Query Syntax + :tabid: query-syntax + + .. code-block:: csharp + :emphasize-lines: 3 + + var query = from r in queryableCollection + where r.Name == "The Movable Feast" + from grade in r.Grades + select grade; + +The query in the preceding example finds the document where the ``Name`` field +has the value "The Movable Feast." Then, for each element in this document's +``Grades`` array, the query returns a new document. The result contains the +following documents: + +.. code-block:: json + + { "date" : ISODate("2014-11-19T00:00:00Z"), "grade" : "A", "score" : 11 } + { "date" : ISODate("2013-11-14T00:00:00Z"), "grade" : "A", "score" : 2 } + { "date" : ISODate("2012-12-05T00:00:00Z"), "grade" : "A", "score" : 13 } + { "date" : ISODate("2012-05-17T00:00:00Z"), "grade" : "A", "score" : 11 } + +Nested Statements ++++++++++++++++++ + +You can chain or nest ``Select`` and ``SelectMany`` statements to unwind nested +arrays. Consider a collection that contains documents with a **new** schema. These +documents contain a ``restaurants`` field, which holds an array of documents +represented by the ``Restaurant`` class. The documents within the array each have +a ``grades`` field that holds an array of documents represented by +the ``Grade`` class. The following code is an example of a single document in +this collection: + +.. code-block:: json + + { + "_id": { "$oid": ... }, + "restaurants": [ + { + "_id": { ... } , + "address": { ... }, + "name": "Tov Kosher Kitchen", + "grades": [ + { + "date" : ISODate("2014-11-24T00:00:00Z"), + "grade" : "Z", + "score" : 20.0 + }, + { + "date" : ISODate("2013-01-17T00:00:00Z"), + "grade" : "A", + "score" : 13.0 + } + ] + ... + }, + { + "_id": { ... } , + "address": { ... }, + "name": "Harriet's Kitchen", + "grades": [ + { + "date" : ISODate("2014-04-19T00:00:00Z"), + "grade" : "B", + "score" : 12.0 + } + ], + ... + }, + ... + ] + } + +You can nest ``SelectMany`` statements within ``SelectMany`` or ``Select`` +statements. The following example nests a ``SelectMany`` statement within a +``Select`` statement to retrieve an array from each document in the collection. +Each array holds all grade objects from all restaurants in each document. + +.. io-code-block:: + :copyable: true + + .. 
input:: /includes/fundamentals/code-examples/linq.cs + :language: csharp + :start-after: start-nested-SelectMany + :end-before: end-nested-SelectMany + + .. output:: + :visible: false + :language: json + + // output for first document in collection + [ + { "date" : ISODate("2014-11-24T00:00:00Z"), + "grade" : "Z", + "score" : 20.0 + }, + { "date" : ISODate("2013-01-17T00:00:00Z"), + "grade" : "A", + "score" : 13.0 + }, + { + "date" : ISODate("2014-04-19T00:00:00Z"), + "grade" : "B", + "score" : 12.0 + }, + ... + ], + // output for second document in collection + [ + ... + ] \ No newline at end of file diff --git a/source/aggregation/stages/vectorSearch.txt b/source/aggregation/stages/vectorSearch.txt new file mode 100644 index 00000000..ed85e192 --- /dev/null +++ b/source/aggregation/stages/vectorSearch.txt @@ -0,0 +1,97 @@ +.. _csharp-aggregation-vectorsearch: + +============ +VectorSearch +============ + +.. facet:: + :name: genre + :values: reference + +.. meta:: + :keywords: code example, transform, pipeline + +.. contents:: On this page + :local: + :backlinks: none + :depth: 2 + :class: singlecol + + +$vectorSearch +~~~~~~~~~~~~~ + +The ``$vectorSearch`` aggregation stage performs an *approximate nearest neighbor* search +on a vector in the specified field. Your collection *must* have a +defined Atlas Vector Search index before you can perform a vector search on your data. + +.. tip:: + + To obtain the sample dataset used in the following example, see :ref:`csharp-get-started`. + To create the sample Atlas Vector Search index used in the following example, see + :atlas:`Create an Atlas Vector Search Index ` in the + Atlas manual. + +Consider the ``embedded_movies`` collection in the ``sample_mflix`` database. You +can use a ``$vectorSearch`` stage to perform a semantic search on the ``plot_embedding`` +field of the documents in the collection. + +The following ``EmbeddedMovie`` class models the documents in the ``embedded_movies`` +collection: + +.. code-block:: csharp + + [BsonIgnoreExtraElements] + public class EmbeddedMovie + { + [BsonIgnoreIfDefault] + public string Title { get; set; } + + public string Plot { get; set; } + + [BsonElement("plot_embedding")] + public double[] Embedding { get; set; } + } + +The following example shows how to generate a ``$vectorSearch`` stage to search +the ``plot_embedding`` field using vector embeddings for the string ``"time travel"``: + +.. 
code-block:: csharp + + // Defines vector embeddings for the string "time travel" + var vector = new[] {-0.0016261312,-0.028070757,-0.011342932,-0.012775794,-0.0027440966,0.008683807,-0.02575152,-0.02020668,-0.010283281,-0.0041719596,0.021392956,0.028657231,-0.006634482,0.007490867,0.018593878,0.0038187427,0.029590257,-0.01451522,0.016061379,0.00008528442,-0.008943722,0.01627464,0.024311995,-0.025911469,0.00022596726,-0.008863748,0.008823762,-0.034921836,0.007910728,-0.01515501,0.035801545,-0.0035688248,-0.020299982,-0.03145631,-0.032256044,-0.028763862,-0.0071576433,-0.012769129,0.012322609,-0.006621153,0.010583182,0.024085402,-0.001623632,0.007864078,-0.021406285,0.002554159,0.012229307,-0.011762793,0.0051682983,0.0048484034,0.018087378,0.024325324,-0.037694257,-0.026537929,-0.008803768,-0.017767483,-0.012642504,-0.0062712682,0.0009771782,-0.010409906,0.017754154,-0.004671795,-0.030469967,0.008477209,-0.005218282,-0.0058480743,-0.020153364,-0.0032805866,0.004248601,0.0051449724,0.006791097,0.007650814,0.003458861,-0.0031223053,-0.01932697,-0.033615597,0.00745088,0.006321252,-0.0038154104,0.014555207,0.027697546,-0.02828402,0.0066711367,0.0077107945,0.01794076,0.011349596,-0.0052715978,0.014755142,-0.019753495,-0.011156326,0.011202978,0.022126047,0.00846388,0.030549942,-0.0041386373,0.018847128,-0.00033655585,0.024925126,-0.003555496,-0.019300312,0.010749794,0.0075308536,-0.018287312,-0.016567878,-0.012869096,-0.015528221,0.0078107617,-0.011156326,0.013522214,-0.020646535,-0.01211601,0.055928253,0.011596181,-0.017247654,0.0005939711,-0.026977783,-0.003942035,-0.009583511,-0.0055248477,-0.028737204,0.023179034,0.003995351,0.0219661,-0.008470545,0.023392297,0.010469886,-0.015874773,0.007890735,-0.009690142,-0.00024970944,0.012775794,0.0114762215,0.013422247,0.010429899,-0.03686786,-0.006717788,-0.027484283,0.011556195,-0.036068123,-0.013915418,-0.0016327957,0.0151016945,-0.020473259,0.004671795,-0.012555866,0.0209531,0.01982014,0.024485271,0.0105431955,-0.005178295,0.033162415,-0.013795458,0.007150979,0.010243294,0.005644808,0.017260984,-0.0045618312,0.0024725192,0.004305249,-0.008197301,0.0014203656,0.0018460588,0.005015015,-0.011142998,0.01439526,0.022965772,0.02552493,0.007757446,-0.0019726837,0.009503538,-0.032042783,0.008403899,-0.04609149,0.013808787,0.011749465,0.036388017,0.016314628,0.021939443,-0.0250051,-0.017354285,-0.012962398,0.00006107364,0.019113706,0.03081652,-0.018114036,-0.0084572155,0.009643491,-0.0034721901,0.0072642746,-0.0090636825,0.01642126,0.013428912,0.027724205,0.0071243206,-0.6858542,-0.031029783,-0.014595194,-0.011449563,0.017514233,0.01743426,0.009950057,0.0029706885,-0.015714826,-0.001806072,0.011856096,0.026444625,-0.0010663156,-0.006474535,0.0016161345,-0.020313311,0.0148351155,-0.0018393943,0.0057347785,0.018300641,-0.018647194,0.03345565,-0.008070676,0.0071443142,0.014301958,0.0044818576,0.003838736,-0.007350913,-0.024525259,-0.001142124,-0.018620536,0.017247654,0.007037683,0.010236629,0.06046009,0.0138887605,-0.012122675,0.037694257,0.0055081863,0.042492677,0.00021784494,-0.011656162,0.010276617,0.022325981,0.005984696,-0.009496873,0.013382261,-0.0010563189,0.0026507939,-0.041639622,0.008637156,0.026471283,-0.008403899,0.024858482,-0.00066686375,-0.0016252982,0.027590916,0.0051449724,0.0058647357,-0.008743787,-0.014968405,0.027724205,-0.011596181,0.0047650975,-0.015381602,0.0043718936,0.002159289,0.035908177,-0.008243952,-0.030443309,0.027564257,0.042625964,-0.0033688906,0.01843393,0.019087048,0.024578573,0.03268257,-0.015608194,-0.014128681,-0.0033538956
,-0.0028757197,-0.004121976,-0.032389335,0.0034322033,0.058807302,0.010943064,-0.030523283,0.008903735,0.017500903,0.00871713,-0.0029406983,0.013995391,-0.03132302,-0.019660193,-0.00770413,-0.0038853872,0.0015894766,-0.0015294964,-0.006251275,-0.021099718,-0.010256623,-0.008863748,0.028550599,0.02020668,-0.0012962399,-0.003415542,-0.0022509254,0.0119360695,0.027590916,-0.046971202,-0.0015194997,-0.022405956,0.0016677842,-0.00018535563,-0.015421589,-0.031802863,0.03814744,0.0065411795,0.016567878,-0.015621523,0.022899127,-0.011076353,0.02841731,-0.002679118,-0.002342562,0.015341615,0.01804739,-0.020566562,-0.012989056,-0.002990682,0.01643459,0.00042527664,0.008243952,-0.013715484,-0.004835075,-0.009803439,0.03129636,-0.021432944,0.0012087687,-0.015741484,-0.0052016205,0.00080890034,-0.01755422,0.004811749,-0.017967418,-0.026684547,-0.014128681,0.0041386373,-0.013742141,-0.010056688,-0.013268964,-0.0110630235,-0.028337335,0.015981404,-0.00997005,-0.02424535,-0.013968734,-0.028310679,-0.027750863,-0.020699851,0.02235264,0.001057985,0.00081639783,-0.0099367285,0.013522214,-0.012016043,-0.00086471526,0.013568865,0.0019376953,-0.019020405,0.017460918,-0.023045745,0.008503866,0.0064678704,-0.011509543,0.018727167,-0.003372223,-0.0028690554,-0.0027024434,-0.011902748,-0.012182655,-0.015714826,-0.0098634185,0.00593138,0.018753825,0.0010146659,0.013029044,0.0003521757,-0.017620865,0.04102649,0.00552818,0.024485271,-0.009630162,-0.015608194,0.0006718621,-0.0008418062,0.012395918,0.0057980907,0.016221326,0.010616505,0.004838407,-0.012402583,0.019900113,-0.0034521967,0.000247002,-0.03153628,0.0011038032,-0.020819811,0.016234655,-0.00330058,-0.0032289368,0.00078973995,-0.021952773,-0.022459272,0.03118973,0.03673457,-0.021472929,0.0072109587,-0.015075036,0.004855068,-0.0008151483,0.0069643734,0.010023367,-0.010276617,-0.023019087,0.0068244194,-0.0012520878,-0.0015086699,0.022046074,-0.034148756,-0.0022192693,0.002427534,-0.0027124402,0.0060346797,0.015461575,0.0137554705,0.009230294,-0.009583511,0.032629255,0.015994733,-0.019167023,-0.009203636,0.03393549,-0.017274313,-0.012042701,-0.0009930064,0.026777849,-0.013582194,-0.0027590916,-0.017594207,-0.026804507,-0.0014236979,-0.022032745,0.0091236625,-0.0042419364,-0.00858384,-0.0033905501,-0.020739838,0.016821127,0.022539245,0.015381602,0.015141681,0.028817179,-0.019726837,-0.0051283115,-0.011489551,-0.013208984,-0.0047017853,-0.0072309524,0.01767418,0.0025658219,-0.010323267,0.012609182,-0.028097415,0.026871152,-0.010276617,0.021912785,0.0022542577,0.005124979,-0.0019710176,0.004518512,-0.040360045,0.010969722,-0.0031539614,-0.020366628,-0.025778178,-0.0110030435,-0.016221326,0.0036587953,0.016207997,0.003007343,-0.0032555948,0.0044052163,-0.022046074,-0.0008822095,-0.009363583,0.028230704,-0.024538586,0.0029840174,0.0016044717,-0.014181997,0.031349678,-0.014381931,-0.027750863,0.02613806,0.0004136138,-0.005748107,-0.01868718,-0.0010138329,0.0054348772,0.010703143,-0.003682121,0.0030856507,-0.004275259,-0.010403241,0.021113047,-0.022685863,-0.023032416,0.031429652,0.001792743,-0.005644808,-0.011842767,-0.04078657,-0.0026874484,0.06915057,-0.00056939584,-0.013995391,0.010703143,-0.013728813,-0.022939114,-0.015261642,-0.022485929,0.016807798,0.007964044,0.0144219175,0.016821127,0.0076241563,0.005461535,-0.013248971,0.015301628,0.0085171955,-0.004318578,0.011136333,-0.0059047225,-0.010249958,-0.018207338,0.024645219,0.021752838,0.0007614159,-0.013648839,0.01111634,-0.010503208,-0.0038487327,-0.008203966,-0.00397869,0.0029740208,0.008530525,0.005261601,0.01642
126,-0.0038753906,-0.013222313,0.026537929,0.024671877,-0.043505676,0.014195326,0.024778508,0.0056914594,-0.025951454,0.017620865,-0.0021359634,0.008643821,0.021299653,0.0041686273,-0.009017031,0.04044002,0.024378639,-0.027777521,-0.014208655,0.0028623908,0.042119466,0.005801423,-0.028124074,-0.03129636,0.022139376,-0.022179363,-0.04067994,0.013688826,0.013328944,0.0046184794,-0.02828402,-0.0063412455,-0.0046184794,-0.011756129,-0.010383247,-0.0018543894,-0.0018593877,-0.00052024535,0.004815081,0.014781799,0.018007403,0.01306903,-0.020433271,0.009043689,0.033189073,-0.006844413,-0.019766824,-0.018767154,0.00533491,-0.0024575242,0.018727167,0.0058080875,-0.013835444,0.0040719924,0.004881726,0.012029372,0.005664801,0.03193615,0.0058047553,0.002695779,0.009290274,0.02361889,0.017834127,0.0049017193,-0.0036388019,0.010776452,-0.019793482,0.0067777685,-0.014208655,-0.024911797,0.002385881,0.0034988478,0.020899786,-0.0025858153,-0.011849431,0.033189073,-0.021312982,0.024965113,-0.014635181,0.014048708,-0.0035921505,-0.003347231,0.030869836,-0.0017161017,-0.0061346465,0.009203636,-0.025165047,0.0068510775,0.021499587,0.013782129,-0.0024475274,-0.0051149824,-0.024445284,0.006167969,0.0068844,-0.00076183246,0.030150073,-0.0055948244,-0.011162991,-0.02057989,-0.009703471,-0.020646535,0.008004031,0.0066378145,-0.019900113,-0.012169327,-0.01439526,0.0044252095,-0.004018677,0.014621852,-0.025085073,-0.013715484,-0.017980747,0.0071043274,0.011456228,-0.01010334,-0.0035321703,-0.03801415,-0.012036037,-0.0028990454,-0.05419549,-0.024058744,-0.024272008,0.015221654,0.027964126,0.03182952,-0.015354944,0.004855068,0.011522872,0.004771762,0.0027874154,0.023405626,0.0004242353,-0.03132302,0.007057676,0.008763781,-0.0027057757,0.023005757,-0.0071176565,-0.005238275,0.029110415,-0.010989714,0.013728813,-0.009630162,-0.029137073,-0.0049317093,-0.0008630492,-0.015248313,0.0043219104,-0.0055681667,-0.013175662,0.029723546,0.025098402,0.012849103,-0.0009996708,0.03118973,-0.0021709518,0.0260181,-0.020526575,0.028097415,-0.016141351,0.010509873,-0.022965772,0.002865723,0.0020493253,0.0020509914,-0.0041419696,-0.00039695262,0.017287642,0.0038987163,0.014795128,-0.014661839,-0.008950386,0.004431874,-0.009383577,0.0012604183,-0.023019087,0.0029273694,-0.033135757,0.009176978,-0.011023037,-0.002102641,0.02663123,-0.03849399,-0.0044152127,0.0004527676,-0.0026924468,0.02828402,0.017727496,0.035135098,0.02728435,-0.005348239,-0.001467017,-0.019766824,0.014715155,0.011982721,0.0045651635,0.023458943,-0.0010046692,-0.0031373003,-0.0006972704,0.0019043729,-0.018967088,-0.024311995,0.0011546199,0.007977373,-0.004755101,-0.010016702,-0.02780418,-0.004688456,0.013022379,-0.005484861,0.0017227661,-0.015394931,-0.028763862,-0.026684547,0.0030589928,-0.018513903,0.028363993,0.0044818576,-0.009270281,0.038920518,-0.016008062,0.0093902415,0.004815081,-0.021059733,0.01451522,-0.0051583014,0.023765508,-0.017874114,-0.016821127,-0.012522544,-0.0028390652,0.0040886537,0.020259995,-0.031216389,-0.014115352,-0.009176978,0.010303274,0.020313311,0.0064112223,-0.02235264,-0.022872468,0.0052449396,0.0005723116,0.0037321046,0.016807798,-0.018527232,-0.009303603,0.0024858483,-0.0012662497,-0.007110992,0.011976057,-0.007790768,-0.042999174,-0.006727785,-0.011829439,0.007024354,0.005278262,-0.017740825,-0.0041519664,0.0085905045,0.027750863,-0.038387362,0.024391968,0.00087721116,0.010509873,-0.00038508154,-0.006857742,0.0183273,-0.0037054466,0.015461575,0.0017394272,-0.0017944091,0.014181997,-0.0052682655,0.009023695,0.00719763,-0.013522214,0.003442
2,0.014941746,-0.0016711164,-0.025298337,-0.017634194,0.0058714002,-0.005321581,0.017834127,0.0110630235,-0.03369557,0.029190388,-0.008943722,0.009363583,-0.0034222065,-0.026111402,-0.007037683,-0.006561173,0.02473852,-0.007084334,-0.010110005,-0.008577175,0.0030439978,-0.022712521,0.0054582027,-0.0012620845,-0.0011954397,-0.015741484,0.0129557345,-0.00042111133,0.00846388,0.008930393,0.016487904,0.010469886,-0.007917393,-0.011762793,-0.0214596,0.000917198,0.021672864,0.010269952,-0.007737452,-0.010243294,-0.0067244526,-0.015488233,-0.021552904,0.017127695,0.011109675,0.038067464,0.00871713,-0.0025591573,0.021312982,-0.006237946,0.034628596,-0.0045251767,0.008357248,0.020686522,0.0010696478,0.0076708077,0.03772091,-0.018700508,-0.0020676525,-0.008923728,-0.023298996,0.018233996,-0.010256623,0.0017860786,0.009796774,-0.00897038,-0.01269582,-0.018527232,0.009190307,-0.02372552,-0.042119466,0.008097334,-0.0066778013,-0.021046404,0.0019593548,0.011083017,-0.0016028056,0.012662497,-0.000059095124,0.0071043274,-0.014675168,0.024831824,-0.053582355,0.038387362,0.0005698124,0.015954746,0.021552904,0.031589597,-0.009230294,-0.0006147976,0.002625802,-0.011749465,-0.034362018,-0.0067844326,-0.018793812,0.011442899,-0.008743787,0.017474247,-0.021619547,0.01831397,-0.009037024,-0.0057247817,-0.02728435,0.010363255,0.034415334,-0.024032086,-0.0020126705,-0.0045518344,-0.019353628,-0.018340627,-0.03129636,-0.0034038792,-0.006321252,-0.0016161345,0.033642255,-0.000056075285,-0.005005019,0.004571828,-0.0024075406,-0.00010215386,0.0098634185,0.1980148,-0.003825407,-0.025191706,0.035161756,0.005358236,0.025111731,0.023485601,0.0023342315,-0.011882754,0.018287312,-0.0068910643,0.003912045,0.009243623,-0.001355387,-0.028603915,-0.012802451,-0.030150073,-0.014795128,-0.028630573,-0.0013487226,0.002667455,0.00985009,-0.0033972147,-0.021486258,0.009503538,-0.017847456,0.013062365,-0.014341944,0.005078328,0.025165047,-0.015594865,-0.025924796,-0.0018177348,0.010996379,-0.02993681,0.007324255,0.014475234,-0.028577257,0.005494857,0.00011725306,-0.013315615,0.015941417,0.009376912,0.0025158382,0.008743787,0.023832154,-0.008084005,-0.014195326,-0.008823762,0.0033455652,-0.032362677,-0.021552904,-0.0056081535,0.023298996,-0.025444955,0.0097301295,0.009736794,0.015274971,-0.0012937407,-0.018087378,-0.0039387033,0.008637156,-0.011189649,-0.00023846315,-0.011582852,0.0066411467,-0.018220667,0.0060846633,0.0376676,-0.002709108,0.0072776037,0.0034188742,-0.010249958,-0.0007747449,-0.00795738,-0.022192692,0.03910712,0.032122757,0.023898797,0.0076241563,-0.007397564,-0.003655463,0.011442899,-0.014115352,-0.00505167,-0.031163072,0.030336678,-0.006857742,-0.022259338,0.004048667,0.02072651,0.0030156737,-0.0042119464,0.00041861215,-0.005731446,0.011103011,0.013822115,0.021512916,0.009216965,-0.006537847,-0.027057758,-0.04054665,0.010403241,-0.0056281467,-0.005701456,-0.002709108,-0.00745088,-0.0024841821,0.009356919,-0.022659205,0.004061996,-0.013175662,0.017074378,-0.006141311,-0.014541878,0.02993681,-0.00028448965,-0.025271678,0.011689484,-0.014528549,0.004398552,-0.017274313,0.0045751603,0.012455898,0.004121976,-0.025458284,-0.006744446,0.011822774,-0.015035049,-0.03257594,0.014675168,-0.0039187097,0.019726837,-0.0047251107,0.0022825818,0.011829439,0.005391558,-0.016781142,-0.0058747325,0.010309938,-0.013049036,0.01186276,-0.0011246296,0.0062112883,0.0028190718,-0.021739509,0.009883412,-0.0073175905,-0.012715813,-0.017181009,-0.016607866,-0.042492677,-0.0014478565,-0.01794076,0.012302616,-0.015194997,-0.04433207,-0.020606548,0
.009696807,0.010303274,-0.01694109,-0.004018677,0.019353628,-0.001991011,0.000058938927,0.010536531,-0.17274313,0.010143327,0.014235313,-0.024152048,0.025684876,-0.0012504216,0.036601283,-0.003698782,0.0007310093,0.004165295,-0.0029157067,0.017101036,-0.046891227,-0.017460918,0.022965772,0.020233337,-0.024072073,0.017220996,0.009370248,0.0010363255,0.0194336,-0.019606877,0.01818068,-0.020819811,0.007410893,0.0019326969,0.017887443,0.006651143,0.00067394477,-0.011889419,-0.025058415,-0.008543854,0.021579562,0.0047484366,0.014062037,0.0075508473,-0.009510202,-0.009143656,0.0046817916,0.013982063,-0.0027990784,0.011782787,0.014541878,-0.015701497,-0.029350337,0.021979429,0.01332228,-0.026244693,-0.0123492675,-0.003895384,0.0071576433,-0.035454992,-0.00046984528,0.0033522295,0.039347045,0.0005119148,0.00476843,-0.012995721,0.0024042083,-0.006931051,-0.014461905,-0.0127558,0.0034555288,-0.0074842023,-0.030256703,-0.007057676,-0.00807734,0.007804097,-0.006957709,0.017181009,-0.034575284,-0.008603834,-0.005008351,-0.015834786,0.02943031,0.016861115,-0.0050849924,0.014235313,0.0051449724,0.0025924798,-0.0025741523,0.04289254,-0.002104307,0.012969063,-0.008310596,0.00423194,0.0074975314,0.0018810473,-0.014248641,-0.024725191,0.0151016945,-0.017527562,0.0018727167,0.0002830318,0.015168339,0.0144219175,-0.004048667,-0.004358565,0.011836103,-0.010343261,-0.005911387,0.0022825818,0.0073175905,0.00403867,0.013188991,0.03334902,0.006111321,0.008597169,0.030123414,-0.015474904,0.0017877447,-0.024551915,0.013155668,0.023525586,-0.0255116,0.017220996,0.004358565,-0.00934359,0.0099967085,0.011162991,0.03092315,-0.021046404,-0.015514892,0.0011946067,-0.01816735,0.010876419,-0.10124666,-0.03550831,0.0056348112,0.013942076,0.005951374,0.020419942,-0.006857742,-0.020873128,-0.021259667,0.0137554705,0.0057880944,-0.029163731,-0.018767154,-0.021392956,0.030896494,-0.005494857,-0.0027307675,-0.006801094,-0.014821786,0.021392956,-0.0018110704,-0.0018843795,-0.012362596,-0.0072176233,-0.017194338,-0.018713837,-0.024272008,0.03801415,0.00015880188,0.0044951867,-0.028630573,-0.0014070367,-0.00916365,-0.026537929,-0.009576847,-0.013995391,-0.0077107945,0.0050016865,0.00578143,-0.04467862,0.008363913,0.010136662,-0.0006268769,-0.006591163,0.015341615,-0.027377652,-0.00093136,0.029243704,-0.020886457,-0.01041657,-0.02424535,0.005291591,-0.02980352,-0.009190307,0.019460259,-0.0041286405,0.004801752,0.0011787785,-0.001257086,-0.011216307,-0.013395589,0.00088137644,-0.0051616337,0.03876057,-0.0033455652,0.00075850025,-0.006951045,-0.0062112883,0.018140694,-0.006351242,-0.008263946,0.018154023,-0.012189319,0.0075508473,-0.044358727,-0.0040153447,0.0093302615,-0.010636497,0.032789204,-0.005264933,-0.014235313,-0.018393943,0.007297597,-0.016114693,0.015021721,0.020033404,0.0137688,0.0011046362,0.010616505,-0.0039453674,0.012109346,0.021099718,-0.0072842683,-0.019153694,-0.003768759,0.039320387,-0.006747778,-0.0016852784,0.018154023,0.0010963057,-0.015035049,-0.021033075,-0.04345236,0.017287642,0.016341286,-0.008610498,0.00236922,0.009290274,0.028950468,-0.014475234,-0.0035654926,0.015434918,-0.03372223,0.004501851,-0.012929076,-0.008483873,-0.0044685286,-0.0102233,0.01615468,0.0022792495,0.010876419,-0.0059647025,0.01895376,-0.0069976957,-0.0042952523,0.017207667,-0.00036133936,0.0085905045,0.008084005,0.03129636,-0.016994404,-0.014915089,0.020100048,-0.012009379,-0.006684466,0.01306903,0.00015765642,-0.00530492,0.0005277429,0.015421589,0.015528221,0.032202728,-0.003485519,-0.0014286962,0.033908837,0.001367883,0.010509873,0.0252
71678,-0.020993087,0.019846799,0.006897729,-0.010216636,-0.00725761,0.01818068,-0.028443968,-0.011242964,-0.014435247,-0.013688826,0.006101324,-0.0022509254,0.013848773,-0.0019077052,0.017181009,0.03422873,0.005324913,-0.0035188415,0.014128681,-0.004898387,0.005038341,0.0012320944,-0.005561502,-0.017847456,0.0008538855,-0.0047884234,0.011849431,0.015421589,-0.013942076,0.0029790192,-0.013702155,0.0001199605,-0.024431955,0.019926772,0.022179363,-0.016487904,-0.03964028,0.0050849924,0.017487574,0.022792496,0.0012504216,0.004048667,-0.00997005,0.0076041627,-0.014328616,-0.020259995,0.0005598157,-0.010469886,0.0016852784,0.01716768,-0.008990373,-0.001987679,0.026417969,0.023792166,0.0046917885,-0.0071909656,-0.00032051947,-0.023259008,-0.009170313,0.02071318,-0.03156294,-0.030869836,-0.006324584,0.013795458,-0.00047151142,0.016874444,0.00947688,0.00985009,-0.029883493,0.024205362,-0.013522214,-0.015075036,-0.030603256,0.029270362,0.010503208,0.021539574,0.01743426,-0.023898797,0.022019416,-0.0068777353,0.027857494,-0.021259667,0.0025758184,0.006197959,0.006447877,-0.00025200035,-0.004941706,-0.021246338,-0.005504854,-0.008390571,-0.0097301295,0.027244363,-0.04446536,0.05216949,0.010243294,-0.016008062,0.0122493,-0.0199401,0.009077012,0.019753495,0.006431216,-0.037960835,-0.027377652,0.016381273,-0.0038620618,0.022512587,-0.010996379,-0.0015211658,-0.0102233,0.007071005,0.008230623,-0.009490209,-0.010083347,0.024431955,0.002427534,0.02828402,0.0035721571,-0.022192692,-0.011882754,0.010056688,0.0011904413,-0.01426197,-0.017500903,-0.00010985966,0.005591492,-0.0077707744,-0.012049366,0.011869425,0.00858384,-0.024698535,-0.030283362,0.020140035,0.011949399,-0.013968734,0.042732596,-0.011649498,-0.011982721,-0.016967745,-0.0060913274,-0.007130985,-0.013109017,-0.009710136}; + + // Specifies that the vector search will consider the 150 nearest neighbors + // in the specified index + var options = new VectorSearchOptions() + { + IndexName = "vector_index", + NumberOfCandidates = 150 + }; + + // Builds aggregation pipeline and specifies that the $vectorSearch stage + // returns 10 results + var results = queryableCollection + .VectorSearch(m => m.Embedding, vector, 10, options) + .Select(m => new { m.Title, m.Plot }); + +The results of the preceding example contain the following documents: + +.. code-block:: json + + { "_id" : ObjectId("573a13a0f29313caabd04a4f"), "plot" : "A reporter, learning of time travelers visiting 20th century disasters, tries to change the history they know by averting upcoming disasters.", "title" : "Thrill Seekers" } + { "_id" : ObjectId("573a13d8f29313caabda6557"), "plot" : "At the age of 21, Tim discovers he can travel in time and change what happens and has happened in his own life. His decision to make his world a better place by getting a girlfriend turns out not to be as easy as you might think.", "title" : "About Time" } + { "_id" : ObjectId("573a13a5f29313caabd13b4b"), "plot" : "Hoping to alter the events of the past, a 19th century inventor instead travels 800,000 years into the future, where he finds humankind divided into two warring races.", "title" : "The Time Machine" } + { "_id" : ObjectId("573a13aef29313caabd2e2d7"), "plot" : "After using his mother's newly built time machine, Dolf gets stuck involuntary in the year 1212. 
He ends up in a children's crusade where he confronts his new friends with modern techniques...", "title" : "Crusade in Jeans" } + { "_id" : ObjectId("573a1399f29313caabceec0e"), "plot" : "An officer for a security agency that regulates time travel, must fend for his life against a shady politician who has a tie to his past.", "title" : "Timecop" } + { "_id" : ObjectId("573a1399f29313caabcee36f"), "plot" : "A time-travel experiment in which a robot probe is sent from the year 2073 to the year 1973 goes terribly wrong thrusting one of the project scientists, a man named Nicholas Sinclair into a...", "title" : "A.P.E.X." } + { "_id" : ObjectId("573a13c6f29313caabd715d3"), "plot" : "Agent J travels in time to M.I.B.'s early days in 1969 to stop an alien from assassinating his friend Agent K and changing history.", "title" : "Men in Black 3" } + { "_id" : ObjectId("573a13d4f29313caabd98c13"), "plot" : "Bound by a shared destiny, a teen bursting with scientific curiosity and a former boy-genius inventor embark on a mission to unearth the secrets of a place somewhere in time and space that exists in their collective memory.", "title" : "Tomorrowland" } + { "_id" : ObjectId("573a13b6f29313caabd477fa"), "plot" : "With the help of his uncle, a man travels to the future to try and bring his girlfriend back to life.", "title" : "Love Story 2050" } + { "_id" : ObjectId("573a13e5f29313caabdc40c9"), "plot" : "A dimension-traveling wizard gets stuck in the 21st century because cell-phone radiation interferes with his magic. With his home world on the brink of war, he seeks help from a jaded ...", "title" : "The Portal" } + +For more information about Atlas Vector Search, Atlas Vector Search indexes, and how +to incorporate them into your application, see :atlas:`Atlas Vector Search Overview ` +in the Atlas manual. For more examples about running Atlas Vector Search queries using the +{+driver-short+}, see :atlas:`Run Vector Search Queries ` +in the Atlas manual and select :guilabel:`C#` from the language dropdown. \ No newline at end of file diff --git a/source/aggregation/unwind.txt b/source/aggregation/unwind.txt deleted file mode 100644 index 3dbd5d84..00000000 --- a/source/aggregation/unwind.txt +++ /dev/null @@ -1,52 +0,0 @@ -.. _csharp-aggregation-unwind: - -====== -Unwind -====== - -.. facet:: - :name: genre - :values: reference - -.. meta:: - :keywords: code example, transform, pipeline - -.. contents:: On this page - :local: - :backlinks: none - :depth: 2 - :class: singlecol - - -Unwind ------- - -Use the ``unwind()`` method to create an :manual:`$unwind ` -pipeline stage to deconstruct an array field from input documents, creating -an output document for each array element. - -The following example creates a document for each element in the ``sizes`` array: - -.. literalinclude:: /includes/aggregation/Builders.cs - :start-after: // begin unwindStage - :end-before: // end unwindStage - :language: csharp - :dedent: - -To preserve documents that have missing or ``null`` -values for the array field, or where array is empty: - - -.. literalinclude:: /includes/aggregation/Builders.cs - :start-after: // begin unwindPreserve - :end-before: // end unwindPreserve - :language: csharp - :dedent: - -To include the array index, in this example in a field called ``"position"``: - -.. 
literalinclude:: /includes/aggregation/Builders.cs - :start-after: // begin unwindIndex - :end-before: // end unwindIndex - :language: csharp - :dedent: diff --git a/source/aggregation/vectorSearch.txt b/source/aggregation/vectorSearch.txt deleted file mode 100644 index b04cf3a3..00000000 --- a/source/aggregation/vectorSearch.txt +++ /dev/null @@ -1,18 +0,0 @@ -.. _csharp-aggregation-vectorsearch: - -============ -VectorSearch -============ - -.. facet:: - :name: genre - :values: reference - -.. meta:: - :keywords: code example, transform, pipeline - -.. contents:: On this page - :local: - :backlinks: none - :depth: 2 - :class: singlecol \ No newline at end of file From aa37a1b6d94b9a28d4bcd3ace63784f3c60b1b6f Mon Sep 17 00:00:00 2001 From: Mike Woofter <108414937+mongoKart@users.noreply.github.com> Date: Mon, 14 Apr 2025 15:44:47 -0500 Subject: [PATCH 05/17] wip --- snooty.toml | 3 +- source/aggregation.txt | 250 +---------------------------- source/aggregation/operators.txt | 263 ------------------------------- source/aggregation/stages.txt | 204 ++++++++++++++++-------- 4 files changed, 144 insertions(+), 576 deletions(-) delete mode 100644 source/aggregation/operators.txt diff --git a/snooty.toml b/snooty.toml index 61696522..33ce9b2c 100644 --- a/snooty.toml +++ b/snooty.toml @@ -1,7 +1,8 @@ toc_landing_pages = [ "/get-started", "/connect/connection-options", - "/security/authentication" + "/security/authentication", + "/aggregation" ] name = "csharp" title = "C#/.NET" diff --git a/source/aggregation.txt b/source/aggregation.txt index d418d36f..bd26a7bb 100644 --- a/source/aggregation.txt +++ b/source/aggregation.txt @@ -21,7 +21,7 @@ Aggregation Operations :titlesonly: :maxdepth: 1 - Aggregation Stages + Stages Overview -------- @@ -97,226 +97,11 @@ performing aggregation operations: property of the ``AggregateOptions`` object that you pass to the ``Aggregate()`` method. Link to Stages page -Link to Operators page -Build an Aggregation Pipeline ------------------------------ - -The following sections describe the different ways to build an aggregation -pipeline by using the {+driver-long+}. - -Builders -~~~~~~~~ - -You can create an aggregation pipeline in the following ways: - -- Create an ``EmptyPipelineDefinition`` object and chain calls to the relevant - aggregation methods. Then, pass the pipeline object to the ``IMongoCollection.Aggregate()`` - method. as shown in the following example: - -- Chain aggregation methods directly from the call to the - ``IMongoCollection.Aggregate()`` method. - -Select the :guilabel:`EmptyPipelineDefinition` -or :guilabel:`Aggregate` tab to see the corresponding code: - -.. tabs:: - - .. tab:: EmptyPipelineDefinition - :tabid: empty-pipeline-definition - - .. code-block:: csharp - - // Defines the aggregation pipeline - var pipeline = new EmptyPipelineDefinition() - .Match(...) - .Group(...) - .Merge(...); - - // Executes the aggregation pipeline - var results = collection.Aggregate(pipeline).ToList(); - - .. tab:: Aggregate - :tabid: aggregate - - .. code-block:: csharp - - // Defines and executes the aggregation pipeline - var pipeline = collection.Aggregate() - .Match(...) - .Group(...) - .Merge(...); - -LINQ -~~~~ - -You can use LINQ to create an :ref:`aggregation pipeline `. -The {+driver-short+} automatically translates each LINQ statement into the corresponding -aggregation pipeline stages. In this section you can learn which -aggregation pipeline stages are supported. 
- -To learn more about the aggregation pipeline stages, see the -:ref:`aggregation-pipeline-operator-reference` page in the server manual. - -In this guide you can learn how to use -`LINQ `__ -with the {+driver-long+}. LINQ allows you to construct queries against -strongly typed collections of objects by using language keywords and operators. -The {+driver-short+} automatically translates LINQ queries into -:manual:`aggregation operations `. - -.. important:: - - LINQ3 is the only LINQ provider available in the {+driver-long+}. If you have - manually configured your project to use LINQ2, it will not compile. - -To use LINQ to query your collection, you must first create an -an `IQueryable -`__ -object that links to the collection. To create the object, use the ``AsQueryable()`` method -as follows: - -.. code-block:: csharp - :emphasize-lines: 3 - - var restaurantsDatabase = client.GetDatabase("sample_restaurants"); - var restaurantsCollection = restaurantsDatabase.GetCollection("restaurants"); - var queryableCollection = restaurantsCollection.AsQueryable(); - -Once you have the queryable object, you can compose a query using -**method syntax**. Some pipeline stages also support **query comprehension syntax**, -which resembles SQL query syntax. - -Select the :guilabel:`Method Syntax` or :guilabel:`Query Syntax` tab to see -how to compose a query using LINQ: - -.. tabs:: - - .. tab:: Method Syntax - :tabid: method-syntax - - .. code-block:: csharp - - var query = queryableCollection - .Where(r => r.Name == "The Movable Feast") - .Select(r => new { r.Name, r.Address }); - - .. tab:: Query Syntax - :tabid: query-syntax - - .. code-block:: csharp - - var query = from r in queryableCollection - where r.Name == "The Movable Feast" - select new { r.Name, r.Address }; - -You can print the results of the preceding example as follows: - -.. io-code-block:: - - .. input:: - :language: csharp - - foreach (var restaurant in query) - { - Console.WriteLine(restaurant.ToJson()); - } - - .. output:: - - { "name" : "The Movable Feast", "address" : { "building" : "284", "coord" : [-73.982923900000003, 40.6580753], "street" : "Prospect Park West", "zipcode" : "11215" } } - -.. tip:: Accessing Query Results - - You can also access the results of your query by using the ``ToList()`` or - ``ToCursor()`` methods: - - .. code-block:: csharp - - var results = query.ToList(); - - .. code-block:: csharp - - var results = query.ToCursor(); - -View Translated Queries ------------------------ - -When you run a LINQ query, the {+driver-short+} automatically translates your -query into an aggregation pipeline written with the {+query-api+}. You can view -the translated query by using the ``ToString()`` method or the -``LoggedStages`` property. - -To see the translated query for **non-scalar operations**, use the ``ToString()`` -method. Non-scalar operations are operations that return a query object, such -as: - -- ``Where`` -- ``Select`` -- ``SelectMany`` -- ``GroupJoin`` - -The following example calls the ``ToString()`` method on a LINQ query and prints -the translated query: - -.. io-code-block:: - - .. input:: - :language: csharp - - var queryableCollection = _restaurantsCollection.AsQueryable(); - var query = queryableCollection - .Where(r => r.Name == "The Movable Feast"); - - var queryTranslated = query.ToString(); - Console.WriteLine(queryTranslated); - - .. 
output:: - - sample_restaurants.restaurants.Aggregate([{ "$match" : { "name" : "The Movable Feast" } }]) - -To get the translated query for **scalar operations** use the ``LoggedStages`` -property. Scalar operations are operations that return a scalar result rather than a -query object, such as: - -- ``First`` -- ``Sum`` -- ``Count`` -- ``Min`` -- ``Max`` - -To get a translated query with the ``LoggedStages`` property, you must save -the translated query directly after it is executed, and before executing any -other queries with the same queryable object. - -The following example uses the ``LoggedStages`` property on a LINQ query that -uses a scalar operation, then prints the translated query: - -.. io-code-block:: - - .. input:: - :language: csharp - :emphasize-lines: 6 - - - var queryableCollection = _restaurantsCollection.AsQueryable(); - var query = queryableCollection - .Where(r => r.Name == "The Movable Feast"); - - var result = query.FirstOrDefault(); - var queryTranslated = query.LoggedStages; - - Console.WriteLine(queryTranslated.ToJson()); - - .. output:: - - [{ "$match" : { "name" : "The Movable Feast" } }, { "$limit" : NumberLong(1) }] - -.. important:: +Troubleshooting +--------------- - ``LoggedStages`` is not thread-safe. Executing a query and accessing the - associated ``LoggedStages`` property from multiple threads might have - non-deterministic results. +.. include:: /includes/troubleshooting/unsupported-filter-expression.rst Additional Information ---------------------- @@ -349,29 +134,4 @@ following API documentation: - `Match() <{+new-api-root+}/MongoDB.Driver/MongoDB.Driver.PipelineStageDefinitionBuilder.Match.html>`__ - `Where() <{+new-api-root+}/MongoDB.Driver/MongoDB.Driver.Linq.MongoQueryable.Where.html>`__ - `GroupBy() <{+new-api-root+}/MongoDB.Driver/MongoDB.Driver.Linq.MongoQueryable.GroupBy.html>`__ -- `Select() <{+new-api-root+}/MongoDB.Driver/MongoDB.Driver.Linq.MongoQueryable.Select.html>`__ -- `PipelineDefinitionBuilder <{+new-api-root+}/MongoDB.Driver/MongoDB.Driver.PipelineDefinitionBuilder.html>`__ - -.. TODO: integrate into existing page - -Unsupported Aggregation Stages ------------------------------- - -The {+driver-long+} implementation of LINQ does not support the following -aggregation stages: - -- ``$redact`` -- ``$geoNear`` -- ``$out`` - -To learn how to create an aggregation pipeline with the ``$out`` stage by using Builders, see -the :ref:`` section. - - - - - -Troubleshooting ---------------- - -.. include:: /includes/troubleshooting/unsupported-filter-expression.rst \ No newline at end of file +- `Select() <{+new-api-root+}/MongoDB.Driver/MongoDB.Driver.Linq.MongoQueryable.Select.html>`__ \ No newline at end of file diff --git a/source/aggregation/operators.txt b/source/aggregation/operators.txt deleted file mode 100644 index a39eaf6d..00000000 --- a/source/aggregation/operators.txt +++ /dev/null @@ -1,263 +0,0 @@ -Supported Methods ------------------ - -The following are some methods supported by the {+driver-long+} implementation of LINQ: - -.. 
list-table:: - :header-rows: 1 - :widths: 40 60 - - * - Method Name - - Description - - * - ``Any`` - - Determines if any documents match the specified criteria - - * - ``Average`` - - Calculates the average of the specified fields - - * - ``Count`` - - Returns an ``Int32`` that represents the number of documents that match the specified criteria - - * - ``LongCount`` - - Returns an ``Int64`` that represents the number of documents that match the specified criteria - - * - ``DateFromString`` - - Converts a ``string`` to a ``DateTime`` object - - * - ``Distinct`` - - Returns distinct documents that match the specified criteria - - * - ``DistinctMany`` - - Returns distinct documents from an array that match the specified criteria - - * - ``Exists`` - - Tests whether a field exists - - * - ``First`` - - Returns the first matching document, and throws an exception if none are found - - * - ``FirstOrDefault`` - - Returns the first matching document, or ``null`` if none are found - - * - ``GroupBy`` - - Groups documents based on specified criteria - - * - ``GroupJoin`` - - Performs a left outer join to another collection in the same database - - * - ``IsMissing`` - - Returns ``true`` if a field is missing and false otherwies - - * - ``IsNullOrMissing`` - - Returns ``true`` if a field is null or missing and false otherwise - - * - ``Max`` - - Returns the document with the maximum specified value - - * - ``OfType`` - - Returns documents that match the specified type - - * - ``OrderBy``, ``OrderByDescending`` - - Returns results in a specified sort order - - * - ``ThenBy``, ``ThenByDescending`` - - Allows a secondary sort to be specified - - * - ``Select`` - - Selects documents based on specified criteria - - * - ``SelectMany`` - - Projects each element of a sequence and combines the resulting sequences into one document - - * - ``Single`` - - Returns the only matching document, and throws an exception if there is not exactly one document - - * - ``SingleOrDefault`` - - Returns a single matching document or ``null`` if no documents match - - * - ``Skip`` - - Skips over a specified number of documents and returns the rest of the results - - * - ``Sum`` - - Returns the sum of the values in a specified field - - * - ``Take`` - - Specifies the number of results to return - - * - ``Where`` - - Returns all documents that match your specified criteria - -Bitwise Operators -~~~~~~~~~~~~~~~~~ - -This section describes the :wikipedia:`bitwise operators ` -supported by the {+driver-short+} that you can use in an aggregation pipeline. -You can use multiple bitwise operators in the same -stage. The following guidelines apply when using bitwise operators: - -- All operands must be of type ``int`` or ``long``. - -- ``$bitAnd``, ``$bitOr``, and ``$bitXor`` take two or more operands. ``$bitNot`` takes one operand. - -- Bitwise operations are evaluated from left to right. - -The examples in this section use the following documents in a collection called -``ingredients``: - -.. code-block:: json - - { "_id" : 1, "name" : "watermelon", "is_available" : 1, "is_cheap" : 1 }, - { "_id" : 2, "name" : "onions", "is_available" : 1, "is_cheap" : 0 }, - { "_id" : 3, "name" : "eggs", "is_available" : 0, "is_cheap" : 0 }, - { "_id" : 4, "name" : "potatoes", "is_available" : 1, "is_cheap" : 1 }, - { "_id" : 5, "name" : "pasta", "is_available" : 0, "is_cheap" : 1 }, - { "_id" : 6, "name" : "cheese", "is_available" : 1 } - -The ``"is_available"`` field represents if an ingredient is available. 
If this -field has a value of ``0``, the ingredient is not available. If it has a value -of ``1``, the ingredient is available. - -The ``"is_cheap"`` field represents if an ingredient is cheap. If this field has -a value of ``0``, the ingredient is not cheap. If it has a value of ``1``, the -ingredient is cheap. - -The following ``Ingredient`` class models the documents in the ``ingredients`` -collection: - -.. literalinclude:: /includes/fundamentals/code-examples/linq.cs - :language: csharp - :dedent: - :start-after: start-ingredient-model - :end-before: end-ingredient-model - -.. note:: Missing or Undefined Operands - - If the operands you pass to any bitwise operator are of type `nullable `__ - ``int`` or ``long`` and contain a missing or undefined value, the entire expression - evaluates to ``null``. If the operands are of type non-nullable ``int`` or - ``long`` and contain a missing or undefined value, the {+driver-short+} will - throw an error. - -$bitAnd -+++++++ - -The ``$bitAnd`` aggregation operator performs a bitwise AND operation on the given -arguments. You can use the ``$bitAnd`` operator by connecting two or more -clauses with a ``&`` character. - -The following example shows how to create a ``$bitAnd`` stage by using LINQ. The -code retrieves the document in which the ``Name`` field has the -value ``"watermelon"``. It then performs a bitwise AND operation on the values of the -``IsAvailable`` and ``IsCheap`` fields in this document. - -.. literalinclude:: /includes/fundamentals/code-examples/linq.cs - :language: csharp - :dedent: - :start-after: start-bitAnd-example - :end-before: end-bitAnd-example - -The preceding code returns ``1``, the result of the AND operation on the values -of the ``IsAvailable`` field (``1``) and the ``IsCheap`` field (``1``). - -The following example performs the same bitwise AND operation on all -documents in the collection: - -.. io-code-block:: - :copyable: true - - .. input:: /includes/fundamentals/code-examples/linq.cs - :language: csharp - :dedent: - :start-after: start-bitAnd-collection-example - :end-before: end-bitAnd-collection-example - - .. output:: - :language: json - :visible: false - - 1 - 0 - 0 - 1 - 0 - null - -The ``null`` result comes from the document where the ``Name`` field -has the value of ``"cheese"``. This document is missing an ``IsCheap`` field, so -the expression evaluates to ``null``. - -$bitOr -++++++ - -The ``$bitOr`` aggregation operator performs a bitwise OR operation on the given -arguments. You can use the ``$bitOr`` operator by connecting two or more -clauses with a ``|`` character. - -The following example shows how to create a ``$bitOr`` stage by using LINQ. The -code retrieves the document in which the ``Name`` field has the -value ``"onions"``. It then performs a bitwise OR operation on the values of the -``IsAvailable`` and ``IsCheap`` fields in this document. - -.. literalinclude:: /includes/fundamentals/code-examples/linq.cs - :language: csharp - :dedent: - :start-after: start-bitOr-example - :end-before: end-bitOr-example - -The preceding code returns ``1``, the result of the OR operation on the values -of the ``IsAvailable`` field (``1``) and the ``IsCheap`` field (``0``). - -$bitNot -+++++++ - -The ``$bitNot`` aggregation operator performs a bitwise NOT operation on the given -argument. You can use the ``$bitNot`` operator by preceding an -operand with a ``~`` character. ``$bitNot`` only takes one argument. The -following example shows how to create a ``$bitNot`` stage by using LINQ: - -.. 
io-code-block:: - :copyable: true - - .. input:: /includes/fundamentals/code-examples/linq.cs - :language: csharp - :dedent: - :start-after: start-bitNot-example - :end-before: end-bitNot-example - - .. output:: - :language: json - :visible: false - - -2 - -1 - -1 - -2 - -2 - null - -$bitXor -+++++++ - -The ``$bitXor`` aggregation operator performs a bitwise XOR operation on the given -arguments. You can use the ``$bitXor`` operator by connecting two or more -clauses with a ``^`` character. - -The following example shows how to create a ``$bitXor`` stage by using LINQ. The -code retrieves the documents in which the ``Name`` field has -the value ``"watermelon"`` or ``"onions"``. It then performs a bitwise XOR -operation on the values of the ``IsAvailable`` and ``IsCheap`` fields in these -documents. - -.. literalinclude:: /includes/fundamentals/code-examples/linq.cs - :language: csharp - :dedent: - :start-after: start-bitXor-example - :end-before: end-bitXor-example - -The result contains the following values: - -.. code-block:: json - - 0 - 1 \ No newline at end of file diff --git a/source/aggregation/stages.txt b/source/aggregation/stages.txt index 7ca77406..47e292d8 100644 --- a/source/aggregation/stages.txt +++ b/source/aggregation/stages.txt @@ -1,4 +1,75 @@ +.. _csharp-aggregation-stages: +================== +Aggregation Stages +================== + +.. facet:: + :name: genre + :values: reference + +.. meta:: + :keywords: dotnet, code example, transform, pipeline + +.. contents:: On this page + :local: + :backlinks: none + :depth: 2 + :class: singlecol + +Overview +-------- + +The {+driver-short+} provides type-safe methods that you can use to create an +aggregation pipeline. On this page, you can learn how to use these methods. + +Build an Aggregation Pipeline +----------------------------- + +The following sections describe the different ways to build an aggregation +pipeline by using the {+driver-long+}. + +Builders +~~~~~~~~ + +You can create an aggregation pipeline in the following ways: + +- Create an ``EmptyPipelineDefinition`` object and chain calls to the relevant + aggregation methods. Then, pass the pipeline object to the ``IMongoCollection.Aggregate()`` + method. as shown in the following example: + +- Chain aggregation methods directly from the call to the + ``IMongoCollection.Aggregate()`` method. + +Select the :guilabel:`EmptyPipelineDefinition` +or :guilabel:`Aggregate` tab to see the corresponding code: + +.. tabs:: + + .. tab:: EmptyPipelineDefinition + :tabid: empty-pipeline-definition + + .. code-block:: csharp + + // Defines the aggregation pipeline + var pipeline = new EmptyPipelineDefinition() + .Match(...) + .Group(...) + .Merge(...); + + // Executes the aggregation pipeline + var results = collection.Aggregate(pipeline).ToList(); + + .. tab:: Aggregate + :tabid: aggregate + + .. code-block:: csharp + + // Defines and executes the aggregation pipeline + var pipeline = collection.Aggregate() + .Match(...) + .Group(...) + .Merge(...); Aggregation Stage Methods ------------------------- @@ -15,92 +86,92 @@ method name. * - Stage - Description + - Builders Method - * - :ref:`Bucket() ` - + * - :manual:`$bucket ` - Categorizes incoming documents into groups, called buckets, based on a specified expression and bucket boundaries. - - * - :ref:`BucketAuto() ` - + - :ref:`Bucket() ` + + * - :manual:`$bucketAuto ` - Categorizes incoming documents into a specific number of groups, called buckets, based on a specified expression. 
Bucket boundaries are automatically determined in an attempt to evenly distribute the documents into the specified number of buckets. - - * - :ref:`ChangeStream() ` - + - :ref:`BucketAuto() ` + + * - :manual:`$changeStream ` - Returns a change stream cursor for the collection. This stage can occur only once in an aggregation pipeline and it must occur as the first stage. - - * - :ref:`ChangeStreamSplitLargeEvent() ` - + - :ref:`ChangeStream() ` + + * - :manual:`$changeStreamSplitLargeEvent ` - Splits large change stream events that exceed 16 MB into smaller fragments returned in a change stream cursor. - You can use $changeStreamSplitLargeEvent only in a $changeStream pipeline, and + You can use ``$changeStreamSplitLargeEvent`` only in a ``$changeStream`` pipeline, and it must be the final stage in the pipeline. + - :ref:`ChangeStreamSplitLargeEvent() ` - * - :ref:`Count() ` - + * - :manual:`$count ` - Returns a count of the number of documents at this stage of the aggregation pipeline. + - :ref:`Count() ` - * - :ref:`Densify() ` - + * - :manual:`$densify ` - Creates new documents in a sequence of documents where certain values in a field are missing. - - * - :ref:`Documents() ` + - :ref:`Densify() ` + * - :manual:`$documents ` - Returns literal documents from input expressions. + - :ref:`Documents() ` - * - :ref:`Facet() ` - + * - :manual:`$facet ` - Processes multiple aggregation pipelines within a single stage on the same set of input documents. Enables the creation of multi-faceted aggregations capable of characterizing data across multiple dimensions, or facets, in a single stage. + - :ref:`Facet() ` - * - :ref:`GraphLookup() ` - + * - :manual:`$graphLookup ` - Performs a recursive search on a collection. To each output document, adds a new array field that contains the traversal results of the recursive search for that document. + - :ref:`GraphLookup() ` - * - :ref:`Group() ` - + * - :manual:`$group ` - Groups input documents by a specified identifier expression and applies the accumulator expressions, if specified, to each group. Consumes all input documents and outputs one document per each distinct group. The output documents contain only the identifier field and, if specified, accumulated fields. + - :ref:`Group() ` - * - :ref:`Limit() ` - + * - :manual:`$limit ` - Passes the first *n* documents unmodified to the pipeline, where *n* is the specified limit. For each input document, outputs either one document (for the first *n* documents) or zero documents (after the first *n* documents). - - * - :ref:`Lookup() ` - + - :ref:`Limit() ` + + * - :manual:`$lookup ` - Performs a left outer join to another collection in the *same* database to filter in documents from the "joined" collection for processing. + - :ref:`Lookup() ` - * - :ref:`Match() ` - + * - :manual:`$match ` - Filters the document stream to allow only matching documents to pass unmodified into the next pipeline stage. For each input document, outputs either one document (a match) or zero documents (no match). + - :ref:`Match() ` - * - :ref:`Merge() ` - + * - :manual:`$merge ` - Writes the resulting documents of the aggregation pipeline to a collection. The stage can incorporate (insert new documents, merge documents, replace documents, keep existing @@ -108,26 +179,26 @@ method name. custom update pipeline) the results into an output collection. To use this stage, it must be the last stage in the pipeline. 
+ - :ref:`Merge() ` - * - :ref:`Out() ` - + * - :manual:`$out ` - Writes the resulting documents of the aggregation pipeline to a collection. To use this stage, it must be the last stage in the pipeline. + - :ref:`Out() ` - * - :ref:`Project() ` - + * - :manual:`$project ` - Reshapes each document in the stream, such as by adding new fields or removing existing fields. For each input document, outputs one document. - - * - :ref:`RankFusion() ` + - :ref:`Project() ` + * - :manual:`$rankFusion ` - Uses a rank fusion algorithm to combine results from a Vector Search query and an Atlas Search query. - - * - :ref:`ReplaceRoot() ` + - :ref:`RankFusion() ` + * - :manual:`$replaceRoot ` - Replaces a document with the specified embedded document. The operation replaces all existing fields in the input document, including the ``_id`` field. Specify a document embedded in @@ -135,23 +206,23 @@ method name. top level. The ``$replaceWith`` stage is an alias for the ``$replaceRoot`` stage. + - :ref:`ReplaceRoot() ` - * - :ref:`ReplaceWith() ` - + * - :manual:`$replaceWith ` - Replaces a document with the specified embedded document. The operation replaces all existing fields in the input document, including the ``_id`` field. Specify a document embedded in the input document to promote the embedded document to the top level. The ``$replaceWith`` stage is an alias for the ``$replaceRoot`` stage. + - :ref:`ReplaceWith() ` - * - :ref:`Sample() ` - + * - :manual:`$sample ` - Randomly selects the specified number of documents from its input. + - :ref:`Sample() ` - * - :ref:`Search() ` - + * - :manual:`$search ` - Performs a full-text search of the field or fields in an :atlas:`Atlas ` collection. @@ -160,9 +231,9 @@ method name. available for self-managed deployments. To learn more, see :atlas:`Atlas Search Aggregation Pipeline Stages ` in the Atlas documentation. + - :ref:`Search() ` - * - :ref:`SearchMeta() ` - + * - :manual:`$searchMeta ` - Returns different types of metadata result documents for the :atlas:`Atlas Search ` query against an :atlas:`Atlas ` @@ -172,60 +243,59 @@ method name. and is not available for self-managed deployments. To learn more, see :atlas:`Atlas Search Aggregation Pipeline Stages ` in the Atlas documentation. + - :ref:`SearchMeta() ` - * - :ref:`Set() ` - + * - :manual:`$set ` - Adds new fields to documents. Like the ``Project()`` method, this method reshapes each document in the stream by adding new fields to output documents that contain both the existing fields from the input documents and the newly added fields. + - :ref:`Set() ` - * - :ref:`SetWindowFields() ` - + * - :manual:`$setWindowFields ` - Groups documents into windows and applies one or more operators to the documents in each window. + - :ref:`SetWindowFields() ` - .. versionadded:: 5.0 - - * - :ref:`Skip() ` - + * - :manual:`$skip ` - Skips the first *n* documents, where *n* is the specified skip number, and passes the remaining documents unmodified to the pipeline. For each input document, outputs either zero documents (for the first *n* documents) or one document (if after the first *n* documents). + - :ref:`Skip() ` - * - :ref:`Sort() ` - + * - :manual:`$sort ` - Reorders the document stream by a specified sort key. The documents remain unmodified. For each input document, outputs one document. + - :ref:`Sort() ` - * - :ref:`SortByCount() ` - + * - :manual:`$sortByCount ` - Groups incoming documents based on the value of a specified expression, then computes the count of documents in each distinct group. 
+ - :ref:`SortByCount() ` - * - :ref:`UnionWith() ` - + * - :manual:`$unionWith ` - Combines pipeline results from two collections into a single result set. + - :ref:`UnionWith() ` - * - :ref:`Unwind() ` - + * - :manual:`$unwind ` - Deconstructs an array field from the input documents to output a document for *each* element. Each output document replaces the array with an element value. For each input document, outputs *n* Documents, where *n* is the number of array elements. *n* can be zero for an empty array. + - :ref:`Unwind() ` - * - :ref:`VectorSearch() ` - + * - :manual:`$vectorSearch ` - Performs an :abbr:`ANN (Approximate Nearest Neighbor)` or :abbr:`ENN (Exact Nearest Neighbor)` search on a vector in the specified field of an :atlas:`Atlas ` collection. + - :ref:`VectorSearch() ` You can add stages to your pipeline that don't have corresponding type-safe methods in the ``PipelineDefinitionBuilder`` interface by providing your query From b1ab3a62bebf90c24f80a9f250bc2c024764c3a4 Mon Sep 17 00:00:00 2001 From: Mike Woofter <108414937+mongoKart@users.noreply.github.com> Date: Mon, 14 Apr 2025 20:56:48 -0500 Subject: [PATCH 06/17] first draft --- source/aggregation/stages.txt | 90 +++++++++++++++++++++++------------ 1 file changed, 59 insertions(+), 31 deletions(-) diff --git a/source/aggregation/stages.txt b/source/aggregation/stages.txt index 47e292d8..f481d150 100644 --- a/source/aggregation/stages.txt +++ b/source/aggregation/stages.txt @@ -20,29 +20,32 @@ Aggregation Stages Overview -------- -The {+driver-short+} provides type-safe methods that you can use to create an -aggregation pipeline. On this page, you can learn how to use these methods. +On this page, you can learn how to create an aggregation pipeline and add stages to it +by using methods in the {+driver-short+}. Build an Aggregation Pipeline ----------------------------- -The following sections describe the different ways to build an aggregation -pipeline by using the {+driver-long+}. +You can use the {+driver-short+} to build an aggregation pipeline by using builders +methods or BSON documents. See the following sections to learn more about each of these +approaches. + +.. _csharp-aggregation-stages-builders: Builders ~~~~~~~~ -You can create an aggregation pipeline in the following ways: - -- Create an ``EmptyPipelineDefinition`` object and chain calls to the relevant - aggregation methods. Then, pass the pipeline object to the ``IMongoCollection.Aggregate()`` - method. as shown in the following example: +You can build a type-safe aggregation pipeline in the following ways: -- Chain aggregation methods directly from the call to the +- Construct an ``EmptyPipelineDefinition`` object. Chain calls from this object + to the relevant aggregation methods. Then, pass the pipeline object to the ``IMongoCollection.Aggregate()`` method. +- Call the ``IMongoCollection.Aggregate()`` method. Chain calls from this + method call to the relevant aggregation methods. + Select the :guilabel:`EmptyPipelineDefinition` -or :guilabel:`Aggregate` tab to see the corresponding code: +or :guilabel:`Aggregate` tab to see the corresponding code for each approach: .. tabs:: @@ -66,25 +69,52 @@ or :guilabel:`Aggregate` tab to see the corresponding code: .. code-block:: csharp // Defines and executes the aggregation pipeline - var pipeline = collection.Aggregate() + var results = collection.Aggregate() .Match(...) .Group(...) .Merge(...); +.. 
_csharp-aggregation-stages-bsondocument: + +BsonDocument +~~~~~~~~~~~~ + +Some aggregation stages don't have corresponding methods in the {+driver-short+}. +To add these stages to your pipeline, use ``BsonDocument`` objects or string literals +to construct a stage in the Query API syntax. Then, pass the BSON document to the +`PipelineDefinitionBuilder.AppendStage() <{+new-api-root+}/MongoDB.Driver/MongoDB.Driver.PipelineDefinitionBuilder.AppendStage.html>`__ +method. This syntax supports all stages in the aggregation pipeline, but doesn't provide +type hints or type safety to your code. + +The following code example shows how to add ``$unset``, an aggregation +stage without a corresponding method, to an empty aggregation pipeline: + +.. code-block:: csharp + + var pipeline = new EmptyPipelineDefinition().AppendStage("{ $unset: "field1" }"); + +.. note:: + + If you use a ``BsonDocument`` to define a pipeline stage, the driver doesn't + recognize any ``BsonClassMap`` attributes, serialization attributes, or + serialization conventions. The field names that you use in the ``BsonDocument`` must + match the field names stored in {+mdb-server+}. + Aggregation Stage Methods ------------------------- The following table lists the builders methods in the {+driver-short+} that correspond -to stages in the aggregation pipeline. Because each of these methods returns a -``PipelineDefinition`` object, you can chain method calls together. -For more information about a method, click the -method name. +to stages in the aggregation pipeline. For more information about an aggregation stage, +click the stage name. For more information about a builders method, click the +method name. If an aggregation stage isn't in the table, you must use the +:ref:`BsonDocument ` syntax to add the stage +to your pipeline. .. list-table:: :header-rows: 1 :widths: 20 80 - * - Stage + * - Aggregation Stage - Description - Builders Method @@ -297,21 +327,19 @@ method name. :atlas:`Atlas ` collection. - :ref:`VectorSearch() ` -You can add stages to your pipeline that don't have corresponding type-safe -methods in the ``PipelineDefinitionBuilder`` interface by providing your query -as a ``BsonDocument`` to the `AppendStage() method -<{+new-api-root+}/MongoDB.Driver/MongoDB.Driver.PipelineDefinitionBuilder.AppendStage.html>`__. - -.. code-block:: csharp +API Documentation +----------------- - var pipeline = new EmptyPipelineDefinition().AppendStage("{ $set: { field1: '$field2' } }"); +To learn more about assembling an aggregation pipeline, see +:manual:`Aggregation Pipeline `. -.. note:: +To learn more about creating pipeline stages, see +:manual:`Aggregation Stages `. - When using a ``BsonDocument`` to define your pipeline stage, the driver does - not take into account any ``BsonClassMap``, serialization attributes or - serialization conventions. The field names used in the ``BsonDocument`` must - match those stored on the server. +For more information about the methods and classes used on this page, see the +following API documentation: - For more information on providing a query as a ``BsonDocument``, see our - :ref:`FAQ page `. 
\ No newline at end of file +- `Aggregate() <{+new-api-root+}/MongoDB.Driver/MongoDB.Driver.IMongoCollection-1.Aggregate.html>`__ +- `AggregateOptions <{+new-api-root+}/MongoDB.Driver/MongoDB.Driver.AggregateOptions.html>`__ +- `EmptyPipelineDefinition <{+new-api-root+}/MongoDB.Driver/MongoDB.Driver.EmptyPipelineDefinition-1.-ctor.html>`__ +- `BsonDocument <{+new-api-root+}/MongoDB.Bson/MongoDB.Bson.BsonDocument.html>`__ \ No newline at end of file From e29ccda3da3c85109f29f344642467a9a7f1f26c Mon Sep 17 00:00:00 2001 From: Mike Woofter <108414937+mongoKart@users.noreply.github.com> Date: Tue, 15 Apr 2025 08:24:52 -0500 Subject: [PATCH 07/17] remove stage files --- source/aggregation/stages/bucket.txt | 60 ------ source/aggregation/stages/bucketAuto.txt | 51 ----- source/aggregation/stages/changeStream.txt | 20 -- .../changeStreamSplitLargeEvent copy.txt | 18 -- source/aggregation/stages/count.txt | 44 ----- source/aggregation/stages/densify.txt | 81 -------- source/aggregation/stages/documents.txt | 46 ----- source/aggregation/stages/facet.txt | 39 ---- source/aggregation/stages/graphLookup.txt | 60 ------ source/aggregation/stages/group.txt | 97 --------- source/aggregation/stages/limit.txt | 57 ------ source/aggregation/stages/lookup.txt | 146 -------------- source/aggregation/stages/match.txt | 78 -------- source/aggregation/stages/merge.txt | 49 ----- source/aggregation/stages/out.txt | 89 --------- source/aggregation/stages/project.txt | 95 --------- source/aggregation/stages/rankFusion.txt | 18 -- source/aggregation/stages/replaceRoot.txt | 34 ---- source/aggregation/stages/replaceWith.txt | 18 -- source/aggregation/stages/sample.txt | 60 ------ source/aggregation/stages/search.txt | 45 ----- source/aggregation/stages/searchMeta.txt | 46 ----- source/aggregation/stages/set.txt | 18 -- source/aggregation/stages/setWindowFields.txt | 44 ----- source/aggregation/stages/skip.txt | 61 ------ source/aggregation/stages/sort.txt | 87 -------- source/aggregation/stages/sortByCount.txt | 47 ----- source/aggregation/stages/unionWith.txt | 18 -- source/aggregation/stages/unwind.txt | 186 ------------------ source/aggregation/stages/vectorSearch.txt | 97 --------- 30 files changed, 1809 deletions(-) delete mode 100644 source/aggregation/stages/bucket.txt delete mode 100644 source/aggregation/stages/bucketAuto.txt delete mode 100644 source/aggregation/stages/changeStream.txt delete mode 100644 source/aggregation/stages/changeStreamSplitLargeEvent copy.txt delete mode 100644 source/aggregation/stages/count.txt delete mode 100644 source/aggregation/stages/densify.txt delete mode 100644 source/aggregation/stages/documents.txt delete mode 100644 source/aggregation/stages/facet.txt delete mode 100644 source/aggregation/stages/graphLookup.txt delete mode 100644 source/aggregation/stages/group.txt delete mode 100644 source/aggregation/stages/limit.txt delete mode 100644 source/aggregation/stages/lookup.txt delete mode 100644 source/aggregation/stages/match.txt delete mode 100644 source/aggregation/stages/merge.txt delete mode 100644 source/aggregation/stages/out.txt delete mode 100644 source/aggregation/stages/project.txt delete mode 100644 source/aggregation/stages/rankFusion.txt delete mode 100644 source/aggregation/stages/replaceRoot.txt delete mode 100644 source/aggregation/stages/replaceWith.txt delete mode 100644 source/aggregation/stages/sample.txt delete mode 100644 source/aggregation/stages/search.txt delete mode 100644 source/aggregation/stages/searchMeta.txt delete mode 100644 
source/aggregation/stages/set.txt delete mode 100644 source/aggregation/stages/setWindowFields.txt delete mode 100644 source/aggregation/stages/skip.txt delete mode 100644 source/aggregation/stages/sort.txt delete mode 100644 source/aggregation/stages/sortByCount.txt delete mode 100644 source/aggregation/stages/unionWith.txt delete mode 100644 source/aggregation/stages/unwind.txt delete mode 100644 source/aggregation/stages/vectorSearch.txt diff --git a/source/aggregation/stages/bucket.txt b/source/aggregation/stages/bucket.txt deleted file mode 100644 index 69bc0e2d..00000000 --- a/source/aggregation/stages/bucket.txt +++ /dev/null @@ -1,60 +0,0 @@ -.. _csharp-aggregation-bucket: - -====== -Bucket -====== - -.. facet:: - :name: genre - :values: reference - -.. meta:: - :keywords: dotnet, code example, transform, pipeline, group - -.. contents:: On this page - :local: - :backlinks: none - :depth: 2 - :class: singlecol - -Overview --------- - -Use the ``bucket()`` method to create a :manual:`$bucket ` -pipeline stage that automates the bucketing of data around predefined boundary -values. - -Example -------- - -The following example creates a pipeline stage that groups incoming documents based -on the value of their ``screenSize`` field, inclusive of the lower boundary -and exclusive of the upper boundary. - -.. literalinclude:: /includes/aggregation/Builders.cs - :start-after: // begin basicBucket - :end-before: // end basicBucket - :language: csharp - :dedent: - -Use the ``BucketOptions`` class to specify a default bucket for values -outside of the specified boundaries, and to specify additional accumulators. - -The following example creates a pipeline stage that groups incoming documents based -on the value of their ``screenSize`` field, counting the number of documents -that fall within each bucket, pushing the value of ``screenSize`` into a -field called ``matches``, and capturing any screen sizes greater than "70" -into a bucket called "monster" for monstrously large screen sizes: - -.. literalinclude:: /includes/aggregation/Builders.cs - :start-after: // begin bucketOptions - :end-before: // end bucketOptions - :language: csharp - :dedent: - -API Documentation ------------------ - -To learn more about the methods and types used on this page, see the following -API documentation: - diff --git a/source/aggregation/stages/bucketAuto.txt b/source/aggregation/stages/bucketAuto.txt deleted file mode 100644 index ac5fe169..00000000 --- a/source/aggregation/stages/bucketAuto.txt +++ /dev/null @@ -1,51 +0,0 @@ -.. _csharp-aggregation-bucketauto: - -========== -BucketAuto -========== - -.. facet:: - :name: genre - :values: reference - -.. meta:: - :keywords: code example, transform, pipeline - -.. contents:: On this page - :local: - :backlinks: none - :depth: 2 - :class: singlecol - - -BucketAuto ----------- - -Use the ``bucketAuto()`` method to create a :manual:`$bucketAuto ` -pipeline stage that automatically determines the boundaries of each bucket -in its attempt to distribute the documents evenly into a specified number of buckets. - -The following example creates a pipeline stage that will attempt to create and evenly -distribute documents into *10* buckets using the value of their ``price`` field: - -.. 
literalinclude:: /includes/aggregation/Builders.cs - :start-after: // begin bucketAutoBasic - :end-before: // end bucketAutoBasic - :language: csharp - :dedent: - - -Use the ``BucketAutoOptions`` class to specify a :wikipedia:`preferred number ` -based scheme to set boundary values, and specify additional accumulators. - -The following example creates a pipeline stage that will attempt to create and evenly -distribute documents into *10* buckets using the value of their ``price`` field, -setting the bucket boundaries at powers of 2 (2, 4, 8, 16, ...). It also counts -the number of documents in each bucket, and calculates their average ``price`` -in a new field called ``avgPrice``: - -.. literalinclude:: /includes/aggregation/Builders.cs - :start-after: // begin bucketAutoOptions - :end-before: // end bucketAutoOptions - :language: csharp - :dedent: \ No newline at end of file diff --git a/source/aggregation/stages/changeStream.txt b/source/aggregation/stages/changeStream.txt deleted file mode 100644 index a62a9842..00000000 --- a/source/aggregation/stages/changeStream.txt +++ /dev/null @@ -1,20 +0,0 @@ -.. _csharp-aggregation-changestream: - -============ -ChangeStream -============ - -.. facet:: - :name: genre - :values: reference - -.. meta:: - :keywords: code example, transform, pipeline - -.. contents:: On this page - :local: - :backlinks: none - :depth: 2 - :class: singlecol -Appends a $changeStream stage to the pipeline. Normally you would prefer to use the Watch method of IMongoCollection. Only use this method if subsequent stages project away the resume token (the _id) or you don't want the resulting cursor to automatically resume. -Returns a Change Stream cursor on a collection, a database, or an entire cluster. Must be used as the first stage in an aggregation pipeline. \ No newline at end of file diff --git a/source/aggregation/stages/changeStreamSplitLargeEvent copy.txt b/source/aggregation/stages/changeStreamSplitLargeEvent copy.txt deleted file mode 100644 index 528bca02..00000000 --- a/source/aggregation/stages/changeStreamSplitLargeEvent copy.txt +++ /dev/null @@ -1,18 +0,0 @@ -.. _csharp-aggregation-changestreamsplitlargeevent: - -=========================== -changeStreamSplitLargeEvent -=========================== - -.. facet:: - :name: genre - :values: reference - -.. meta:: - :keywords: code example, transform, pipeline - -.. contents:: On this page - :local: - :backlinks: none - :depth: 2 - :class: singlecol \ No newline at end of file diff --git a/source/aggregation/stages/count.txt b/source/aggregation/stages/count.txt deleted file mode 100644 index 1fbf6113..00000000 --- a/source/aggregation/stages/count.txt +++ /dev/null @@ -1,44 +0,0 @@ -.. _csharp-aggregation-count: - -===== -Count -===== - -.. facet:: - :name: genre - :values: reference - -.. meta:: - :keywords: code example, transform, pipeline - -.. contents:: On this page - :local: - :backlinks: none - :depth: 2 - :class: singlecol - - -Count ------ - -Use the ``count()`` method to create a :manual:`$count ` -pipeline stage that counts the number of documents that enter the stage, and assigns -that value to a specified field name. If you do not specify a field, -``count()`` defaults the field name to "count". - -.. tip:: - - The ``$count`` stage is syntactic sugar for: - - .. code-block:: json - - { "$group":{ "_id": 0, "count": { "$sum" : 1 } } } - -The following example creates a pipeline stage that outputs the count of incoming -documents in a field called "total": - -.. 
literalinclude:: /includes/aggregation/Builders.cs - :start-after: // begin count - :end-before: // end count - :language: csharp - :dedent: \ No newline at end of file diff --git a/source/aggregation/stages/densify.txt b/source/aggregation/stages/densify.txt deleted file mode 100644 index 39508fc9..00000000 --- a/source/aggregation/stages/densify.txt +++ /dev/null @@ -1,81 +0,0 @@ -.. _csharp-aggregation-densify: - -======= -Densify -======= - -.. facet:: - :name: genre - :values: reference - -.. meta:: - :keywords: code example, transform, pipeline - -.. contents:: On this page - :local: - :backlinks: none - :depth: 2 - :class: singlecol - - -Densify -------- - -Use the ``densify()`` method to create a -:manual:`$densify ` pipeline -stage that generates a sequence of documents to span a specified interval. -Creates new documents in a sequence of documents where certain values in a field are missing. - -You can use $densify to: - - Fill gaps in time series data. - - Add missing values between groups of data. - - Populate your data with a specified range of values. -.. tip:: - - You can use the ``$densify()`` aggregation stage only when running - MongoDB v5.1 or later. - -Consider the following documents retrieved from the :atlas:`Atlas sample weather dataset ` -that contain measurements for a similar ``position`` field, spaced one hour -apart: - -.. code-block:: none - :copyable: false - - Document{{ _id=5553a..., position=Document{{type=Point, coordinates=[-47.9, 47.6]}}, ts=Mon Mar 05 08:00:00 EST 1984, ... }} - Document{{ _id=5553b..., position=Document{{type=Point, coordinates=[-47.9, 47.6]}}, ts=Mon Mar 05 09:00:00 EST 1984, ... }} - -Suppose you needed to create a pipeline stage that performs the following -actions on these documents: - -- Add a document at every 15-minute interval for which a ``ts`` value does not - already exist. -- Group the documents by the ``position`` field. - -The call to the ``densify()`` aggregation stage builder that accomplishes -these actions resembles the following: - -.. literalinclude:: /includes/fundamentals/code-snippets/builders/AggregateDensify.java - :start-after: // begin densify aggregate - :end-before: // end densify aggregate - :language: csharp - :dedent: - -The following output highlights the documents generated by the aggregate stage -which contain ``ts`` values every 15 minutes between the existing documents: - -.. code-block:: none - :emphasize-lines: 2-4 - :copyable: false - - Document{{ _id=5553a..., position=Document{{type=Point, coordinates=[-47.9, 47.6]}}, ts=Mon Mar 05 08:00:00 EST 1984, ... }} - Document{{ position=Document{{coordinates=[-47.9, 47.6]}}, ts=Mon Mar 05 08:15:00 EST 1984 }} - Document{{ position=Document{{coordinates=[-47.9, 47.6]}}, ts=Mon Mar 05 08:30:00 EST 1984 }} - Document{{ position=Document{{coordinates=[-47.9, 47.6]}}, ts=Mon Mar 05 08:45:00 EST 1984 }} - Document{{ _id=5553b..., position=Document{{type=Point, coordinates=[-47.9, 47.6]}}, ts=Mon Mar 05 09:00:00 EST 1984, ... }} - -See the `densify package API documentation <{+core-api+}/client/model/densify/package-summary.html>`__ -for more information. diff --git a/source/aggregation/stages/documents.txt b/source/aggregation/stages/documents.txt deleted file mode 100644 index 6c491ff4..00000000 --- a/source/aggregation/stages/documents.txt +++ /dev/null @@ -1,46 +0,0 @@ -.. _csharp-aggregation-documents: - -========= -Documents -========= - -.. facet:: - :name: genre - :values: reference - -.. meta:: - :keywords: code example, transform, pipeline - -.. 
contents:: On this page - :local: - :backlinks: none - :depth: 2 - :class: singlecol - - -Documents ---------- - -Use the ``documents()`` method to create a -:manual:`$documents ` -pipeline stage that returns literal documents from input values. - -.. important:: - - If you use a ``$documents`` stage in an aggregation pipeline, it must be the first - stage in the pipeline. - -The following example creates a pipeline stage that creates -sample documents with a ``title`` field: - -.. literalinclude:: /includes/aggregation/Builders.cs - :start-after: // begin documents - :end-before: // end documents - :language: csharp - :dedent: - -.. important:: - - If you use the ``documents()`` method to provide the input to an aggregation pipeline, - you must call the ``aggregate()`` method on a database instead of on a - collection. \ No newline at end of file diff --git a/source/aggregation/stages/facet.txt b/source/aggregation/stages/facet.txt deleted file mode 100644 index 3e711a28..00000000 --- a/source/aggregation/stages/facet.txt +++ /dev/null @@ -1,39 +0,0 @@ -.. _csharp-aggregation-facet: - -===== -Facet -===== - -.. facet:: - :name: genre - :values: reference - -.. meta:: - :keywords: code example, transform, pipeline - -.. contents:: On this page - :local: - :backlinks: none - :depth: 2 - :class: singlecol - - -Facet ------ - -Use the ``facet()`` method to create a :manual:`$facet ` -pipeline stage that allows for the definition of parallel pipelines. - -The following example creates a pipeline stage that executes two parallel aggregations: - -- The first aggregation distributes incoming documents into 5 groups according to - their ``attributes.screen_size`` field. - -- The second aggregation counts all *manufacturers* and returns their count, limited - to the top **5**. - -.. literalinclude:: /includes/aggregation/Builders.cs - :start-after: // begin facet - :end-before: // end facet - :language: csharp - :dedent: \ No newline at end of file diff --git a/source/aggregation/stages/graphLookup.txt b/source/aggregation/stages/graphLookup.txt deleted file mode 100644 index c461e40d..00000000 --- a/source/aggregation/stages/graphLookup.txt +++ /dev/null @@ -1,60 +0,0 @@ -.. _csharp-aggregation-graphlookup: - -=========== -GraphLookup -=========== - -.. facet:: - :name: genre - :values: reference - -.. meta:: - :keywords: code example, transform, pipeline - -.. contents:: On this page - :local: - :backlinks: none - :depth: 2 - :class: singlecol - - -GraphLookup ------------ - -Use the ``graphLookup()`` method to create a :manual:`$graphLookup ` -pipeline stage that performs a recursive search on a specified collection to match -a specified field in one document to a specified field of another document. - -The following example computes the social network graph for users in the -``contacts`` collection, recursively matching the value in the ``friends`` field -to the ``name`` field: - -.. literalinclude:: /includes/aggregation/Builders.cs - :start-after: // begin graphLookupBasic - :end-before: // end graphLookupBasic - :language: csharp - :dedent: - -Using ``GraphLookupOptions``, you can specify the depth to recurse as well as -the name of the depth field, if desired. In this example, ``$graphLookup`` will -recurse up to two times, and create a field called ``degrees`` with the -recursion depth information for every document. - -.. 
literalinclude:: /includes/aggregation/Builders.cs - :start-after: // begin graphLookupDepth - :end-before: // end graphLookupDepth - :language: csharp - :dedent: - -Using ``GraphLookupOptions``, you can specify a filter that documents must match -in order for MongoDB to include them in your search. In this -example, only links with "golf" in their ``hobbies`` field will be included. - -.. literalinclude:: /includes/aggregation/Builders.cs - :start-after: // begin graphLookupMatch - :end-before: // end graphLookupMatch - :language: csharp - :dedent: - -- The :manual:`$graphLookup ` stage has - a strict memory limit of 100 megabytes and ignores the ``AllowDiskUse`` property. \ No newline at end of file diff --git a/source/aggregation/stages/group.txt b/source/aggregation/stages/group.txt deleted file mode 100644 index b4d9804a..00000000 --- a/source/aggregation/stages/group.txt +++ /dev/null @@ -1,97 +0,0 @@ -.. _csharp-aggregation-group: - -===== -Group -===== - -.. facet:: - :name: genre - :values: reference - -.. meta:: - :keywords: code example, transform, pipeline - -.. contents:: On this page - :local: - :backlinks: none - :depth: 2 - :class: singlecol - - -Group ------ - -Use the ``group()`` method to create a :manual:`$group ` -pipeline stage to group documents by a specified expression and output a document -for each distinct grouping. - -.. tip:: - - The driver includes the `Accumulators <{+core-api+}/client/model/Accumulators.html>`__ - class with static factory methods for each of the supported accumulators. - -The following example creates a pipeline stage that groups documents by the value -of the ``customerId`` field. Each group accumulates the sum and average -of the values of the ``quantity`` field into the ``totalQuantity`` and -``averageQuantity`` fields. - -.. literalinclude:: /includes/aggregation/Builders.cs - :start-after: // begin group - :end-before: // end group - :language: csharp - :dedent: - -Learn more about accumulator operators from the Server manual section -on :manual:`Accumulators `. - - -$group -~~~~~~ - -The ``$group`` aggregation stage separates documents into groups according to -the criteria you specify. - -Select the :guilabel:`Method Syntax` or :guilabel:`Query Syntax` tab to see how -to generate an ``$group`` stage using LINQ: - -.. tabs:: - - .. tab:: Method Syntax - :tabid: method-syntax - - .. code-block:: csharp - :emphasize-lines: 2 - - var query = queryableCollection - .GroupBy(r => r.Cuisine) - .Select(g => new { Cuisine = g.Key, Count = g.Count() }); - - .. tab:: Query Syntax - :tabid: query-syntax - - .. code-block:: csharp - :emphasize-lines: 2 - - var query = from r in queryableCollection - group r by r.Cuisine into g - select new {Cuisine = g.Key, Count = g.Count()}; - -The preceding example groups each document by the value in its ``Cuisine`` field, -then counts how many documents have each ``Cuisine`` value. The result contains -the following documents: - -.. code-block:: json - - // Results Truncated - - { "cuisine" : "Caribbean", "count" : 657 } - { "cuisine" : "Café/Coffee/Tea", "count" : 1214 } - { "cuisine" : "Iranian", "count" : 2 } - { "cuisine" : "Nuts/Confectionary", "count" : 6 } - { "cuisine" : "Middle Eastern", "count" : 168 } - ... - -.. note:: Result Order - - The preceding queries don't always return results in the same order. Running - this example may return the results in a different order than shown above. 
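If you need the groups in a stable order, one option is to append a sort to the projected result. The following is a minimal sketch that reuses the ``queryableCollection`` from the preceding example; the driver translates the ``OrderBy()`` call into a ``$sort`` stage:

.. code-block:: csharp

   // Groups by cuisine, counts each group, then sorts the groups by key so
   // that repeated runs return the results in the same order.
   var query = queryableCollection
       .GroupBy(r => r.Cuisine)
       .Select(g => new { Cuisine = g.Key, Count = g.Count() })
       .OrderBy(x => x.Cuisine);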
\ No newline at end of file diff --git a/source/aggregation/stages/limit.txt b/source/aggregation/stages/limit.txt deleted file mode 100644 index b737744d..00000000 --- a/source/aggregation/stages/limit.txt +++ /dev/null @@ -1,57 +0,0 @@ -.. _csharp-aggregation-limit: - -===== -Limit -===== - -.. facet:: - :name: genre - :values: reference - -.. meta:: - :keywords: code example, transform, pipeline - -.. contents:: On this page - :local: - :backlinks: none - :depth: 2 - :class: singlecol - - -Limit ------ - -Use the :manual:`$limit ` pipeline stage -to limit the number of documents passed to the next stage. - -The following example creates a pipeline stage that limits the number of documents to ``10``: - -.. literalinclude:: /includes/aggregation/Builders.cs - :start-after: // begin limit - :end-before: // end limit - :language: csharp - :dedent: - -$limit -~~~~~~ - -The ``$limit`` aggregation stage limits the number of documents returned by the -query. The following example shows how to generate a ``$limit`` stage using LINQ: - -.. code-block:: csharp - :emphasize-lines: 4 - - var query = queryableCollection - .Where(r => r.Cuisine == "Italian") - .Select(r => new {r.Name, r.Cuisine}) - .Take(5); - -The result of the preceding example contains the following documents: - -.. code-block:: json - - { "name" : "Philadelhia Grille Express", "cuisine" : "Italian" } - { "name" : "Isle Of Capri Resturant", "cuisine" : "Italian" } - { "name" : "Marchis Restaurant", "cuisine" : "Italian" } - { "name" : "Crystal Room", "cuisine" : "Italian" } - { "name" : "Forlinis Restaurant", "cuisine" : "Italian" } \ No newline at end of file diff --git a/source/aggregation/stages/lookup.txt b/source/aggregation/stages/lookup.txt deleted file mode 100644 index f9be046b..00000000 --- a/source/aggregation/stages/lookup.txt +++ /dev/null @@ -1,146 +0,0 @@ -.. _csharp-aggregation-lookup: - -====== -Lookup -====== - -.. facet:: - :name: genre - :values: reference - -.. meta:: - :keywords: code example, transform, pipeline - -.. contents:: On this page - :local: - :backlinks: none - :depth: 2 - :class: singlecol - - -Lookup ------- - -Use the ``lookup()`` method to create a :manual:`$lookup ` -pipeline stage to perform joins and uncorrelated subqueries between two collections. - -Left Outer Join -~~~~~~~~~~~~~~~ - -The following example creates a pipeline stage that performs a left outer -join between the ``movies`` and ``comments`` collections: - -- It joins the ``_id`` field from ``movies`` to the ``movie_id`` field in ``comments`` -- It outputs the results in the ``joined_comments`` field: - -.. literalinclude:: /includes/aggregation/Builders.cs - :start-after: // begin basic lookup - :end-before: // end basic lookup - :language: csharp - :dedent: - -Full Join and Uncorrelated Subqueries -~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ - -The following example creates a pipeline stage that joins two collections, ``orders`` -and ``warehouses``, by the item and whether the available quantity is enough -to fulfill the ordered quantity: - -.. literalinclude:: /includes/aggregation/Builders.cs - :start-after: // begin advanced lookup - :end-before: // end advanced lookup - :language: csharp - :dedent: - -$lookup -~~~~~~~ - -The ``$lookup`` aggregation stage joins documents from one collection to documents -from another collection in the same database. The ``$lookup`` stage adds a new -array field to each input document. The new array field contains the matching -documents from the "joined" collection. - -.. 
note:: - - To perform a lookup, you must make both collections queryable by using the - ``AsQueryable()`` method. - - To learn how to make a collection queryable, see :ref:`csharp-linq-queryable`. - -Consider a second collection in the ``sample_restaurants`` database called -``reviews`` that has restaurant reviews. You can join documents from that collection -to documents with the same ``name`` value in the ``restaurants`` collection using -the ``$lookup`` stage. - -The following ``Review`` class models the documents in the ``reviews`` collection: - -.. literalinclude:: /includes/fundamentals/code-examples/linq.cs - :language: csharp - :dedent: - :start-after: start-review-model - :end-before: end-review-model - -Select the :guilabel:`Method Syntax` or :guilabel:`Query Syntax` tab to see how -to generate a ``$lookup`` stage by using LINQ: - -.. tabs:: - - .. tab:: Method Syntax - :tabid: method-syntax - - .. code-block:: csharp - - var query = queryableCollection - .GroupJoin(reviewCollection, - restaurant => restaurant.Name, - review => review.RestaurantName, - (restaurant, reviews) => - new { Restaurant = restaurant, Reviews = reviews } - ); - - .. tab:: Query Syntax - :tabid: query-syntax - - .. code-block:: csharp - - var query = from restaurant in queryableCollection - join rv in reviewCollection on restaurant.Name equals rv.RestaurantName into reviews - select new { restaurant, reviews }; - -The preceding example returns all documents from the ``restaurants`` collection. Each -restaurant document has an added field called ``reviews``, which contains all -reviews for that restaurant. A review matches a restaurant if the value of the -``name`` field in the review document matches the ``name`` field of the restaurant -document. - -The following shows a subset of the returned results: - -.. code-block:: json - - // Results Truncated - - { - "restaurant": { - "_id": ObjectId("..."), - "name": "The Movable Feast", - "restaurant_id": "40361606", - "cuisine": "American", - "address": { ... }, - "borough": "Brooklyn", - "grades": [ ... ] - }, - "reviews": [ - { - "_id": ObjectId("..."), - "restaurant_name": "The Movable Feast", - "reviewer": "Lazlo Cravensworth", - "review_text": "Great restaurant! 12/10 stars!" - }, - { - "_id": ObjectId("..."), - "restaurant_name": "The Movable Feast", - "reviewer": "Michael Scarn", - "review_text": "It really was a feast" - } - ] - } \ No newline at end of file diff --git a/source/aggregation/stages/match.txt b/source/aggregation/stages/match.txt deleted file mode 100644 index bf6755f4..00000000 --- a/source/aggregation/stages/match.txt +++ /dev/null @@ -1,78 +0,0 @@ -.. _csharp-aggregation-match: - -===== -Match -===== - -.. facet:: - :name: genre - :values: reference - -.. meta:: - :keywords: code example, transform, pipeline - -.. contents:: On this page - :local: - :backlinks: none - :depth: 2 - :class: singlecol - -Match ------ - -Use the ``match()`` method to create a :manual:`$match ` -pipeline stage that matches incoming documents against the specified -query filter, filtering out documents that do not match. - -.. tip:: - - The filter can be an instance of any class that implements ``Bson``, but it's - convenient to combine with use of the :ref:`Filters ` class. - -The following example creates a pipeline stage that matches all documents where the -``title`` field is equal to "The Shawshank Redemption": - -.. 
literalinclude:: /includes/aggregation/Builders.cs - :start-after: begin match - :end-before: end match - :language: csharp - :dedent: - - -$match -~~~~~~ - -The ``$match`` aggregation stage returns the documents that match a specified -criteria. - -Select the :guilabel:`Method Syntax` or :guilabel:`Query Syntax` tab to see how -to generate a ``$match`` stage using LINQ: - -.. tabs:: - - .. tab:: Method Syntax - :tabid: method-syntax - - .. code-block:: csharp - :emphasize-lines: 2 - - var query = queryableCollection - .Where(r => r.Name == "The Movable Feast"); - - .. tab:: Query Syntax - :tabid: query-syntax - - .. code-block:: csharp - :emphasize-lines: 2 - - var query = from r in queryableCollection - where r.Name == "The Movable Feast" - select r; - -The result of the preceding example contains the following document: - -.. code-block:: json - - // Results Truncated - - { "_id" : ObjectId(...), "name" : "The Movable Feast", "restaurant_id" : "40361606", "cuisine" : "American", "address" : {...}, "borough" : "Brooklyn", "grades" : [...] } diff --git a/source/aggregation/stages/merge.txt b/source/aggregation/stages/merge.txt deleted file mode 100644 index 90a54393..00000000 --- a/source/aggregation/stages/merge.txt +++ /dev/null @@ -1,49 +0,0 @@ -.. _csharp-aggregation-merge: - -===== -Merge -===== - -.. facet:: - :name: genre - :values: reference - -.. meta:: - :keywords: code example, transform, pipeline - -.. contents:: On this page - :local: - :backlinks: none - :depth: 2 - :class: singlecol - - -Merge ------ - -Use the ``merge()`` method to create a :manual:`$merge ` -pipeline stage that merges all documents into the specified collection. - -.. important:: - - The ``$merge`` stage must be the last stage in any aggregation pipeline. - -The following example merges the pipeline into the ``authors`` collection using the default -options: - -.. literalinclude:: /includes/aggregation/Builders.cs - :start-after: // begin mergeStage - :end-before: // end mergeStage - :language: csharp - :dedent: - -The following example merges the pipeline into the ``customers`` collection in the -``reporting`` database using some options that specify to replace -the document if both ``date`` and ``customerId`` match, otherwise insert the -document: - -.. literalinclude:: /includes/aggregation/Builders.cs - :start-after: // begin mergeOptions - :end-before: // end mergeOptions - :language: csharp - :dedent: diff --git a/source/aggregation/stages/out.txt b/source/aggregation/stages/out.txt deleted file mode 100644 index 908d8d5a..00000000 --- a/source/aggregation/stages/out.txt +++ /dev/null @@ -1,89 +0,0 @@ -.. _csharp-aggregation-out: - -=== -Out -=== - -.. facet:: - :name: genre - :values: reference - -.. meta:: - :keywords: code example, transform, pipeline - -.. contents:: On this page - :local: - :backlinks: none - :depth: 2 - :class: singlecol - -Out ---- - -Use the ``out()`` method to create an :manual:`$out ` -pipeline stage that writes all documents to the specified collection in -the same database. - -.. important:: - - The ``$out`` stage must be the last stage in any aggregation pipeline. - -The following example writes the results of the pipeline to the ``authors`` -collection: - -.. literalinclude:: /includes/aggregation/Builders.cs - :start-after: // begin out - :end-before: // end out - :language: csharp - :dedent: - -.. 
_csharp-builders-out: - -Write Pipeline Results to a Collection -~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ - -You can write the documents returned from an aggregation pipeline to a -collection by creating an ``$out`` stage at the end of your aggregation -pipeline. To create an ``$out`` stage, call the ``Out()`` method on a -``PipelineStageDefinitionBuilder``. The ``Out()`` method requires the name of -the collection you want to write the documents to. - -The following example builds an aggregation pipeline that matches all documents -with a ``season`` field value of ``"Spring"`` and outputs them to -a ``springFlowers`` collection: - -.. code-block:: csharp - - var outputCollection = database.GetCollection("springFlowers"); - var matchFilter = Builders.Filter.AnyEq(f => f.Season, "spring"); - - // Creates an aggregation pipeline and outputs resulting documents to a new collection. - var pipeline = new EmptyPipelineDefinition() - .Match(matchFilter) - .Out(outputCollection); - -You can write the results of an aggregation pipeline to a time series collection -by specifying a ``TimeSeriesOption`` object and passing it as the second -parameter to the ``Out()`` method. - -Imagine that the documents in the ``plants.flowers`` collection contain a ``datePlanted`` field that -holds BSON date values. You can store the documents in this collection in a time -series collection by using the ``datePlanted`` field as the time field. - -The following example creates a ``TimeSeriesOptions`` object and specifies -``datePlanted`` as the ``timeField``. It then builds an aggregation pipeline that matches all documents -with a ``season`` field value of ``"Spring"`` and outputs them to a -time series collection called ``springFlowerTimes``. - -.. code-block:: csharp - - var timeSeriesOptions = new TimeSeriesOptions("datePlanted"); - var collectionName = "springFlowerTimes" - var matchFilter = Builders.Filter.AnyEq(f => f.Season, "spring"); - - // Creates an aggregation pipeline and outputs resulting documents to a time series collection. - var pipeline = new EmptyPipelineDefinition() - .Match(matchFilter) - .Out(collectionName, timeSeriesOptions); - -To learn more about time series collections, see :ref:`csharp-time-series`. \ No newline at end of file diff --git a/source/aggregation/stages/project.txt b/source/aggregation/stages/project.txt deleted file mode 100644 index d503b6c4..00000000 --- a/source/aggregation/stages/project.txt +++ /dev/null @@ -1,95 +0,0 @@ -.. _csharp-aggregation-project: - -======= -Project -======= - -.. facet:: - :name: genre - :values: reference - -.. meta:: - :keywords: code example, transform, pipeline - -.. contents:: On this page - :local: - :backlinks: none - :depth: 2 - :class: singlecol - - -Project -------- - -Use the ``project()`` method to create a :manual:`$project ` -pipeline stage that project specified document fields. Field projection -in aggregation follows the same rules as :ref:`field projection in queries `. - -.. tip:: - - Though the projection can be an instance of any class that implements ``Bson``, - it's convenient to combine with use of :ref:`Projections `. - -The following example creates a pipeline stage that excludes the ``_id`` field but -includes the ``title`` and ``plot`` fields: - -.. literalinclude:: /includes/aggregation/Builders.cs - :start-after: begin project - :end-before: end project - :language: csharp - :dedent: - - -Projecting Computed Fields -~~~~~~~~~~~~~~~~~~~~~~~~~~ - -The ``$project`` stage can project computed fields as well. 
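For example, a computed projection that renames ``rated`` to ``rating`` might look
like the following sketch. The ``Movie`` class and the ``movieCollection`` variable
are assumed here for illustration and are not part of the referenced example:

.. code-block:: csharp

   // Projects the "rated" field into a new computed field named Rating
   var projection = Builders<Movie>.Projection
       .Expression(movie => new { Rating = movie.Rated });

   // Runs the aggregation and applies the computed projection
   var results = movieCollection.Aggregate()
       .Project(projection)
       .ToList();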
- -The following example creates a pipeline stage that projects the ``rated`` field -into a new field called ``rating``, effectively renaming the field. - -.. literalinclude:: /includes/aggregation/Builders.cs - :start-after: begin computed - :end-before: end computed - :language: csharp - :dedent: - -$project -~~~~~~~~ - -The ``$project`` aggregation stage returns a document containing only the specified -fields. - -Select the :guilabel:`Method Syntax` or :guilabel:`Query Syntax` tab to see how -to generate a ``$project`` stage using LINQ: - -.. tabs:: - - .. tab:: Method Syntax - :tabid: method-syntax - - .. code-block:: csharp - :emphasize-lines: 2 - - var query = queryableCollection - .Select(r => new { r.Name, r.Address }); - - .. tab:: Query Syntax - :tabid: query-syntax - - .. code-block:: csharp - :emphasize-lines: 2 - - var query = from r in queryableCollection - select new { r.Name, r.Address }; - -The result of the preceding example contains the following document: - -.. code-block:: json - - { "name" : "The Movable Feast", "address" : { "building" : "284", "coord" : [-73.982923900000003, 40.6580753], "street" : "Prospect Park West", "zipcode" : "11215" } } - -.. note:: Excluding the ``_id`` Field - - If you don't include the ``_id`` field in your LINQ projection, the {+driver-short+} - automatically excludes it from the results. diff --git a/source/aggregation/stages/rankFusion.txt b/source/aggregation/stages/rankFusion.txt deleted file mode 100644 index 22ae0fbd..00000000 --- a/source/aggregation/stages/rankFusion.txt +++ /dev/null @@ -1,18 +0,0 @@ -.. _csharp-aggregation-rankfusion: - -========== -RankFusion -========== - -.. facet:: - :name: genre - :values: reference - -.. meta:: - :keywords: code example, transform, pipeline - -.. contents:: On this page - :local: - :backlinks: none - :depth: 2 - :class: singlecol \ No newline at end of file diff --git a/source/aggregation/stages/replaceRoot.txt b/source/aggregation/stages/replaceRoot.txt deleted file mode 100644 index e2b141be..00000000 --- a/source/aggregation/stages/replaceRoot.txt +++ /dev/null @@ -1,34 +0,0 @@ -.. _csharp-aggregation-replaceroot: - -=========== -ReplaceRoot -=========== - -.. facet:: - :name: genre - :values: reference - -.. meta:: - :keywords: code example, transform, pipeline - -.. contents:: On this page - :local: - :backlinks: none - :depth: 2 - :class: singlecol - - -ReplaceRoot ------------ - -Use the ``replaceRoot()`` method to create a :manual:`$replaceRoot ` -pipeline stage that replaces each input document with the specified document. - -The following example replaces each input document with the nested document -in the ``spanish_translation`` field: - -.. literalinclude:: /includes/aggregation/Builders.cs - :start-after: // begin replaceRoot - :end-before: // end replaceRoot - :language: csharp - :dedent: \ No newline at end of file diff --git a/source/aggregation/stages/replaceWith.txt b/source/aggregation/stages/replaceWith.txt deleted file mode 100644 index 4e8a6af1..00000000 --- a/source/aggregation/stages/replaceWith.txt +++ /dev/null @@ -1,18 +0,0 @@ -.. _csharp-aggregation-replacewith: - -=========== -ReplaceWith -=========== - -.. facet:: - :name: genre - :values: reference - -.. meta:: - :keywords: code example, transform, pipeline - -.. 
contents:: On this page - :local: - :backlinks: none - :depth: 2 - :class: singlecol \ No newline at end of file diff --git a/source/aggregation/stages/sample.txt b/source/aggregation/stages/sample.txt deleted file mode 100644 index c5c90a77..00000000 --- a/source/aggregation/stages/sample.txt +++ /dev/null @@ -1,60 +0,0 @@ -.. _csharp-aggregation-sample: - -====== -Sample -====== - -.. facet:: - :name: genre - :values: reference - -.. meta:: - :keywords: code example, transform, pipeline - -.. contents:: On this page - :local: - :backlinks: none - :depth: 2 - :class: singlecol - - -Sample ------- - -Use the ``sample()`` method to create a :manual:`$sample ` -pipeline stage to randomly select documents from input. - -The following example creates a pipeline stage that randomly selects 5 documents: - -.. literalinclude:: /includes/aggregation/Builders.cs - :start-after: // begin sample - :end-before: // end sample - :language: csharp - :dedent: - - -$sample -~~~~~~~ - -The ``$sample`` aggregation stage returns a random sample of documents from a -collection. The following example shows how to generate a ``$sample`` stage by using -LINQ: - -.. code-block:: csharp - :emphasize-lines: 4 - - var query = queryableCollection - .Aggregate() - .Sample(4) - .ToList(); - -The result of the preceding example contains the following documents: - -.. code-block:: json - - // Results Truncated - - { "name" : "Von Dolhens", "cuisine" : "Ice Cream, Gelato, Yogurt, Ices" } - { "name" : "New York Mercantile Exchange", "cuisine" : "American" } - { "name" : "Michaelangelo's Restaurant", "cuisine" : "Italian" } - { "name" : "Charlie Palmer Steak", "cuisine" : "American" } \ No newline at end of file diff --git a/source/aggregation/stages/search.txt b/source/aggregation/stages/search.txt deleted file mode 100644 index a9b12c7b..00000000 --- a/source/aggregation/stages/search.txt +++ /dev/null @@ -1,45 +0,0 @@ -.. _csharp-aggregation-search: - -====== -Search -====== - -.. facet:: - :name: genre - :values: reference - -.. meta:: - :keywords: code example, transform, pipeline - -.. contents:: On this page - :local: - :backlinks: none - :depth: 2 - :class: singlecol - - -Atlas Full-Text Search ----------------------- - -Use the ``search()`` method to create a :manual:`$search ` -pipeline stage that specifies a full-text search of one or more fields. - -.. tip:: Only Available on Atlas for MongoDB v4.2 and later - - This aggregation pipeline operator is only available for collections hosted - on :atlas:`MongoDB Atlas ` clusters running v4.2 or later that are - covered by an :atlas:`Atlas search index `. - Learn more about the required setup and the functionality of this operator - from the :ref:`Atlas Search ` documentation. - -The following example creates a pipeline stage that searches the ``title`` -field for text that contains the word "Future": - -.. literalinclude:: /includes/fundamentals/code-snippets/builders/AggregateSearchBuilderExample.java - :start-after: // begin atlasTextSearch - :end-before: // end atlasTextSearch - :language: csharp - :dedent: - -Learn more about the builders from the -`search package API documentation <{+core-api+}/client/model/search/package-summary.html>`__. \ No newline at end of file diff --git a/source/aggregation/stages/searchMeta.txt b/source/aggregation/stages/searchMeta.txt deleted file mode 100644 index 1225ec80..00000000 --- a/source/aggregation/stages/searchMeta.txt +++ /dev/null @@ -1,46 +0,0 @@ -.. 
_csharp-aggregation-searchmeta: - -========== -SearchMeta -========== - -.. facet:: - :name: genre - :values: reference - -.. meta:: - :keywords: code example, transform, pipeline - -.. contents:: On this page - :local: - :backlinks: none - :depth: 2 - :class: singlecol - - -Atlas Search Metadata ---------------------- - -Use the ``searchMeta()`` method to create a -:manual:`$searchMeta ` -pipeline stage which returns only the metadata part of the results from -Atlas full-text search queries. - -.. tip:: Only Available on Atlas for MongoDB v4.4.11 and later - - This aggregation pipeline operator is only available - on :atlas:`MongoDB Atlas ` clusters running v4.4.11 and later. For a - detailed list of version availability, see the MongoDB Atlas documentation - on :atlas:`$searchMeta `. - -The following example shows the ``count`` metadata for an Atlas search -aggregation stage: - -.. literalinclude:: /includes/fundamentals/code-snippets/builders/AggregateSearchBuilderExample.java - :start-after: // begin atlasSearchMeta - :end-before: // end atlasSearchMeta - :language: csharp - :dedent: - -Learn more about this helper from the -`searchMeta() API documentation <{+core-api+}/client/model/Aggregates.html#searchMeta(com.mongodb.client.model.search.SearchCollector)>`__. \ No newline at end of file diff --git a/source/aggregation/stages/set.txt b/source/aggregation/stages/set.txt deleted file mode 100644 index f4c04063..00000000 --- a/source/aggregation/stages/set.txt +++ /dev/null @@ -1,18 +0,0 @@ -.. _csharp-aggregation-set: - -=== -Set -=== - -.. facet:: - :name: genre - :values: reference - -.. meta:: - :keywords: code example, transform, pipeline - -.. contents:: On this page - :local: - :backlinks: none - :depth: 2 - :class: singlecol \ No newline at end of file diff --git a/source/aggregation/stages/setWindowFields.txt b/source/aggregation/stages/setWindowFields.txt deleted file mode 100644 index e09948be..00000000 --- a/source/aggregation/stages/setWindowFields.txt +++ /dev/null @@ -1,44 +0,0 @@ -.. _csharp-aggregation-setwindowfields: - -=============== -SetWindowFields -=============== - -.. facet:: - :name: genre - :values: reference - -.. meta:: - :keywords: code example, transform, pipeline - -.. contents:: On this page - :local: - :backlinks: none - :depth: 2 - :class: singlecol - - -.. _builders-aggregates-setWindowFields: - -SetWindowFields ---------------- - -Use the ``setWindowFields()`` method to create a :manual:`$setWindowFields ` -pipeline stage that allows using window operators to perform operations -on a specified span of documents in a collection. - -.. tip:: Window Functions - - The driver includes the `Windows <{+core-api+}/client/model/Windows.html>`__ - class with static factory methods for building windowed computations. - -The following example creates a pipeline stage that computes the -accumulated rainfall and the average temperature over the past month for -each locality from more fine-grained measurements presented in the ``rainfall`` -and ``temperature`` fields: - -.. literalinclude:: /includes/aggregation/Builders.cs - :start-after: // begin setWindowFields - :end-before: // end setWindowFields - :language: csharp - :dedent: \ No newline at end of file diff --git a/source/aggregation/stages/skip.txt b/source/aggregation/stages/skip.txt deleted file mode 100644 index 7948594f..00000000 --- a/source/aggregation/stages/skip.txt +++ /dev/null @@ -1,61 +0,0 @@ -.. _csharp-aggregation-skip: - -==== -Skip -==== - -.. 
facet:: - :name: genre - :values: reference - -.. meta:: - :keywords: code example, transform, pipeline - -.. contents:: On this page - :local: - :backlinks: none - :depth: 2 - :class: singlecol - - -Skip ----- - -Use the ``skip()`` method to create a :manual:`$skip ` -pipeline stage to skip over the specified number of documents before -passing documents into the next stage. - -The following example creates a pipeline stage that skips the first ``5`` documents: - -.. literalinclude:: /includes/aggregation/Builders.cs - :start-after: // begin skip - :end-before: // end skip - :language: csharp - :dedent: - - -$skip -~~~~~ - -The ``$skip`` aggregation stage skips over a specified number of documents returned -by a query, then returns the rest of the results. The following example shows how to generate -a ``$skip`` stage using LINQ: - -.. code-block:: csharp - :emphasize-lines: 4 - - var query = queryableCollection - .Where(r => r.Cuisine == "Italian") - .Select(r => new {r.Name, r.Cuisine}) - .Skip(2); - -The preceding example skips the first two restaurants that match the criteria, and -returns the rest. The result contains the following documents: - -.. code-block:: json - - // Results Truncated - - { "name" : "Marchis Restaurant", "cuisine" : "Italian" } - { "name" : "Crystal Room", "cuisine" : "Italian" } - { "name" : "Forlinis Restaurant", "cuisine" : "Italian" } \ No newline at end of file diff --git a/source/aggregation/stages/sort.txt b/source/aggregation/stages/sort.txt deleted file mode 100644 index 3aea6d45..00000000 --- a/source/aggregation/stages/sort.txt +++ /dev/null @@ -1,87 +0,0 @@ -.. _csharp-aggregation-sort: - -==== -Sort -==== - -.. facet:: - :name: genre - :values: reference - -.. meta:: - :keywords: code example, transform, pipeline - -.. contents:: On this page - :local: - :backlinks: none - :depth: 2 - :class: singlecol - - -Sort ----- - -Use the ``sort()`` method to create a :manual:`$sort ` -pipeline stage to sort by the specified criteria. - -.. tip:: - - Though the sort criteria can be an instance of any class that - implements ``Bson``, it's convenient to combine with use of :ref:`Sorts `. - -The following example creates a pipeline stage that sorts in descending order according -to the value of the ``year`` field and then in ascending order according to the -value of the ``title`` field: - -.. literalinclude:: /includes/aggregation/Builders.cs - :start-after: // begin sortStage - :end-before: // end sortStage - :language: csharp - :dedent: - - -$sort -~~~~~ - -The ``$sort`` aggregation stage returns the results of your query in the order -that you specify. - -Select the :guilabel:`Method Syntax` or :guilabel:`Query Syntax` tab to see how -to generate an ``$sort`` stage using LINQ: - -.. tabs:: - - .. tab:: Method Syntax - :tabid: method-syntax - - .. code-block:: csharp - :emphasize-lines: 2 - - var query = queryableCollection - .OrderBy(r => r.Name) - .ThenByDescending(r => r.RestaurantId); - - .. tab:: Query Syntax - :tabid: query-syntax - - .. code-block:: csharp - :emphasize-lines: 2 - - var query = from r in queryableCollection - orderby r.Name, r.RestaurantId descending - select r; - -The preceding example returns the query results sorted alphabetically by the -``Name`` field, with a secondary descending sort on the ``RestaurantId`` field. -The following is a subset of the documents contained in the returned results: - -.. code-block:: json - - // Results Truncated - - ... 
- { "_id" : ObjectId(...), "name" : "Aba Turkish Restaurant", "restaurant_id" : "41548686", "cuisine" : "Turkish", "address" : {...}, "borough" : "Manhattan", "grades" : [...] } - { "_id" : ObjectId(...), "name" : "Abace Sushi", "restaurant_id" : "50006214", "cuisine" : "Japanese", "address" : { ... }, "borough" : "Manhattan", "grades" : [...] } - { "_id" : ObjectId(...), "name" : "Abacky Potluck", "restaurant_id" : "50011222", "cuisine" : "Asian", "address" : { ... }, "borough" : "Manhattan", "grades" : [...] } - { "_id" : ObjectId(...), "name" : "Abaleh", "restaurant_id" : "50009096", "cuisine" : "Mediterranean", "address" : { ... }, "borough" : "Manhattan", "grades" : [...] } - ... \ No newline at end of file diff --git a/source/aggregation/stages/sortByCount.txt b/source/aggregation/stages/sortByCount.txt deleted file mode 100644 index 0d5fb077..00000000 --- a/source/aggregation/stages/sortByCount.txt +++ /dev/null @@ -1,47 +0,0 @@ -.. _csharp-aggregation-sortbycount: - -=========== -SortByCount -=========== - -.. facet:: - :name: genre - :values: reference - -.. meta:: - :keywords: code example, transform, pipeline - -.. contents:: On this page - :local: - :backlinks: none - :depth: 2 - :class: singlecol - - -SortByCount ------------ - -Use the ``sortByCount()`` method to create a :manual:`$sortByCount ` -pipeline stage that groups documents by a given expression and then sorts -these groups by count in descending order. - -.. tip:: - - The ``$sortByCount`` stage is identical to a ``$group`` stage with a - ``$sum`` accumulator followed by a ``$sort`` stage. - - .. code-block:: json - - [ - { "$group": { "_id": , "count": { "$sum": 1 } } }, - { "$sort": { "count": -1 } } - ] - -The following example groups documents by the truncated value of the field ``x`` -and computes the count for each distinct value: - -.. literalinclude:: /includes/aggregation/Builders.cs - :start-after: // begin sortByCount - :end-before: // end sortByCount - :language: csharp - :dedent: \ No newline at end of file diff --git a/source/aggregation/stages/unionWith.txt b/source/aggregation/stages/unionWith.txt deleted file mode 100644 index ac2bade5..00000000 --- a/source/aggregation/stages/unionWith.txt +++ /dev/null @@ -1,18 +0,0 @@ -.. _csharp-aggregation-unionwith: - -========= -UnionWith -========= - -.. facet:: - :name: genre - :values: reference - -.. meta:: - :keywords: code example, transform, pipeline - -.. contents:: On this page - :local: - :backlinks: none - :depth: 2 - :class: singlecol \ No newline at end of file diff --git a/source/aggregation/stages/unwind.txt b/source/aggregation/stages/unwind.txt deleted file mode 100644 index fc580589..00000000 --- a/source/aggregation/stages/unwind.txt +++ /dev/null @@ -1,186 +0,0 @@ -.. _csharp-aggregation-unwind: - -====== -Unwind -====== - -.. facet:: - :name: genre - :values: reference - -.. meta:: - :keywords: code example, transform, pipeline - -.. contents:: On this page - :local: - :backlinks: none - :depth: 2 - :class: singlecol - - -Unwind ------- - -Use the ``unwind()`` method to create an :manual:`$unwind ` -pipeline stage to deconstruct an array field from input documents, creating -an output document for each array element. - -The following example creates a document for each element in the ``sizes`` array: - -.. 
literalinclude:: /includes/aggregation/Builders.cs - :start-after: // begin unwindStage - :end-before: // end unwindStage - :language: csharp - :dedent: - -To preserve documents that have missing or ``null`` -values for the array field, or where array is empty: - - -.. literalinclude:: /includes/aggregation/Builders.cs - :start-after: // begin unwindPreserve - :end-before: // end unwindPreserve - :language: csharp - :dedent: - -To include the array index, in this example in a field called ``"position"``: - -.. literalinclude:: /includes/aggregation/Builders.cs - :start-after: // begin unwindIndex - :end-before: // end unwindIndex - :language: csharp - :dedent: - -$unwind -~~~~~~~ - -The ``$unwind`` aggregation stage deconstructs a specified array field and returns -a document for each element in that array. - -Select the :guilabel:`Method Syntax` or :guilabel:`Query Syntax` tab to see how -to generate an ``$unwind`` stage using LINQ: - -.. tabs:: - - .. tab:: Method Syntax - :tabid: method-syntax - - .. code-block:: csharp - :emphasize-lines: 3 - - var query = queryableCollection - .Where(r => r.Name == "The Movable Feast") - .SelectMany(r => r.Grades); - - .. tab:: Query Syntax - :tabid: query-syntax - - .. code-block:: csharp - :emphasize-lines: 3 - - var query = from r in queryableCollection - where r.Name == "The Movable Feast" - from grade in r.Grades - select grade; - -The query in the preceding example finds the document where the ``Name`` field -has the value "The Movable Feast." Then, for each element in this document's -``Grades`` array, the query returns a new document. The result contains the -following documents: - -.. code-block:: json - - { "date" : ISODate("2014-11-19T00:00:00Z"), "grade" : "A", "score" : 11 } - { "date" : ISODate("2013-11-14T00:00:00Z"), "grade" : "A", "score" : 2 } - { "date" : ISODate("2012-12-05T00:00:00Z"), "grade" : "A", "score" : 13 } - { "date" : ISODate("2012-05-17T00:00:00Z"), "grade" : "A", "score" : 11 } - -Nested Statements -+++++++++++++++++ - -You can chain or nest ``Select`` and ``SelectMany`` statements to unwind nested -arrays. Consider a collection that contains documents with a **new** schema. These -documents contain a ``restaurants`` field, which holds an array of documents -represented by the ``Restaurant`` class. The documents within the array each have -a ``grades`` field that holds an array of documents represented by -the ``Grade`` class. The following code is an example of a single document in -this collection: - -.. code-block:: json - - { - "_id": { "$oid": ... }, - "restaurants": [ - { - "_id": { ... } , - "address": { ... }, - "name": "Tov Kosher Kitchen", - "grades": [ - { - "date" : ISODate("2014-11-24T00:00:00Z"), - "grade" : "Z", - "score" : 20.0 - }, - { - "date" : ISODate("2013-01-17T00:00:00Z"), - "grade" : "A", - "score" : 13.0 - } - ] - ... - }, - { - "_id": { ... } , - "address": { ... }, - "name": "Harriet's Kitchen", - "grades": [ - { - "date" : ISODate("2014-04-19T00:00:00Z"), - "grade" : "B", - "score" : 12.0 - } - ], - ... - }, - ... - ] - } - -You can nest ``SelectMany`` statements within ``SelectMany`` or ``Select`` -statements. The following example nests a ``SelectMany`` statement within a -``Select`` statement to retrieve an array from each document in the collection. -Each array holds all grade objects from all restaurants in each document. - -.. io-code-block:: - :copyable: true - - .. 
input:: /includes/fundamentals/code-examples/linq.cs - :language: csharp - :start-after: start-nested-SelectMany - :end-before: end-nested-SelectMany - - .. output:: - :visible: false - :language: json - - // output for first document in collection - [ - { "date" : ISODate("2014-11-24T00:00:00Z"), - "grade" : "Z", - "score" : 20.0 - }, - { "date" : ISODate("2013-01-17T00:00:00Z"), - "grade" : "A", - "score" : 13.0 - }, - { - "date" : ISODate("2014-04-19T00:00:00Z"), - "grade" : "B", - "score" : 12.0 - }, - ... - ], - // output for second document in collection - [ - ... - ] \ No newline at end of file diff --git a/source/aggregation/stages/vectorSearch.txt b/source/aggregation/stages/vectorSearch.txt deleted file mode 100644 index ed85e192..00000000 --- a/source/aggregation/stages/vectorSearch.txt +++ /dev/null @@ -1,97 +0,0 @@ -.. _csharp-aggregation-vectorsearch: - -============ -VectorSearch -============ - -.. facet:: - :name: genre - :values: reference - -.. meta:: - :keywords: code example, transform, pipeline - -.. contents:: On this page - :local: - :backlinks: none - :depth: 2 - :class: singlecol - - -$vectorSearch -~~~~~~~~~~~~~ - -The ``$vectorSearch`` aggregation stage performs an *approximate nearest neighbor* search -on a vector in the specified field. Your collection *must* have a -defined Atlas Vector Search index before you can perform a vector search on your data. - -.. tip:: - - To obtain the sample dataset used in the following example, see :ref:`csharp-get-started`. - To create the sample Atlas Vector Search index used in the following example, see - :atlas:`Create an Atlas Vector Search Index ` in the - Atlas manual. - -Consider the ``embedded_movies`` collection in the ``sample_mflix`` database. You -can use a ``$vectorSearch`` stage to perform a semantic search on the ``plot_embedding`` -field of the documents in the collection. - -The following ``EmbeddedMovie`` class models the documents in the ``embedded_movies`` -collection: - -.. code-block:: csharp - - [BsonIgnoreExtraElements] - public class EmbeddedMovie - { - [BsonIgnoreIfDefault] - public string Title { get; set; } - - public string Plot { get; set; } - - [BsonElement("plot_embedding")] - public double[] Embedding { get; set; } - } - -The following example shows how to generate a ``$vectorSearch`` stage to search -the ``plot_embedding`` field using vector embeddings for the string ``"time travel"``: - -.. 
code-block:: csharp - - // Defines vector embeddings for the string "time travel" - var vector = new[] {-0.0016261312,-0.028070757,-0.011342932,-0.012775794,-0.0027440966,0.008683807,-0.02575152,-0.02020668,-0.010283281,-0.0041719596,0.021392956,0.028657231,-0.006634482,0.007490867,0.018593878,0.0038187427,0.029590257,-0.01451522,0.016061379,0.00008528442,-0.008943722,0.01627464,0.024311995,-0.025911469,0.00022596726,-0.008863748,0.008823762,-0.034921836,0.007910728,-0.01515501,0.035801545,-0.0035688248,-0.020299982,-0.03145631,-0.032256044,-0.028763862,-0.0071576433,-0.012769129,0.012322609,-0.006621153,0.010583182,0.024085402,-0.001623632,0.007864078,-0.021406285,0.002554159,0.012229307,-0.011762793,0.0051682983,0.0048484034,0.018087378,0.024325324,-0.037694257,-0.026537929,-0.008803768,-0.017767483,-0.012642504,-0.0062712682,0.0009771782,-0.010409906,0.017754154,-0.004671795,-0.030469967,0.008477209,-0.005218282,-0.0058480743,-0.020153364,-0.0032805866,0.004248601,0.0051449724,0.006791097,0.007650814,0.003458861,-0.0031223053,-0.01932697,-0.033615597,0.00745088,0.006321252,-0.0038154104,0.014555207,0.027697546,-0.02828402,0.0066711367,0.0077107945,0.01794076,0.011349596,-0.0052715978,0.014755142,-0.019753495,-0.011156326,0.011202978,0.022126047,0.00846388,0.030549942,-0.0041386373,0.018847128,-0.00033655585,0.024925126,-0.003555496,-0.019300312,0.010749794,0.0075308536,-0.018287312,-0.016567878,-0.012869096,-0.015528221,0.0078107617,-0.011156326,0.013522214,-0.020646535,-0.01211601,0.055928253,0.011596181,-0.017247654,0.0005939711,-0.026977783,-0.003942035,-0.009583511,-0.0055248477,-0.028737204,0.023179034,0.003995351,0.0219661,-0.008470545,0.023392297,0.010469886,-0.015874773,0.007890735,-0.009690142,-0.00024970944,0.012775794,0.0114762215,0.013422247,0.010429899,-0.03686786,-0.006717788,-0.027484283,0.011556195,-0.036068123,-0.013915418,-0.0016327957,0.0151016945,-0.020473259,0.004671795,-0.012555866,0.0209531,0.01982014,0.024485271,0.0105431955,-0.005178295,0.033162415,-0.013795458,0.007150979,0.010243294,0.005644808,0.017260984,-0.0045618312,0.0024725192,0.004305249,-0.008197301,0.0014203656,0.0018460588,0.005015015,-0.011142998,0.01439526,0.022965772,0.02552493,0.007757446,-0.0019726837,0.009503538,-0.032042783,0.008403899,-0.04609149,0.013808787,0.011749465,0.036388017,0.016314628,0.021939443,-0.0250051,-0.017354285,-0.012962398,0.00006107364,0.019113706,0.03081652,-0.018114036,-0.0084572155,0.009643491,-0.0034721901,0.0072642746,-0.0090636825,0.01642126,0.013428912,0.027724205,0.0071243206,-0.6858542,-0.031029783,-0.014595194,-0.011449563,0.017514233,0.01743426,0.009950057,0.0029706885,-0.015714826,-0.001806072,0.011856096,0.026444625,-0.0010663156,-0.006474535,0.0016161345,-0.020313311,0.0148351155,-0.0018393943,0.0057347785,0.018300641,-0.018647194,0.03345565,-0.008070676,0.0071443142,0.014301958,0.0044818576,0.003838736,-0.007350913,-0.024525259,-0.001142124,-0.018620536,0.017247654,0.007037683,0.010236629,0.06046009,0.0138887605,-0.012122675,0.037694257,0.0055081863,0.042492677,0.00021784494,-0.011656162,0.010276617,0.022325981,0.005984696,-0.009496873,0.013382261,-0.0010563189,0.0026507939,-0.041639622,0.008637156,0.026471283,-0.008403899,0.024858482,-0.00066686375,-0.0016252982,0.027590916,0.0051449724,0.0058647357,-0.008743787,-0.014968405,0.027724205,-0.011596181,0.0047650975,-0.015381602,0.0043718936,0.002159289,0.035908177,-0.008243952,-0.030443309,0.027564257,0.042625964,-0.0033688906,0.01843393,0.019087048,0.024578573,0.03268257,-0.015608194,-0.014128681,-0.0033538956
,-0.0028757197,-0.004121976,-0.032389335,0.0034322033,0.058807302,0.010943064,-0.030523283,0.008903735,0.017500903,0.00871713,-0.0029406983,0.013995391,-0.03132302,-0.019660193,-0.00770413,-0.0038853872,0.0015894766,-0.0015294964,-0.006251275,-0.021099718,-0.010256623,-0.008863748,0.028550599,0.02020668,-0.0012962399,-0.003415542,-0.0022509254,0.0119360695,0.027590916,-0.046971202,-0.0015194997,-0.022405956,0.0016677842,-0.00018535563,-0.015421589,-0.031802863,0.03814744,0.0065411795,0.016567878,-0.015621523,0.022899127,-0.011076353,0.02841731,-0.002679118,-0.002342562,0.015341615,0.01804739,-0.020566562,-0.012989056,-0.002990682,0.01643459,0.00042527664,0.008243952,-0.013715484,-0.004835075,-0.009803439,0.03129636,-0.021432944,0.0012087687,-0.015741484,-0.0052016205,0.00080890034,-0.01755422,0.004811749,-0.017967418,-0.026684547,-0.014128681,0.0041386373,-0.013742141,-0.010056688,-0.013268964,-0.0110630235,-0.028337335,0.015981404,-0.00997005,-0.02424535,-0.013968734,-0.028310679,-0.027750863,-0.020699851,0.02235264,0.001057985,0.00081639783,-0.0099367285,0.013522214,-0.012016043,-0.00086471526,0.013568865,0.0019376953,-0.019020405,0.017460918,-0.023045745,0.008503866,0.0064678704,-0.011509543,0.018727167,-0.003372223,-0.0028690554,-0.0027024434,-0.011902748,-0.012182655,-0.015714826,-0.0098634185,0.00593138,0.018753825,0.0010146659,0.013029044,0.0003521757,-0.017620865,0.04102649,0.00552818,0.024485271,-0.009630162,-0.015608194,0.0006718621,-0.0008418062,0.012395918,0.0057980907,0.016221326,0.010616505,0.004838407,-0.012402583,0.019900113,-0.0034521967,0.000247002,-0.03153628,0.0011038032,-0.020819811,0.016234655,-0.00330058,-0.0032289368,0.00078973995,-0.021952773,-0.022459272,0.03118973,0.03673457,-0.021472929,0.0072109587,-0.015075036,0.004855068,-0.0008151483,0.0069643734,0.010023367,-0.010276617,-0.023019087,0.0068244194,-0.0012520878,-0.0015086699,0.022046074,-0.034148756,-0.0022192693,0.002427534,-0.0027124402,0.0060346797,0.015461575,0.0137554705,0.009230294,-0.009583511,0.032629255,0.015994733,-0.019167023,-0.009203636,0.03393549,-0.017274313,-0.012042701,-0.0009930064,0.026777849,-0.013582194,-0.0027590916,-0.017594207,-0.026804507,-0.0014236979,-0.022032745,0.0091236625,-0.0042419364,-0.00858384,-0.0033905501,-0.020739838,0.016821127,0.022539245,0.015381602,0.015141681,0.028817179,-0.019726837,-0.0051283115,-0.011489551,-0.013208984,-0.0047017853,-0.0072309524,0.01767418,0.0025658219,-0.010323267,0.012609182,-0.028097415,0.026871152,-0.010276617,0.021912785,0.0022542577,0.005124979,-0.0019710176,0.004518512,-0.040360045,0.010969722,-0.0031539614,-0.020366628,-0.025778178,-0.0110030435,-0.016221326,0.0036587953,0.016207997,0.003007343,-0.0032555948,0.0044052163,-0.022046074,-0.0008822095,-0.009363583,0.028230704,-0.024538586,0.0029840174,0.0016044717,-0.014181997,0.031349678,-0.014381931,-0.027750863,0.02613806,0.0004136138,-0.005748107,-0.01868718,-0.0010138329,0.0054348772,0.010703143,-0.003682121,0.0030856507,-0.004275259,-0.010403241,0.021113047,-0.022685863,-0.023032416,0.031429652,0.001792743,-0.005644808,-0.011842767,-0.04078657,-0.0026874484,0.06915057,-0.00056939584,-0.013995391,0.010703143,-0.013728813,-0.022939114,-0.015261642,-0.022485929,0.016807798,0.007964044,0.0144219175,0.016821127,0.0076241563,0.005461535,-0.013248971,0.015301628,0.0085171955,-0.004318578,0.011136333,-0.0059047225,-0.010249958,-0.018207338,0.024645219,0.021752838,0.0007614159,-0.013648839,0.01111634,-0.010503208,-0.0038487327,-0.008203966,-0.00397869,0.0029740208,0.008530525,0.005261601,0.01642
126,-0.0038753906,-0.013222313,0.026537929,0.024671877,-0.043505676,0.014195326,0.024778508,0.0056914594,-0.025951454,0.017620865,-0.0021359634,0.008643821,0.021299653,0.0041686273,-0.009017031,0.04044002,0.024378639,-0.027777521,-0.014208655,0.0028623908,0.042119466,0.005801423,-0.028124074,-0.03129636,0.022139376,-0.022179363,-0.04067994,0.013688826,0.013328944,0.0046184794,-0.02828402,-0.0063412455,-0.0046184794,-0.011756129,-0.010383247,-0.0018543894,-0.0018593877,-0.00052024535,0.004815081,0.014781799,0.018007403,0.01306903,-0.020433271,0.009043689,0.033189073,-0.006844413,-0.019766824,-0.018767154,0.00533491,-0.0024575242,0.018727167,0.0058080875,-0.013835444,0.0040719924,0.004881726,0.012029372,0.005664801,0.03193615,0.0058047553,0.002695779,0.009290274,0.02361889,0.017834127,0.0049017193,-0.0036388019,0.010776452,-0.019793482,0.0067777685,-0.014208655,-0.024911797,0.002385881,0.0034988478,0.020899786,-0.0025858153,-0.011849431,0.033189073,-0.021312982,0.024965113,-0.014635181,0.014048708,-0.0035921505,-0.003347231,0.030869836,-0.0017161017,-0.0061346465,0.009203636,-0.025165047,0.0068510775,0.021499587,0.013782129,-0.0024475274,-0.0051149824,-0.024445284,0.006167969,0.0068844,-0.00076183246,0.030150073,-0.0055948244,-0.011162991,-0.02057989,-0.009703471,-0.020646535,0.008004031,0.0066378145,-0.019900113,-0.012169327,-0.01439526,0.0044252095,-0.004018677,0.014621852,-0.025085073,-0.013715484,-0.017980747,0.0071043274,0.011456228,-0.01010334,-0.0035321703,-0.03801415,-0.012036037,-0.0028990454,-0.05419549,-0.024058744,-0.024272008,0.015221654,0.027964126,0.03182952,-0.015354944,0.004855068,0.011522872,0.004771762,0.0027874154,0.023405626,0.0004242353,-0.03132302,0.007057676,0.008763781,-0.0027057757,0.023005757,-0.0071176565,-0.005238275,0.029110415,-0.010989714,0.013728813,-0.009630162,-0.029137073,-0.0049317093,-0.0008630492,-0.015248313,0.0043219104,-0.0055681667,-0.013175662,0.029723546,0.025098402,0.012849103,-0.0009996708,0.03118973,-0.0021709518,0.0260181,-0.020526575,0.028097415,-0.016141351,0.010509873,-0.022965772,0.002865723,0.0020493253,0.0020509914,-0.0041419696,-0.00039695262,0.017287642,0.0038987163,0.014795128,-0.014661839,-0.008950386,0.004431874,-0.009383577,0.0012604183,-0.023019087,0.0029273694,-0.033135757,0.009176978,-0.011023037,-0.002102641,0.02663123,-0.03849399,-0.0044152127,0.0004527676,-0.0026924468,0.02828402,0.017727496,0.035135098,0.02728435,-0.005348239,-0.001467017,-0.019766824,0.014715155,0.011982721,0.0045651635,0.023458943,-0.0010046692,-0.0031373003,-0.0006972704,0.0019043729,-0.018967088,-0.024311995,0.0011546199,0.007977373,-0.004755101,-0.010016702,-0.02780418,-0.004688456,0.013022379,-0.005484861,0.0017227661,-0.015394931,-0.028763862,-0.026684547,0.0030589928,-0.018513903,0.028363993,0.0044818576,-0.009270281,0.038920518,-0.016008062,0.0093902415,0.004815081,-0.021059733,0.01451522,-0.0051583014,0.023765508,-0.017874114,-0.016821127,-0.012522544,-0.0028390652,0.0040886537,0.020259995,-0.031216389,-0.014115352,-0.009176978,0.010303274,0.020313311,0.0064112223,-0.02235264,-0.022872468,0.0052449396,0.0005723116,0.0037321046,0.016807798,-0.018527232,-0.009303603,0.0024858483,-0.0012662497,-0.007110992,0.011976057,-0.007790768,-0.042999174,-0.006727785,-0.011829439,0.007024354,0.005278262,-0.017740825,-0.0041519664,0.0085905045,0.027750863,-0.038387362,0.024391968,0.00087721116,0.010509873,-0.00038508154,-0.006857742,0.0183273,-0.0037054466,0.015461575,0.0017394272,-0.0017944091,0.014181997,-0.0052682655,0.009023695,0.00719763,-0.013522214,0.003442
2,0.014941746,-0.0016711164,-0.025298337,-0.017634194,0.0058714002,-0.005321581,0.017834127,0.0110630235,-0.03369557,0.029190388,-0.008943722,0.009363583,-0.0034222065,-0.026111402,-0.007037683,-0.006561173,0.02473852,-0.007084334,-0.010110005,-0.008577175,0.0030439978,-0.022712521,0.0054582027,-0.0012620845,-0.0011954397,-0.015741484,0.0129557345,-0.00042111133,0.00846388,0.008930393,0.016487904,0.010469886,-0.007917393,-0.011762793,-0.0214596,0.000917198,0.021672864,0.010269952,-0.007737452,-0.010243294,-0.0067244526,-0.015488233,-0.021552904,0.017127695,0.011109675,0.038067464,0.00871713,-0.0025591573,0.021312982,-0.006237946,0.034628596,-0.0045251767,0.008357248,0.020686522,0.0010696478,0.0076708077,0.03772091,-0.018700508,-0.0020676525,-0.008923728,-0.023298996,0.018233996,-0.010256623,0.0017860786,0.009796774,-0.00897038,-0.01269582,-0.018527232,0.009190307,-0.02372552,-0.042119466,0.008097334,-0.0066778013,-0.021046404,0.0019593548,0.011083017,-0.0016028056,0.012662497,-0.000059095124,0.0071043274,-0.014675168,0.024831824,-0.053582355,0.038387362,0.0005698124,0.015954746,0.021552904,0.031589597,-0.009230294,-0.0006147976,0.002625802,-0.011749465,-0.034362018,-0.0067844326,-0.018793812,0.011442899,-0.008743787,0.017474247,-0.021619547,0.01831397,-0.009037024,-0.0057247817,-0.02728435,0.010363255,0.034415334,-0.024032086,-0.0020126705,-0.0045518344,-0.019353628,-0.018340627,-0.03129636,-0.0034038792,-0.006321252,-0.0016161345,0.033642255,-0.000056075285,-0.005005019,0.004571828,-0.0024075406,-0.00010215386,0.0098634185,0.1980148,-0.003825407,-0.025191706,0.035161756,0.005358236,0.025111731,0.023485601,0.0023342315,-0.011882754,0.018287312,-0.0068910643,0.003912045,0.009243623,-0.001355387,-0.028603915,-0.012802451,-0.030150073,-0.014795128,-0.028630573,-0.0013487226,0.002667455,0.00985009,-0.0033972147,-0.021486258,0.009503538,-0.017847456,0.013062365,-0.014341944,0.005078328,0.025165047,-0.015594865,-0.025924796,-0.0018177348,0.010996379,-0.02993681,0.007324255,0.014475234,-0.028577257,0.005494857,0.00011725306,-0.013315615,0.015941417,0.009376912,0.0025158382,0.008743787,0.023832154,-0.008084005,-0.014195326,-0.008823762,0.0033455652,-0.032362677,-0.021552904,-0.0056081535,0.023298996,-0.025444955,0.0097301295,0.009736794,0.015274971,-0.0012937407,-0.018087378,-0.0039387033,0.008637156,-0.011189649,-0.00023846315,-0.011582852,0.0066411467,-0.018220667,0.0060846633,0.0376676,-0.002709108,0.0072776037,0.0034188742,-0.010249958,-0.0007747449,-0.00795738,-0.022192692,0.03910712,0.032122757,0.023898797,0.0076241563,-0.007397564,-0.003655463,0.011442899,-0.014115352,-0.00505167,-0.031163072,0.030336678,-0.006857742,-0.022259338,0.004048667,0.02072651,0.0030156737,-0.0042119464,0.00041861215,-0.005731446,0.011103011,0.013822115,0.021512916,0.009216965,-0.006537847,-0.027057758,-0.04054665,0.010403241,-0.0056281467,-0.005701456,-0.002709108,-0.00745088,-0.0024841821,0.009356919,-0.022659205,0.004061996,-0.013175662,0.017074378,-0.006141311,-0.014541878,0.02993681,-0.00028448965,-0.025271678,0.011689484,-0.014528549,0.004398552,-0.017274313,0.0045751603,0.012455898,0.004121976,-0.025458284,-0.006744446,0.011822774,-0.015035049,-0.03257594,0.014675168,-0.0039187097,0.019726837,-0.0047251107,0.0022825818,0.011829439,0.005391558,-0.016781142,-0.0058747325,0.010309938,-0.013049036,0.01186276,-0.0011246296,0.0062112883,0.0028190718,-0.021739509,0.009883412,-0.0073175905,-0.012715813,-0.017181009,-0.016607866,-0.042492677,-0.0014478565,-0.01794076,0.012302616,-0.015194997,-0.04433207,-0.020606548,0
.009696807,0.010303274,-0.01694109,-0.004018677,0.019353628,-0.001991011,0.000058938927,0.010536531,-0.17274313,0.010143327,0.014235313,-0.024152048,0.025684876,-0.0012504216,0.036601283,-0.003698782,0.0007310093,0.004165295,-0.0029157067,0.017101036,-0.046891227,-0.017460918,0.022965772,0.020233337,-0.024072073,0.017220996,0.009370248,0.0010363255,0.0194336,-0.019606877,0.01818068,-0.020819811,0.007410893,0.0019326969,0.017887443,0.006651143,0.00067394477,-0.011889419,-0.025058415,-0.008543854,0.021579562,0.0047484366,0.014062037,0.0075508473,-0.009510202,-0.009143656,0.0046817916,0.013982063,-0.0027990784,0.011782787,0.014541878,-0.015701497,-0.029350337,0.021979429,0.01332228,-0.026244693,-0.0123492675,-0.003895384,0.0071576433,-0.035454992,-0.00046984528,0.0033522295,0.039347045,0.0005119148,0.00476843,-0.012995721,0.0024042083,-0.006931051,-0.014461905,-0.0127558,0.0034555288,-0.0074842023,-0.030256703,-0.007057676,-0.00807734,0.007804097,-0.006957709,0.017181009,-0.034575284,-0.008603834,-0.005008351,-0.015834786,0.02943031,0.016861115,-0.0050849924,0.014235313,0.0051449724,0.0025924798,-0.0025741523,0.04289254,-0.002104307,0.012969063,-0.008310596,0.00423194,0.0074975314,0.0018810473,-0.014248641,-0.024725191,0.0151016945,-0.017527562,0.0018727167,0.0002830318,0.015168339,0.0144219175,-0.004048667,-0.004358565,0.011836103,-0.010343261,-0.005911387,0.0022825818,0.0073175905,0.00403867,0.013188991,0.03334902,0.006111321,0.008597169,0.030123414,-0.015474904,0.0017877447,-0.024551915,0.013155668,0.023525586,-0.0255116,0.017220996,0.004358565,-0.00934359,0.0099967085,0.011162991,0.03092315,-0.021046404,-0.015514892,0.0011946067,-0.01816735,0.010876419,-0.10124666,-0.03550831,0.0056348112,0.013942076,0.005951374,0.020419942,-0.006857742,-0.020873128,-0.021259667,0.0137554705,0.0057880944,-0.029163731,-0.018767154,-0.021392956,0.030896494,-0.005494857,-0.0027307675,-0.006801094,-0.014821786,0.021392956,-0.0018110704,-0.0018843795,-0.012362596,-0.0072176233,-0.017194338,-0.018713837,-0.024272008,0.03801415,0.00015880188,0.0044951867,-0.028630573,-0.0014070367,-0.00916365,-0.026537929,-0.009576847,-0.013995391,-0.0077107945,0.0050016865,0.00578143,-0.04467862,0.008363913,0.010136662,-0.0006268769,-0.006591163,0.015341615,-0.027377652,-0.00093136,0.029243704,-0.020886457,-0.01041657,-0.02424535,0.005291591,-0.02980352,-0.009190307,0.019460259,-0.0041286405,0.004801752,0.0011787785,-0.001257086,-0.011216307,-0.013395589,0.00088137644,-0.0051616337,0.03876057,-0.0033455652,0.00075850025,-0.006951045,-0.0062112883,0.018140694,-0.006351242,-0.008263946,0.018154023,-0.012189319,0.0075508473,-0.044358727,-0.0040153447,0.0093302615,-0.010636497,0.032789204,-0.005264933,-0.014235313,-0.018393943,0.007297597,-0.016114693,0.015021721,0.020033404,0.0137688,0.0011046362,0.010616505,-0.0039453674,0.012109346,0.021099718,-0.0072842683,-0.019153694,-0.003768759,0.039320387,-0.006747778,-0.0016852784,0.018154023,0.0010963057,-0.015035049,-0.021033075,-0.04345236,0.017287642,0.016341286,-0.008610498,0.00236922,0.009290274,0.028950468,-0.014475234,-0.0035654926,0.015434918,-0.03372223,0.004501851,-0.012929076,-0.008483873,-0.0044685286,-0.0102233,0.01615468,0.0022792495,0.010876419,-0.0059647025,0.01895376,-0.0069976957,-0.0042952523,0.017207667,-0.00036133936,0.0085905045,0.008084005,0.03129636,-0.016994404,-0.014915089,0.020100048,-0.012009379,-0.006684466,0.01306903,0.00015765642,-0.00530492,0.0005277429,0.015421589,0.015528221,0.032202728,-0.003485519,-0.0014286962,0.033908837,0.001367883,0.010509873,0.0252
71678,-0.020993087,0.019846799,0.006897729,-0.010216636,-0.00725761,0.01818068,-0.028443968,-0.011242964,-0.014435247,-0.013688826,0.006101324,-0.0022509254,0.013848773,-0.0019077052,0.017181009,0.03422873,0.005324913,-0.0035188415,0.014128681,-0.004898387,0.005038341,0.0012320944,-0.005561502,-0.017847456,0.0008538855,-0.0047884234,0.011849431,0.015421589,-0.013942076,0.0029790192,-0.013702155,0.0001199605,-0.024431955,0.019926772,0.022179363,-0.016487904,-0.03964028,0.0050849924,0.017487574,0.022792496,0.0012504216,0.004048667,-0.00997005,0.0076041627,-0.014328616,-0.020259995,0.0005598157,-0.010469886,0.0016852784,0.01716768,-0.008990373,-0.001987679,0.026417969,0.023792166,0.0046917885,-0.0071909656,-0.00032051947,-0.023259008,-0.009170313,0.02071318,-0.03156294,-0.030869836,-0.006324584,0.013795458,-0.00047151142,0.016874444,0.00947688,0.00985009,-0.029883493,0.024205362,-0.013522214,-0.015075036,-0.030603256,0.029270362,0.010503208,0.021539574,0.01743426,-0.023898797,0.022019416,-0.0068777353,0.027857494,-0.021259667,0.0025758184,0.006197959,0.006447877,-0.00025200035,-0.004941706,-0.021246338,-0.005504854,-0.008390571,-0.0097301295,0.027244363,-0.04446536,0.05216949,0.010243294,-0.016008062,0.0122493,-0.0199401,0.009077012,0.019753495,0.006431216,-0.037960835,-0.027377652,0.016381273,-0.0038620618,0.022512587,-0.010996379,-0.0015211658,-0.0102233,0.007071005,0.008230623,-0.009490209,-0.010083347,0.024431955,0.002427534,0.02828402,0.0035721571,-0.022192692,-0.011882754,0.010056688,0.0011904413,-0.01426197,-0.017500903,-0.00010985966,0.005591492,-0.0077707744,-0.012049366,0.011869425,0.00858384,-0.024698535,-0.030283362,0.020140035,0.011949399,-0.013968734,0.042732596,-0.011649498,-0.011982721,-0.016967745,-0.0060913274,-0.007130985,-0.013109017,-0.009710136}; - - // Specifies that the vector search will consider the 150 nearest neighbors - // in the specified index - var options = new VectorSearchOptions() - { - IndexName = "vector_index", - NumberOfCandidates = 150 - }; - - // Builds aggregation pipeline and specifies that the $vectorSearch stage - // returns 10 results - var results = queryableCollection - .VectorSearch(m => m.Embedding, vector, 10, options) - .Select(m => new { m.Title, m.Plot }); - -The results of the preceding example contain the following documents: - -.. code-block:: json - - { "_id" : ObjectId("573a13a0f29313caabd04a4f"), "plot" : "A reporter, learning of time travelers visiting 20th century disasters, tries to change the history they know by averting upcoming disasters.", "title" : "Thrill Seekers" } - { "_id" : ObjectId("573a13d8f29313caabda6557"), "plot" : "At the age of 21, Tim discovers he can travel in time and change what happens and has happened in his own life. His decision to make his world a better place by getting a girlfriend turns out not to be as easy as you might think.", "title" : "About Time" } - { "_id" : ObjectId("573a13a5f29313caabd13b4b"), "plot" : "Hoping to alter the events of the past, a 19th century inventor instead travels 800,000 years into the future, where he finds humankind divided into two warring races.", "title" : "The Time Machine" } - { "_id" : ObjectId("573a13aef29313caabd2e2d7"), "plot" : "After using his mother's newly built time machine, Dolf gets stuck involuntary in the year 1212. 
He ends up in a children's crusade where he confronts his new friends with modern techniques...", "title" : "Crusade in Jeans" } - { "_id" : ObjectId("573a1399f29313caabceec0e"), "plot" : "An officer for a security agency that regulates time travel, must fend for his life against a shady politician who has a tie to his past.", "title" : "Timecop" } - { "_id" : ObjectId("573a1399f29313caabcee36f"), "plot" : "A time-travel experiment in which a robot probe is sent from the year 2073 to the year 1973 goes terribly wrong thrusting one of the project scientists, a man named Nicholas Sinclair into a...", "title" : "A.P.E.X." } - { "_id" : ObjectId("573a13c6f29313caabd715d3"), "plot" : "Agent J travels in time to M.I.B.'s early days in 1969 to stop an alien from assassinating his friend Agent K and changing history.", "title" : "Men in Black 3" } - { "_id" : ObjectId("573a13d4f29313caabd98c13"), "plot" : "Bound by a shared destiny, a teen bursting with scientific curiosity and a former boy-genius inventor embark on a mission to unearth the secrets of a place somewhere in time and space that exists in their collective memory.", "title" : "Tomorrowland" } - { "_id" : ObjectId("573a13b6f29313caabd477fa"), "plot" : "With the help of his uncle, a man travels to the future to try and bring his girlfriend back to life.", "title" : "Love Story 2050" } - { "_id" : ObjectId("573a13e5f29313caabdc40c9"), "plot" : "A dimension-traveling wizard gets stuck in the 21st century because cell-phone radiation interferes with his magic. With his home world on the brink of war, he seeks help from a jaded ...", "title" : "The Portal" } - -For more information about Atlas Vector Search, Atlas Vector Search indexes, and how -to incorporate them into your application, see :atlas:`Atlas Vector Search Overview ` -in the Atlas manual. For more examples about running Atlas Vector Search queries using the -{+driver-short+}, see :atlas:`Run Vector Search Queries ` -in the Atlas manual and select :guilabel:`C#` from the language dropdown. \ No newline at end of file From 8778f5f8aeb0ecb914773033469f77ee40246fd3 Mon Sep 17 00:00:00 2001 From: Mike Woofter <108414937+mongoKart@users.noreply.github.com> Date: Tue, 15 Apr 2025 08:36:54 -0500 Subject: [PATCH 08/17] edits --- source/aggregation.txt | 31 +++---------------------------- 1 file changed, 3 insertions(+), 28 deletions(-) diff --git a/source/aggregation.txt b/source/aggregation.txt index bd26a7bb..0e7d7410 100644 --- a/source/aggregation.txt +++ b/source/aggregation.txt @@ -34,8 +34,8 @@ return computed results. The MongoDB Aggregation framework is modeled on the concept of data processing pipelines. Documents enter a pipeline comprised of one or more stages, and this pipeline transforms the documents into an aggregated result. -To learn more about the Aggregation Pipeline, see the -:manual:`Aggregation Pipeline ` server manual page. +To learn more about the aggregation stages supported by the {+driver-short+}, see +:ref:`Aggregation Stages `. Analogy ~~~~~~~ @@ -96,8 +96,6 @@ performing aggregation operations: the `AllowDiskUse <{+new-api-root+}/MongoDB.Driver/MongoDB.Driver.AggregateOptions.AllowDiskUse.html#MongoDB_Driver_AggregateOptions_AllowDiskUse>`__ property of the ``AggregateOptions`` object that you pass to the ``Aggregate()`` method. 
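A minimal sketch of enabling this option follows. The ``collection`` variable and the
``pipeline`` definition are assumed for illustration and are not part of the original
example:

.. code-block:: csharp

   // Lets stages that exceed the 100 MB per-stage memory limit write
   // temporary files to disk instead of failing the aggregation
   var options = new AggregateOptions { AllowDiskUse = true };

   // Passes the options object to Aggregate() along with the pipeline
   var results = collection.Aggregate(pipeline, options).ToList();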
-Link to Stages page - Troubleshooting --------------- @@ -106,32 +104,9 @@ Troubleshooting Additional Information ---------------------- -MongoDB Server Manual -~~~~~~~~~~~~~~~~~~~~~ - To view a full list of expression operators, see :manual:`Aggregation Operators `. -To learn more about assembling an aggregation pipeline and view examples, see -:manual:`Aggregation Pipeline `. - -To learn more about creating pipeline stages, see -:manual:`Aggregation Stages `. - To learn about explaining MongoDB aggregation operations, see :manual:`Explain Results ` and -:manual:`Query Plans `. - -API Documentation -~~~~~~~~~~~~~~~~~ - -For more information about the aggregation operations discussed in this guide, see the -following API documentation: - -- `Aggregate() <{+new-api-root+}/MongoDB.Driver/MongoDB.Driver.IMongoCollection-1.Aggregate.html>`__ -- `AggregateOptions <{+new-api-root+}/MongoDB.Driver/MongoDB.Driver.AggregateOptions.html>`__ -- `Group() <{+new-api-root+}/MongoDB.Driver/MongoDB.Driver.PipelineStageDefinitionBuilder.Group.html>`__ -- `Match() <{+new-api-root+}/MongoDB.Driver/MongoDB.Driver.PipelineStageDefinitionBuilder.Match.html>`__ -- `Where() <{+new-api-root+}/MongoDB.Driver/MongoDB.Driver.Linq.MongoQueryable.Where.html>`__ -- `GroupBy() <{+new-api-root+}/MongoDB.Driver/MongoDB.Driver.Linq.MongoQueryable.GroupBy.html>`__ -- `Select() <{+new-api-root+}/MongoDB.Driver/MongoDB.Driver.Linq.MongoQueryable.Select.html>`__ \ No newline at end of file +:manual:`Query Plans `. \ No newline at end of file From 64576092924a55e257af1784e93eee41d1d7d277 Mon Sep 17 00:00:00 2001 From: Mike Woofter <108414937+mongoKart@users.noreply.github.com> Date: Tue, 15 Apr 2025 08:38:01 -0500 Subject: [PATCH 09/17] add stages to snooty --- snooty.toml | 3 ++- 1 file changed, 2 insertions(+), 1 deletion(-) diff --git a/snooty.toml b/snooty.toml index 33ce9b2c..00aa0586 100644 --- a/snooty.toml +++ b/snooty.toml @@ -2,7 +2,8 @@ toc_landing_pages = [ "/get-started", "/connect/connection-options", "/security/authentication", - "/aggregation" + "/aggregation", + "/aggregation/stages" ] name = "csharp" title = "C#/.NET" From a53b35b05f9180e0db3441b3ec6824c7d77aa70d Mon Sep 17 00:00:00 2001 From: Mike Woofter <108414937+mongoKart@users.noreply.github.com> Date: Tue, 15 Apr 2025 08:46:00 -0500 Subject: [PATCH 10/17] small fixes --- source/aggregation/stages.txt | 10 +++++----- 1 file changed, 5 insertions(+), 5 deletions(-) diff --git a/source/aggregation/stages.txt b/source/aggregation/stages.txt index f481d150..0090cfa6 100644 --- a/source/aggregation/stages.txt +++ b/source/aggregation/stages.txt @@ -82,9 +82,8 @@ BsonDocument Some aggregation stages don't have corresponding methods in the {+driver-short+}. To add these stages to your pipeline, use ``BsonDocument`` objects or string literals to construct a stage in the Query API syntax. Then, pass the BSON document to the -`PipelineDefinitionBuilder.AppendStage() <{+new-api-root+}/MongoDB.Driver/MongoDB.Driver.PipelineDefinitionBuilder.AppendStage.html>`__ -method. This syntax supports all stages in the aggregation pipeline, but doesn't provide -type hints or type safety to your code. +``PipelineDefinitionBuilder.AppendStage()`` method. This syntax supports all stages +in the aggregation pipeline, but doesn't provide type hints or type safety. The following code example shows how to add ``$unset``, an aggregation stage without a corresponding method, to an empty aggregation pipeline: @@ -112,7 +111,7 @@ to your pipeline. .. 
list-table:: :header-rows: 1 - :widths: 20 80 + :widths: 20 60 20 * - Aggregation Stage - Description @@ -342,4 +341,5 @@ following API documentation: - `Aggregate() <{+new-api-root+}/MongoDB.Driver/MongoDB.Driver.IMongoCollection-1.Aggregate.html>`__ - `AggregateOptions <{+new-api-root+}/MongoDB.Driver/MongoDB.Driver.AggregateOptions.html>`__ - `EmptyPipelineDefinition <{+new-api-root+}/MongoDB.Driver/MongoDB.Driver.EmptyPipelineDefinition-1.-ctor.html>`__ -- `BsonDocument <{+new-api-root+}/MongoDB.Bson/MongoDB.Bson.BsonDocument.html>`__ \ No newline at end of file +- `BsonDocument <{+new-api-root+}/MongoDB.Bson/MongoDB.Bson.BsonDocument.html>`__ +- `PipelineDefinitionBuilder.AppendStage() <{+new-api-root+}/MongoDB.Driver/MongoDB.Driver.PipelineDefinitionBuilder.AppendStage.html>`__ \ No newline at end of file From e932b5afe7b4b9b5047058a31eb170816917e8ba Mon Sep 17 00:00:00 2001 From: Mike Woofter <108414937+mongoKart@users.noreply.github.com> Date: Tue, 15 Apr 2025 08:51:53 -0500 Subject: [PATCH 11/17] table width --- source/aggregation/stages.txt | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/source/aggregation/stages.txt b/source/aggregation/stages.txt index 0090cfa6..84972924 100644 --- a/source/aggregation/stages.txt +++ b/source/aggregation/stages.txt @@ -111,7 +111,7 @@ to your pipeline. .. list-table:: :header-rows: 1 - :widths: 20 60 20 + :widths: 40 40 20 * - Aggregation Stage - Description From 1fb55d0eba0754e46799c195d10590301000ff16 Mon Sep 17 00:00:00 2001 From: Mike Woofter <108414937+mongoKart@users.noreply.github.com> Date: Tue, 15 Apr 2025 09:00:02 -0500 Subject: [PATCH 12/17] table widths --- source/aggregation/stages.txt | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/source/aggregation/stages.txt b/source/aggregation/stages.txt index 84972924..0090cfa6 100644 --- a/source/aggregation/stages.txt +++ b/source/aggregation/stages.txt @@ -111,7 +111,7 @@ to your pipeline. .. list-table:: :header-rows: 1 - :widths: 40 40 20 + :widths: 20 60 20 * - Aggregation Stage - Description From 9d51161db561f9068cbd7f297fd85fc889477d9a Mon Sep 17 00:00:00 2001 From: Mike Woofter <108414937+mongoKart@users.noreply.github.com> Date: Tue, 15 Apr 2025 09:15:10 -0500 Subject: [PATCH 13/17] table fix --- source/aggregation/stages.txt | 438 +++++++++++++++++----------------- 1 file changed, 219 insertions(+), 219 deletions(-) diff --git a/source/aggregation/stages.txt b/source/aggregation/stages.txt index 0090cfa6..34684253 100644 --- a/source/aggregation/stages.txt +++ b/source/aggregation/stages.txt @@ -21,7 +21,7 @@ Overview -------- On this page, you can learn how to create an aggregation pipeline and add stages to it -by using methods in the {+driver-short+}. +by using methods in the {+driver-short+}. Build an Aggregation Pipeline ----------------------------- @@ -35,7 +35,7 @@ approaches. Builders ~~~~~~~~ -You can build a type-safe aggregation pipeline in the following ways: +You can build a type-safe aggregation pipeline in the following ways: - Construct an ``EmptyPipelineDefinition`` object. Chain calls from this object to the relevant aggregation methods. Then, pass the pipeline object to the @@ -44,11 +44,11 @@ You can build a type-safe aggregation pipeline in the following ways: - Call the ``IMongoCollection.Aggregate()`` method. Chain calls from this method call to the relevant aggregation methods. 
-Select the :guilabel:`EmptyPipelineDefinition` +Select the :guilabel:`EmptyPipelineDefinition` or :guilabel:`Aggregate` tab to see the corresponding code for each approach: .. tabs:: - + .. tab:: EmptyPipelineDefinition :tabid: empty-pipeline-definition @@ -81,7 +81,7 @@ BsonDocument Some aggregation stages don't have corresponding methods in the {+driver-short+}. To add these stages to your pipeline, use ``BsonDocument`` objects or string literals -to construct a stage in the Query API syntax. Then, pass the BSON document to the +to construct a stage in the Query API syntax. Then, pass the BSON document to the ``PipelineDefinitionBuilder.AppendStage()`` method. This syntax supports all stages in the aggregation pipeline, but doesn't provide type hints or type safety. @@ -92,7 +92,7 @@ stage without a corresponding method, to an empty aggregation pipeline: var pipeline = new EmptyPipelineDefinition().AppendStage("{ $unset: "field1" }"); -.. note:: +.. note:: If you use a ``BsonDocument`` to define a pipeline stage, the driver doesn't recognize any ``BsonClassMap`` attributes, serialization attributes, or @@ -113,218 +113,218 @@ to your pipeline. :header-rows: 1 :widths: 20 60 20 - * - Aggregation Stage - - Description - - Builders Method - - * - :manual:`$bucket ` - - Categorizes incoming documents into groups, called buckets, - based on a specified expression and bucket boundaries. - - :ref:`Bucket() ` - - * - :manual:`$bucketAuto ` - - Categorizes incoming documents into a specific number of - groups, called buckets, based on a specified expression. - Bucket boundaries are automatically determined in an attempt - to evenly distribute the documents into the specified number - of buckets. - - :ref:`BucketAuto() ` - - * - :manual:`$changeStream ` - - Returns a change stream cursor for the - collection. This stage can occur only once in an aggregation - pipeline and it must occur as the first stage. - - :ref:`ChangeStream() ` - - * - :manual:`$changeStreamSplitLargeEvent ` - - Splits large change stream events that exceed 16 MB into smaller fragments returned - in a change stream cursor. - - You can use ``$changeStreamSplitLargeEvent`` only in a ``$changeStream`` pipeline, and - it must be the final stage in the pipeline. - - :ref:`ChangeStreamSplitLargeEvent() ` - - * - :manual:`$count ` - - Returns a count of the number of documents at this stage of - the aggregation pipeline. - - :ref:`Count() ` - - * - :manual:`$densify ` - - Creates new documents in a sequence of documents where certain values in a field are missing. - - :ref:`Densify() ` - - * - :manual:`$documents ` - - Returns literal documents from input expressions. - - :ref:`Documents() ` - - * - :manual:`$facet ` - - Processes multiple aggregation pipelines - within a single stage on the same set - of input documents. Enables the creation of multi-faceted - aggregations capable of characterizing data across multiple - dimensions, or facets, in a single stage. - - :ref:`Facet() ` - - * - :manual:`$graphLookup ` - - Performs a recursive search on a collection. To each output - document, adds a new array field that contains the traversal - results of the recursive search for that document. - - :ref:`GraphLookup() ` - - * - :manual:`$group ` - - Groups input documents by a specified identifier expression - and applies the accumulator expressions, if specified, to - each group. Consumes all input documents and outputs one - document per each distinct group. 
The output documents - contain only the identifier field and, if specified, accumulated - fields. - - :ref:`Group() ` - - * - :manual:`$limit ` - - Passes the first *n* documents unmodified to the pipeline, - where *n* is the specified limit. For each input document, - outputs either one document (for the first *n* documents) or - zero documents (after the first *n* documents). - - :ref:`Limit() ` - - * - :manual:`$lookup ` - - Performs a left outer join to another collection in the - *same* database to filter in documents from the "joined" - collection for processing. - - :ref:`Lookup() ` - - * - :manual:`$match ` - - Filters the document stream to allow only matching documents - to pass unmodified into the next pipeline stage. - For each input document, outputs either one document (a match) or zero - documents (no match). - - :ref:`Match() ` - - * - :manual:`$merge ` - - Writes the resulting documents of the aggregation pipeline to - a collection. The stage can incorporate (insert new - documents, merge documents, replace documents, keep existing - documents, fail the operation, process documents with a - custom update pipeline) the results into an output - collection. To use this stage, it must be - the last stage in the pipeline. - - :ref:`Merge() ` - - * - :manual:`$out ` - - Writes the resulting documents of the aggregation pipeline to - a collection. To use this stage, it must be - the last stage in the pipeline. - - :ref:`Out() ` - - * - :manual:`$project ` - - Reshapes each document in the stream, such as by adding new - fields or removing existing fields. For each input document, - outputs one document. - - :ref:`Project() ` - - * - :manual:`$rankFusion ` - - Uses a rank fusion algorithm to combine results from a Vector Search - query and an Atlas Search query. - - :ref:`RankFusion() ` - - * - :manual:`$replaceRoot ` - - Replaces a document with the specified embedded document. The - operation replaces all existing fields in the input document, - including the ``_id`` field. Specify a document embedded in - the input document to promote the embedded document to the - top level. - - The ``$replaceWith`` stage is an alias for the ``$replaceRoot`` stage. - - :ref:`ReplaceRoot() ` - - * - :manual:`$replaceWith ` - - Replaces a document with the specified embedded document. - The operation replaces all existing fields in the input document, including - the ``_id`` field. Specify a document embedded in the input document to promote - the embedded document to the top level. - - The ``$replaceWith`` stage is an alias for the ``$replaceRoot`` stage. - - :ref:`ReplaceWith() ` - - * - :manual:`$sample ` - - Randomly selects the specified number of documents from its - input. - - :ref:`Sample() ` - - * - :manual:`$search ` - - Performs a full-text search of the field or fields in an - :atlas:`Atlas ` - collection. - - This stage is available only for MongoDB Atlas clusters, and is not - available for self-managed deployments. To learn more, see - :atlas:`Atlas Search Aggregation Pipeline Stages - ` in the Atlas documentation. - - :ref:`Search() ` - - * - :manual:`$searchMeta ` - - Returns different types of metadata result documents for the - :atlas:`Atlas Search ` query against an - :atlas:`Atlas ` - collection. - - This stage is available only for MongoDB Atlas clusters, - and is not available for self-managed deployments. To learn - more, see :atlas:`Atlas Search Aggregation Pipeline Stages - ` in the Atlas documentation. 
- - :ref:`SearchMeta() ` - - * - :manual:`$set ` - - Adds new fields to documents. Like the ``Project()`` method, - this method reshapes each - document in the stream by adding new fields to - output documents that contain both the existing fields - from the input documents and the newly added fields. - - :ref:`Set() ` - - * - :manual:`$setWindowFields ` - - Groups documents into windows and applies one or more - operators to the documents in each window. - - :ref:`SetWindowFields() ` - - * - :manual:`$skip ` - - Skips the first *n* documents, where *n* is the specified skip - number, and passes the remaining documents unmodified to the - pipeline. For each input document, outputs either zero - documents (for the first *n* documents) or one document (if - after the first *n* documents). - - :ref:`Skip() ` - - * - :manual:`$sort ` - - Reorders the document stream by a specified sort key. The documents remain unmodified. - For each input document, outputs one document. - - :ref:`Sort() ` - - * - :manual:`$sortByCount ` - - Groups incoming documents based on the value of a specified - expression, then computes the count of documents in each - distinct group. - - :ref:`SortByCount() ` - - * - :manual:`$unionWith ` - - Combines pipeline results from two collections into a single - result set. - - :ref:`UnionWith() ` - - * - :manual:`$unwind ` - - Deconstructs an array field from the input documents to - output a document for *each* element. Each output document - replaces the array with an element value. For each input - document, outputs *n* Documents, where *n* is the number of - array elements. *n* can be zero for an empty array. - - :ref:`Unwind() ` - - * - :manual:`$vectorSearch ` - - Performs an :abbr:`ANN (Approximate Nearest Neighbor)` or - :abbr:`ENN (Exact Nearest Neighbor)` search on a - vector in the specified field of an - :atlas:`Atlas ` collection. - - :ref:`VectorSearch() ` + * - Aggregation Stage + - Description + - Builders Method + + * - :manual:`$bucket ` + - Categorizes incoming documents into groups, called buckets, + based on a specified expression and bucket boundaries. + - :ref:`Bucket() ` + + * - :manual:`$bucketAuto ` + - Categorizes incoming documents into a specific number of + groups, called buckets, based on a specified expression. + Bucket boundaries are automatically determined in an attempt + to evenly distribute the documents into the specified number + of buckets. + - :ref:`BucketAuto() ` + + * - :manual:`$changeStream ` + - Returns a change stream cursor for the + collection. This stage can occur only once in an aggregation + pipeline and it must occur as the first stage. + - :ref:`ChangeStream() ` + + * - :manual:`$changeStreamSplitLargeEvent ` + - Splits large change stream events that exceed 16 MB into smaller fragments returned + in a change stream cursor. + + You can use ``$changeStreamSplitLargeEvent`` only in a ``$changeStream`` pipeline, and + it must be the final stage in the pipeline. + - :ref:`ChangeStreamSplitLargeEvent() ` + + * - :manual:`$count ` + - Returns a count of the number of documents at this stage of + the aggregation pipeline. + - :ref:`Count() ` + + * - :manual:`$densify ` + - Creates new documents in a sequence of documents where certain values in a field are missing. + - :ref:`Densify() ` + + * - :manual:`$documents ` + - Returns literal documents from input expressions. + - :ref:`Documents() ` + + * - :manual:`$facet ` + - Processes multiple aggregation pipelines + within a single stage on the same set + of input documents. 
Enables the creation of multi-faceted + aggregations capable of characterizing data across multiple + dimensions, or facets, in a single stage. + - :ref:`Facet() ` + + * - :manual:`$graphLookup ` + - Performs a recursive search on a collection. To each output + document, adds a new array field that contains the traversal + results of the recursive search for that document. + - :ref:`GraphLookup() ` + + * - :manual:`$group ` + - Groups input documents by a specified identifier expression + and applies the accumulator expressions, if specified, to + each group. Consumes all input documents and outputs one + document per each distinct group. The output documents + contain only the identifier field and, if specified, accumulated + fields. + - :ref:`Group() ` + + * - :manual:`$limit ` + - Passes the first *n* documents unmodified to the pipeline, + where *n* is the specified limit. For each input document, + outputs either one document (for the first *n* documents) or + zero documents (after the first *n* documents). + - :ref:`Limit() ` + + * - :manual:`$lookup ` + - Performs a left outer join to another collection in the + *same* database to filter in documents from the "joined" + collection for processing. + - :ref:`Lookup() ` + + * - :manual:`$match ` + - Filters the document stream to allow only matching documents + to pass unmodified into the next pipeline stage. + For each input document, outputs either one document (a match) or zero + documents (no match). + - :ref:`Match() ` + + * - :manual:`$merge ` + - Writes the resulting documents of the aggregation pipeline to + a collection. The stage can incorporate (insert new + documents, merge documents, replace documents, keep existing + documents, fail the operation, process documents with a + custom update pipeline) the results into an output + collection. To use this stage, it must be + the last stage in the pipeline. + - :ref:`Merge() ` + + * - :manual:`$out ` + - Writes the resulting documents of the aggregation pipeline to + a collection. To use this stage, it must be + the last stage in the pipeline. + - :ref:`Out() ` + + * - :manual:`$project ` + - Reshapes each document in the stream, such as by adding new + fields or removing existing fields. For each input document, + outputs one document. + - :ref:`Project() ` + + * - :manual:`$rankFusion ` + - Uses a rank fusion algorithm to combine results from a Vector Search + query and an Atlas Search query. + - :ref:`RankFusion() ` + + * - :manual:`$replaceRoot ` + - Replaces a document with the specified embedded document. The + operation replaces all existing fields in the input document, + including the ``_id`` field. Specify a document embedded in + the input document to promote the embedded document to the + top level. + + The ``$replaceWith`` stage is an alias for the ``$replaceRoot`` stage. + - :ref:`ReplaceRoot() ` + + * - :manual:`$replaceWith ` + - Replaces a document with the specified embedded document. + The operation replaces all existing fields in the input document, including + the ``_id`` field. Specify a document embedded in the input document to promote + the embedded document to the top level. + + The ``$replaceWith`` stage is an alias for the ``$replaceRoot`` stage. + - :ref:`ReplaceWith() ` + + * - :manual:`$sample ` + - Randomly selects the specified number of documents from its + input. + - :ref:`Sample() ` + + * - :manual:`$search ` + - Performs a full-text search of the field or fields in an + :atlas:`Atlas ` + collection. 
+ + This stage is available only for MongoDB Atlas clusters, and is not + available for self-managed deployments. To learn more, see + :atlas:`Atlas Search Aggregation Pipeline Stages + ` in the Atlas documentation. + - :ref:`Search() ` + + * - :manual:`$searchMeta ` + - Returns different types of metadata result documents for the + :atlas:`Atlas Search ` query against an + :atlas:`Atlas ` + collection. + + This stage is available only for MongoDB Atlas clusters, + and is not available for self-managed deployments. To learn + more, see :atlas:`Atlas Search Aggregation Pipeline Stages + ` in the Atlas documentation. + - :ref:`SearchMeta() ` + + * - :manual:`$set ` + - Adds new fields to documents. Like the ``Project()`` method, + this method reshapes each + document in the stream by adding new fields to + output documents that contain both the existing fields + from the input documents and the newly added fields. + - :ref:`Set() ` + + * - :manual:`$setWindowFields ` + - Groups documents into windows and applies one or more + operators to the documents in each window. + - :ref:`SetWindowFields() ` + + * - :manual:`$skip ` + - Skips the first *n* documents, where *n* is the specified skip + number, and passes the remaining documents unmodified to the + pipeline. For each input document, outputs either zero + documents (for the first *n* documents) or one document (if + after the first *n* documents). + - :ref:`Skip() ` + + * - :manual:`$sort ` + - Reorders the document stream by a specified sort key. The documents remain unmodified. + For each input document, outputs one document. + - :ref:`Sort() ` + + * - :manual:`$sortByCount ` + - Groups incoming documents based on the value of a specified + expression, then computes the count of documents in each + distinct group. + - :ref:`SortByCount() ` + + * - :manual:`$unionWith ` + - Combines pipeline results from two collections into a single + result set. + - :ref:`UnionWith() ` + + * - :manual:`$unwind ` + - Deconstructs an array field from the input documents to + output a document for *each* element. Each output document + replaces the array with an element value. For each input + document, outputs *n* Documents, where *n* is the number of + array elements. *n* can be zero for an empty array. + - :ref:`Unwind() ` + + * - :manual:`$vectorSearch ` + - Performs an :abbr:`ANN (Approximate Nearest Neighbor)` or + :abbr:`ENN (Exact Nearest Neighbor)` search on a + vector in the specified field of an + :atlas:`Atlas ` collection. + - :ref:`VectorSearch() ` API Documentation ----------------- @@ -332,7 +332,7 @@ API Documentation To learn more about assembling an aggregation pipeline, see :manual:`Aggregation Pipeline `. -To learn more about creating pipeline stages, see +To learn more about creating pipeline stages, see :manual:`Aggregation Stages `. For more information about the methods and classes used on this page, see the From aac403a66f71af85d07db4c59b030d829e641829 Mon Sep 17 00:00:00 2001 From: Mike Woofter <108414937+mongoKart@users.noreply.github.com> Date: Tue, 15 Apr 2025 09:30:12 -0500 Subject: [PATCH 14/17] table widths --- source/aggregation/stages.txt | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/source/aggregation/stages.txt b/source/aggregation/stages.txt index 34684253..34918ad8 100644 --- a/source/aggregation/stages.txt +++ b/source/aggregation/stages.txt @@ -111,7 +111,7 @@ to your pipeline. .. 
list-table:: :header-rows: 1 - :widths: 20 60 20 + :widths: 30 40 30 * - Aggregation Stage - Description From 11ecea448714ee11806226f15ecc3304fa9ccbe2 Mon Sep 17 00:00:00 2001 From: Mike Woofter <108414937+mongoKart@users.noreply.github.com> Date: Tue, 15 Apr 2025 09:35:17 -0500 Subject: [PATCH 15/17] table widths --- source/aggregation/stages.txt | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/source/aggregation/stages.txt b/source/aggregation/stages.txt index 34918ad8..3decb1a7 100644 --- a/source/aggregation/stages.txt +++ b/source/aggregation/stages.txt @@ -111,7 +111,7 @@ to your pipeline. .. list-table:: :header-rows: 1 - :widths: 30 40 30 + :widths: 28 44 28 * - Aggregation Stage - Description From 1129cc82a7804e4b307299e30b432deb2a2e450f Mon Sep 17 00:00:00 2001 From: Mike Woofter <108414937+mongoKart@users.noreply.github.com> Date: Tue, 15 Apr 2025 12:08:36 -0500 Subject: [PATCH 16/17] rr feedback --- source/aggregation.txt | 2 +- source/aggregation/stages.txt | 54 +++++++++++++++++++++-------------- 2 files changed, 34 insertions(+), 22 deletions(-) diff --git a/source/aggregation.txt b/source/aggregation.txt index 0e7d7410..037ec1a5 100644 --- a/source/aggregation.txt +++ b/source/aggregation.txt @@ -21,7 +21,7 @@ Aggregation Operations :titlesonly: :maxdepth: 1 - Stages + Pipeline Stages Overview -------- diff --git a/source/aggregation/stages.txt b/source/aggregation/stages.txt index 3decb1a7..c08b15f6 100644 --- a/source/aggregation/stages.txt +++ b/source/aggregation/stages.txt @@ -1,8 +1,8 @@ .. _csharp-aggregation-stages: -================== -Aggregation Stages -================== +=========================== +Aggregation Pipeline Stages +=========================== .. facet:: :name: genre @@ -20,20 +20,20 @@ Aggregation Stages Overview -------- -On this page, you can learn how to create an aggregation pipeline and add stages to it +On this page, you can learn how to create an aggregation pipeline and pipeline stages by using methods in the {+driver-short+}. Build an Aggregation Pipeline ----------------------------- -You can use the {+driver-short+} to build an aggregation pipeline by using builders +You can use the {+driver-short+} to build an aggregation pipeline by using builder methods or BSON documents. See the following sections to learn more about each of these approaches. -.. _csharp-aggregation-stages-builders: +.. _csharp-aggregation-stages-builder: -Builders -~~~~~~~~ +Builder Methods +~~~~~~~~~~~~~~~ You can build a type-safe aggregation pipeline in the following ways: @@ -61,7 +61,7 @@ or :guilabel:`Aggregate` tab to see the corresponding code for each approach: .Merge(...); // Executes the aggregation pipeline - var results = collection.Aggregate(pipeline).ToList(); + var results = collection.Aggregate(pipeline); .. tab:: Aggregate :tabid: aggregate @@ -86,13 +86,14 @@ to construct a stage in the Query API syntax. Then, pass the BSON document to th in the aggregation pipeline, but doesn't provide type hints or type safety. The following code example shows how to add ``$unset``, an aggregation -stage without a corresponding method, to an empty aggregation pipeline: +stage without a corresponding builder method, to an empty aggregation pipeline: .. code-block:: csharp - var pipeline = new EmptyPipelineDefinition().AppendStage("{ $unset: "field1" }"); + var pipeline = new EmptyPipelineDefinition() + .AppendStage("{ $unset: "field1" }"); -.. note:: +.. 
important:: If you use a ``BsonDocument`` to define a pipeline stage, the driver doesn't recognize any ``BsonClassMap`` attributes, serialization attributes, or @@ -102,10 +103,14 @@ stage without a corresponding method, to an empty aggregation pipeline: Aggregation Stage Methods ------------------------- -The following table lists the builders methods in the {+driver-short+} that correspond -to stages in the aggregation pipeline. For more information about an aggregation stage, -click the stage name. For more information about a builders method, click the -method name. If an aggregation stage isn't in the table, you must use the +The following table lists the builder methods in the {+driver-short+} that correspond +to stages in the aggregation pipeline. To learn more about an aggregation stage, +follow the link from the method name to its reference page in the {+mdb-server+} manual. +To learn more about a builder method, follow the link from the method name to its +dedicated page. + +If an aggregation stage isn't in the table, the driver doesn't provide a builder method for +it. In this case, you must use the :ref:`BsonDocument ` syntax to add the stage to your pipeline. @@ -115,7 +120,7 @@ to your pipeline. * - Aggregation Stage - Description - - Builders Method + - Builder Method * - :manual:`$bucket ` - Categorizes incoming documents into groups, called buckets, @@ -166,8 +171,8 @@ to your pipeline. - :ref:`Facet() ` * - :manual:`$graphLookup ` - - Performs a recursive search on a collection. To each output - document, adds a new array field that contains the traversal + - Performs a recursive search on a collection. This method adds + a new array field to each output document that contains the traversal results of the recursive search for that document. - :ref:`GraphLookup() ` @@ -324,16 +329,23 @@ to your pipeline. :abbr:`ENN (Exact Nearest Neighbor)` search on a vector in the specified field of an :atlas:`Atlas ` collection. + + This stage is available only for MongoDB Atlas clusters, and is not + available for self-managed deployments. To learn more, see + :atlas:`Atlas Search Aggregation Pipeline Stages + ` in the Atlas documentation. - :ref:`VectorSearch() ` API Documentation ----------------- To learn more about assembling an aggregation pipeline, see -:manual:`Aggregation Pipeline `. +:manual:`Aggregation Pipeline ` in the {+mdb-server+} +manual. To learn more about creating pipeline stages, see -:manual:`Aggregation Stages `. +:manual:`Aggregation Stages ` in the +{+mdb-server+} manual. For more information about the methods and classes used on this page, see the following API documentation: From 900c1521336b1fe55af64ca415427aaddc3f2e9f Mon Sep 17 00:00:00 2001 From: Mike Woofter <108414937+mongoKart@users.noreply.github.com> Date: Tue, 15 Apr 2025 12:57:18 -0500 Subject: [PATCH 17/17] rr feedback --- source/aggregation/stages.txt | 9 ++++----- 1 file changed, 4 insertions(+), 5 deletions(-) diff --git a/source/aggregation/stages.txt b/source/aggregation/stages.txt index c08b15f6..94d54fd4 100644 --- a/source/aggregation/stages.txt +++ b/source/aggregation/stages.txt @@ -85,13 +85,13 @@ to construct a stage in the Query API syntax. Then, pass the BSON document to th ``PipelineDefinitionBuilder.AppendStage()`` method. This syntax supports all stages in the aggregation pipeline, but doesn't provide type hints or type safety. 
-The following code example shows how to add ``$unset``, an aggregation -stage without a corresponding builder method, to an empty aggregation pipeline: +The following code example shows how to add the ``$unset`` stage to an empty aggregation +pipeline: .. code-block:: csharp var pipeline = new EmptyPipelineDefinition() - .AppendStage("{ $unset: "field1" }"); + .AppendStage("{ $unset: 'field1' }"); .. important:: @@ -332,8 +332,7 @@ to your pipeline. This stage is available only for MongoDB Atlas clusters, and is not available for self-managed deployments. To learn more, see - :atlas:`Atlas Search Aggregation Pipeline Stages - ` in the Atlas documentation. + :ref:`Atlas Vector Search `. - :ref:`VectorSearch() ` API Documentation
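The following sketch shows how an appended Query API stage can be combined with typed
builder stages in a single pipeline. The generic type arguments, field names, and the
``movies`` collection shown here are illustrative assumptions rather than part of the
preceding examples:

.. code-block:: csharp

   using MongoDB.Bson;
   using MongoDB.Driver;

   // Assumes an existing IMongoDatabase named `database`; substitute your own
   // database and collection, or a POCO type instead of BsonDocument.
   var collection = database.GetCollection<BsonDocument>("movies");

   var pipeline = new EmptyPipelineDefinition<BsonDocument>()
       // Typed builder stage: keep only documents where the year field equals 1994.
       .Match(Builders<BsonDocument>.Filter.Eq("year", 1994))
       // Appended stage in Query API syntax; this stage has no type hints or type safety.
       // The explicit generic arguments are an assumption about the AppendStage() overload
       // being used; consult the API documentation for the exact signature.
       .AppendStage<BsonDocument, BsonDocument, BsonDocument>("{ $unset: 'plot' }");

   var results = collection.Aggregate(pipeline).ToList();

Placing the untyped ``AppendStage()`` call after the typed builder stages keeps
compile-time checking for as much of the pipeline as possible; only the appended
stage opts out of type safety.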