altair.FieldOrDatumDefWithConditionMarkPropFieldDefGradientstringnull

class altair.FieldOrDatumDefWithConditionMarkPropFieldDefGradientstringnull(aggregate=Undefined, bandPosition=Undefined, bin=Undefined, condition=Undefined, field=Undefined, legend=Undefined, scale=Undefined, sort=Undefined, timeUnit=Undefined, title=Undefined, type=Undefined, **kwds)

FieldOrDatumDefWithConditionMarkPropFieldDefGradientstringnull schema wrapper

Mapping(required=[]) A FieldDef with Condition { condition: {value: …}, field: …, … }

Attributes
aggregate : Aggregate

Aggregation function for the field (e.g., "mean", "sum", "median", "min", "max", "count" ).

Default value: undefined (None)

See also: aggregate documentation.
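
A minimal sketch of how this property surfaces through a mark-property channel such as alt.Color; the DataFrame and column names here are hypothetical:

    import altair as alt
    import pandas as pd

    df = pd.DataFrame({"category": list("AABB"), "value": [1, 3, 2, 6]})

    chart = alt.Chart(df).mark_bar().encode(
        x="category:N",
        y="mean(value):Q",
        # Keyword form of the aggregate property; equivalent to the
        # shorthand color="mean(value):Q".
        color=alt.Color(field="value", type="quantitative", aggregate="mean"),
    )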

bandPosition : float

Relative position on a band of a stacked, binned, time unit, or band scale. For example, the marks will be positioned at the beginning of the band if set to 0, and at the middle of the band if set to 0.5.

bin : anyOf(boolean, BinParams, None)

A flag for binning a quantitative field, an object defining binning parameters, or indicating that the data for x or y channel are binned before they are imported into Vega-Lite ( "binned" ).

If true, default binning parameters will be applied.

If "binned", this indicates that the data for the x (or y ) channel are already binned. You can map the bin-start field to x (or y ) and the bin-end field to x2 (or y2 ). The scale and axis will be formatted similar to binning in Vega-Lite. To adjust the axis ticks based on the bin step, you can also set the axis’s tickMinStep property.

Default value: false

See also: bin documentation.
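
As a hedged example on arbitrary data: bin=True applies the default binning parameters, while an alt.Bin(...) object customizes them:

    import altair as alt
    import pandas as pd

    df = pd.DataFrame({"x": [1, 2, 2, 3, 5, 8, 13, 21], "y": [1, 2, 1, 3, 2, 4, 3, 5]})

    chart = alt.Chart(df).mark_point().encode(
        x="x:Q",
        y="y:Q",
        # bin=True would use default bin parameters; BinParams customize them.
        color=alt.Color(field="x", type="quantitative", bin=alt.Bin(maxbins=4)),
    )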

condition : anyOf(ConditionalValueDefGradientstringnullExprRef, List(ConditionalValueDefGradientstringnullExprRef))

One or more value definition(s) with a parameter or a test predicate.

Note: A field definition’s condition property can only contain conditional value definitions since Vega-Lite only allows at most one encoded field per encoding channel.
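
A sketch of the usual way such a conditioned field definition is produced, via alt.condition() with an interval selection (assuming Altair 5, where selections are attached with add_params; the data and names are illustrative):

    import altair as alt
    import pandas as pd

    df = pd.DataFrame({"x": range(5), "y": [2, 4, 1, 5, 3]})

    brush = alt.selection_interval()

    chart = alt.Chart(df).mark_point().encode(
        x="x:Q",
        y="y:Q",
        # Inside the brush the color encodes the field; outside it falls back
        # to a plain value, so the field definition's condition holds a value
        # definition, matching the note above.
        color=alt.condition(brush, alt.Color("y:Q"), alt.value("lightgray")),
    ).add_params(brush)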

field : Field

Required. A string defining the name of the field from which to pull a data value or an object defining iterated values from the repeat operator.

See also: field documentation.

Notes: 1) Dots ( . ) and brackets ( [ and ] ) can be used to access nested objects (e.g., "field": "foo.bar" and "field": "foo['bar']" ). If field names contain dots or brackets but are not nested, you can use \ to escape dots and brackets (e.g., "a\.b" and "a\[0\]" ). See more details about escaping in the field documentation. 2) field is not required if aggregate is count.
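
A small sketch of the dot/escape behaviour using inline records (values and field names made up): a plain dot traverses a nested object, while an escaped dot refers to a flat field whose name literally contains a dot:

    import altair as alt

    data = alt.Data(values=[
        {"address": {"zip": 94105}, "a.b": 1},
        {"address": {"zip": 10001}, "a.b": 2},
    ])

    chart = alt.Chart(data).mark_point().encode(
        x=alt.X(field="address.zip", type="quantitative"),     # nested access
        color=alt.Color(field=r"a\.b", type="quantitative"),   # literal "a.b" field
    )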

legend : anyOf(Legend, None)

An object defining properties of the legend. If null, the legend for the encoding channel will be removed.

Default value: If undefined, default legend properties are applied.

See also: legend documentation.
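
For illustration on toy data: an alt.Legend(...) object customizes the legend, while legend=None removes it for the channel:

    import altair as alt
    import pandas as pd

    df = pd.DataFrame({"category": list("ABC"), "value": [3, 1, 2]})

    base = alt.Chart(df).mark_bar().encode(x="category:N", y="value:Q")

    customized = base.encode(
        color=alt.Color("category:N", legend=alt.Legend(title="Group", orient="left"))
    )
    removed = base.encode(color=alt.Color("category:N", legend=None))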

scale : anyOf(Scale, None)

An object defining properties of the channel’s scale, which is the function that transforms values in the data domain (numbers, dates, strings, etc) to visual values (pixels, colors, sizes) of the encoding channels.

If null, the scale will be disabled and the data value will be directly encoded.

Default value: If undefined, default scale properties are applied.

See also: scale documentation.
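
A hedged sketch with arbitrary data: an alt.Scale(...) object customizes the color scale, and scale=None would instead disable the scale so data values are used directly as colors:

    import altair as alt
    import pandas as pd

    df = pd.DataFrame({"x": range(6), "y": [1, 3, 2, 5, 4, 6]})

    chart = alt.Chart(df).mark_point().encode(
        x="x:Q",
        y="y:Q",
        # Use a named scheme for the quantitative color scale.
        color=alt.Color("y:Q", scale=alt.Scale(scheme="viridis")),
    )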

sort : Sort

Sort order for the encoded field.

For continuous fields (quantitative or temporal), sort can be either "ascending" or "descending".

For discrete fields, sort can be one of the following:

  • "ascending" or "descending" – for sorting by the values' natural order in JavaScript.

  • A string indicating an encoding channel name to sort by (e.g., "x" or "y"), with an optional minus prefix for descending sort (e.g., "-x" to sort by the x-field, descending).

  • A sort field definition for sorting by another field.

  • An array of field values in the preferred order.

  • null indicating no sort.

Default value: "ascending"

Note: null and sorting by another channel is not supported for row and column.

See also: sort documentation.
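
Two illustrative sort forms on toy data: a channel-based sort ("-y" for descending by the y channel) and an explicit array of field values:

    import altair as alt
    import pandas as pd

    df = pd.DataFrame({"category": ["B", "A", "C", "A"], "value": [4, 1, 2, 3]})

    chart = alt.Chart(df).mark_bar().encode(
        x=alt.X("category:N", sort="-y"),                      # sort by y, descending
        y="sum(value):Q",
        color=alt.Color("category:N", sort=["C", "A", "B"]),   # explicit value order
    )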

timeUnit : anyOf(TimeUnit, TimeUnitParams)

Time unit (e.g., year, yearmonth, month, hours) for a temporal field, or a temporal field that gets cast as ordinal.

Default value: undefined (None)

See also: timeUnit documentation.
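
An illustrative sketch (dates made up) showing the keyword form of timeUnit next to the equivalent shorthand:

    import altair as alt
    import pandas as pd

    df = pd.DataFrame({
        "date": pd.to_datetime(["2023-01-05", "2023-01-20", "2023-02-10", "2023-03-02"]),
        "value": [1, 2, 3, 4],
    })

    chart = alt.Chart(df).mark_bar().encode(
        x="yearmonth(date):T",                  # shorthand form
        y="sum(value):Q",
        color=alt.Color(field="date", type="temporal", timeUnit="yearmonth"),
    )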

title : anyOf(Text, None)

A title for the field. If null, the title will be removed.

Default value: derived from the field’s name and transformation function ( aggregate, bin and timeUnit ). If the field has an aggregate function, the function is displayed as part of the title (e.g., "Sum of Profit" ). If the field is binned or has a time unit applied, the applied function is shown in parentheses (e.g., "Profit (binned)", "Transaction Date (year-month)" ). Otherwise, the title is simply the field name.

Notes :

1) You can customize the default field title format by providing the fieldTitle property in the config or fieldTitle function via the compile function’s options.

2) If both field definition’s title and axis, header, or legend title are defined, axis/header/legend title will be used.
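
For example (column names hypothetical), title overrides the derived legend title; passing title=None would remove it entirely:

    import altair as alt
    import pandas as pd

    df = pd.DataFrame({"hp": [90, 130, 200], "mpg": [30, 24, 15]})

    chart = alt.Chart(df).mark_point().encode(
        x="hp:Q",
        y="mpg:Q",
        color=alt.Color("hp:Q", title="Horsepower (hp)"),  # instead of the default "hp"
    )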

type : StandardType

The type of measurement ("quantitative", "temporal", "ordinal", or "nominal") for the encoded field or constant value (datum). It can also be a "geojson" type for encoding 'geoshape'.

Vega-Lite automatically infers data types in many cases as discussed below. However, type is required for a field if: (1) the field is not nominal and the field encoding has no specified aggregate (except argmin and argmax ), bin, scale type, custom sort order, nor timeUnit or (2) if you wish to use an ordinal scale for a field with bin or timeUnit.

Default value:

1) For a data field, "nominal" is the default data type unless the field encoding has aggregate, channel, bin, scale type, sort, or timeUnit that satisfies the following criteria:

  • "quantitative" is the default type if (1) the encoded field contains bin or aggregate except "argmin" and "argmax", (2) the encoding channel is latitude or longitude channel or (3) if the specified scale type is a quantitative scale.

  • "temporal" is the default type if (1) the encoded field contains timeUnit or (2) the specified scale type is a time or utc scale

  • "ordinal" is the default type if (1) the encoded field contains a custom sort order, (2) the specified scale type is an ordinal/point/band scale, or (3) the encoding channel is order.

2) For a constant value in data domain (datum):

  • "quantitative" if the datum is a number

  • "nominal" if the datum is a string

  • "temporal" if the datum is a date time object

Note:

  • Data type describes the semantics of the data rather than the primitive data types (number, string, etc.). The same primitive data type can have different types of measurement. For example, numeric data can represent quantitative, ordinal, or nominal data.

  • Data values for a temporal field can be either a date-time string (e.g., "2015-03-07 12:32:17", "17:01", "2015-03-16", "2015") or a timestamp number (e.g., 1552199579097).

  • When using with bin, the type property can be either "quantitative" (for using a linear bin scale) or "ordinal" (for using an ordinal bin scale).

  • When using with timeUnit, the type property can be either "temporal" (default, for using a temporal scale) or "ordinal" (for using an ordinal scale).

  • When using with aggregate, the type property refers to the post-aggregation data type. For example, we can calculate count distinct of a categorical field "cat" using {"aggregate": "distinct", "field": "cat"}. The "type" of the aggregate output is "quantitative".

  • Secondary channels (e.g., x2, y2, xError, yError ) do not have type as they must have exactly the same type as their primary channels (e.g., x, y ).

See also: type documentation.
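
A sketch on toy data showing that the same numeric column can be given different measurement types, which switches the color encoding from a continuous gradient to a discrete ordinal scheme:

    import altair as alt
    import pandas as pd

    df = pd.DataFrame({"rating": [1, 2, 3, 2, 1], "value": [5, 3, 6, 2, 4]})

    base = alt.Chart(df).mark_bar().encode(x="value:Q", y="count():Q")

    quantitative = base.encode(color=alt.Color("rating", type="quantitative"))
    ordinal = base.encode(color=alt.Color("rating", type="ordinal"))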

__init__(aggregate=Undefined, bandPosition=Undefined, bin=Undefined, condition=Undefined, field=Undefined, legend=Undefined, scale=Undefined, sort=Undefined, timeUnit=Undefined, title=Undefined, type=Undefined, **kwds)

Methods

__init__([aggregate, bandPosition, bin, ...])

copy([deep, ignore])

Return a copy of the object

from_dict(dct[, validate, _wrapper_classes])

Construct class from a dictionary representation

from_json(json_string[, validate])

Instantiate the object from a valid JSON string

resolve_references([schema])

Resolve references in the context of this object's schema or root schema.

to_dict([validate, ignore, context])

Return a dictionary representation of the object

to_json([validate, ignore, context, indent, ...])

Emit the JSON representation for this object as a string.

validate(instance[, schema])

Validate the instance against the class schema in the context of the rootschema.

validate_property(name, value[, schema])

Validate a property against property schema in the context of the rootschema
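
A brief sketch of the serialization helpers listed above. In practice Altair constructs this wrapper for you (for example via a color, fill, or stroke channel), but it can also be built and round-tripped directly; the field name and values here are made up:

    import altair as alt

    Cls = alt.FieldOrDatumDefWithConditionMarkPropFieldDefGradientstringnull

    fd = Cls(field="Horsepower", type="quantitative", aggregate="mean")

    as_dict = fd.to_dict()            # validated dict representation
    as_json = fd.to_json(indent=2)    # JSON string form
    again = Cls.from_dict(as_dict)    # rebuild the wrapper from the dict
    Cls.validate(as_dict)             # raises a ValidationError if the dict is invalid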