SecondsOfTime¶
Overview¶
The SecondsOfTime expression extracts the seconds component from a time-related value. This is a unary expression that operates on a single child expression and returns the seconds portion as an integer value.
Syntax¶
// Catalyst expression construction (internal API)
import org.apache.spark.sql.catalyst.expressions.SecondsOfTime
SecondsOfTime(childExpression)
Arguments¶
| Argument | Type | Description |
|---|---|---|
| child | Expression | The input expression containing time data from which to extract seconds |
Return Type¶
An integer (IntegerType) representing the seconds component, in the range 0-59.
Supported Data Types¶
- Timestamp types
- Time-related string formats
- Date/time structures that contain seconds information
Algorithm¶
- Evaluates the child expression to obtain the input time value
- Extracts the seconds component from the time representation
- Returns the seconds value as an integer
- Handles null propagation from the child expression
- Utilizes the withNewChildrenInternal method for expression tree manipulation (a minimal sketch follows this list)
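To make these steps concrete, here is a minimal sketch of how such a unary expression could be written against Catalyst's UnaryExpression API. The class name SecondsOfTimeSketch and the assumption that the input is physically a Long of microseconds since midnight are hypothetical (this is not the actual SecondsOfTime source); CodegenFallback is mixed in so only the interpreted path needs to be implemented.
import org.apache.spark.sql.catalyst.expressions.{Expression, UnaryExpression}
import org.apache.spark.sql.catalyst.expressions.codegen.CodegenFallback
import org.apache.spark.sql.types.{DataType, IntegerType}

// Hypothetical stand-in for SecondsOfTime; assumes a microseconds-since-midnight Long input.
case class SecondsOfTimeSketch(child: Expression) extends UnaryExpression with CodegenFallback {

  override def dataType: DataType = IntegerType

  // UnaryExpression.eval evaluates the child, returns null for a null child, and
  // otherwise calls nullSafeEval, which extracts the seconds-of-minute component.
  override protected def nullSafeEval(input: Any): Any = {
    val micros = input.asInstanceOf[Long]
    ((micros / 1000000L) % 60L).toInt
  }

  // Hook used during expression tree rewrites to rebuild the node with a replaced child
  // (the single-child counterpart of withNewChildrenInternal).
  override protected def withNewChildInternal(newChild: Expression): SecondsOfTimeSketch =
    copy(child = newChild)
}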
Partitioning Behavior¶
How this expression affects partitioning (a quick sanity check follows this list):
- Preserves partitioning as it's a deterministic transformation
- Does not require shuffle operations
- Can be pushed down in query optimization
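The no-shuffle claim is generic to any deterministic per-row projection, so it can be sanity-checked locally. The snippet below uses the built-in second() function as a stand-in for the expression; the local SparkSession and generated DataFrame are assumptions for illustration only.
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{col, current_timestamp, second}

val spark = SparkSession.builder().master("local[*]").appName("partitioning-check").getOrCreate()

// Eight input partitions; extracting seconds is a narrow projection, so no shuffle is introduced.
val df = spark.range(0L, 1000L, 1L, numPartitions = 8).withColumn("ts", current_timestamp())
val projected = df.select(second(col("ts")))

assert(projected.rdd.getNumPartitions == 8)  // partition count is preserved
projected.explain()                          // physical plan shows a Project but no Exchange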
Edge Cases¶
- Null handling: Returns null if the child expression evaluates to null (demonstrated in the check after this list)
- Invalid time formats: Behavior depends on the underlying time parsing implementation
- Leap seconds: Handles the standard 0-59 seconds range; any special leap-second handling may vary
- Timezone considerations: Seconds extraction is typically timezone-independent
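The null-propagation bullet can be exercised directly with the hypothetical SecondsOfTimeSketch class from the Algorithm section (so this demonstrates the sketch, not the verified behavior of the real class):
import org.apache.spark.sql.catalyst.expressions.Literal
import org.apache.spark.sql.types.LongType

// 14:30:45 encoded under the sketch's assumed representation (microseconds since midnight).
val micros = (14L * 3600 + 30 * 60 + 45) * 1000000L

assert(SecondsOfTimeSketch(Literal(micros, LongType)).eval() == 45)   // seconds component
assert(SecondsOfTimeSketch(Literal(null, LongType)).eval() == null)   // null child propagates to null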
Code Generation¶
This expression likely supports Spark's Tungsten code generation for optimized execution, though the specific implementation would depend on the complete class definition and any codegen-related methods.
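Purely as an illustration of what a codegen-enabled version could look like (again a hypothetical sketch, not the actual source), a variant of the class above would override doGenCode, typically through UnaryExpression's defineCodeGen helper, which splices a Java snippet for the non-null case while the framework generates the null checks:
import org.apache.spark.sql.catalyst.expressions.{Expression, UnaryExpression}
import org.apache.spark.sql.catalyst.expressions.codegen.{CodegenContext, ExprCode}
import org.apache.spark.sql.types.{DataType, IntegerType}

// Hypothetical codegen-enabled variant; same assumed microseconds-since-midnight Long input.
case class SecondsOfTimeCodegenSketch(child: Expression) extends UnaryExpression {

  override def dataType: DataType = IntegerType

  // Interpreted path, kept as a fallback.
  override protected def nullSafeEval(input: Any): Any =
    ((input.asInstanceOf[Long] / 1000000L) % 60L).toInt

  // Codegen path: emit the seconds-of-minute computation as inline Java.
  override protected def doGenCode(ctx: CodegenContext, ev: ExprCode): ExprCode =
    defineCodeGen(ctx, ev, micros => s"(int) (($micros / 1000000L) % 60L)")

  override protected def withNewChildInternal(newChild: Expression): SecondsOfTimeCodegenSketch =
    copy(child = newChild)
}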
Examples¶
-- Example SQL usage (hypothetical)
SELECT SECONDS(current_timestamp()) as current_seconds;
SELECT SECONDS('2023-12-25 14:30:45') as extracted_seconds; -- Returns 45
// Example DataFrame API usage (illustrative; df is assumed to have a timestamp_column)
import org.apache.spark.sql.Column
import org.apache.spark.sql.functions._
import org.apache.spark.sql.catalyst.expressions.SecondsOfTime
// Wrap the Catalyst expression in a Column to use it with the DataFrame API (exact wrapping varies by Spark version)
df.select(new Column(SecondsOfTime(col("timestamp_column").expr)))
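For most applications, the public second() function from org.apache.spark.sql.functions is the more typical route than constructing the Catalyst node directly. The following self-contained snippet (the local SparkSession and sample data are assumed for illustration) extracts 45 from the sample timestamp:
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{col, second, to_timestamp}

val spark = SparkSession.builder().master("local[*]").appName("seconds-example").getOrCreate()
import spark.implicits._

val df = Seq("2023-12-25 14:30:45").toDF("raw")
  .select(to_timestamp(col("raw")).as("timestamp_column"))

df.select(second(col("timestamp_column")).as("seconds")).show()  // single row with value 45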
See Also¶
- MinutesOfTime - Extracts the minutes component
- HoursOfTime - Extracts the hours component
- second() - Built-in SQL function for seconds extraction
- Time extraction expressions in Spark Catalyst