S4 class that represents a SparkDataFrame
SparkDataFrames can be created using functions such as createDataFrame, read.json, and table.
Slots
env: An R environment that stores bookkeeping states of the SparkDataFrame.
sdf: A Java object reference to the backing Scala DataFrame.
See also
createDataFrame, read.json, table
https://spark.apache.org/docs/latest/sparkr.html#sparkr-dataframes
Other SparkDataFrame functions:
agg(),
alias(),
arrange(),
as.data.frame(),
attach(),
broadcast(),
cache(),
checkpoint(),
coalesce(),
collect(),
colnames(),
coltypes(),
createOrReplaceTempView(),
crossJoin(),
cube(),
dapply(),
dapplyCollect(),
describe(),
dim(),
distinct(),
drop(),
dropDuplicates(),
dropna(),
dtypes(),
except(),
exceptAll(),
explain(),
filter(),
first(),
gapply(),
gapplyCollect(),
getNumPartitions(),
group_by(),
head(),
hint(),
histogram(),
insertInto(),
intersect(),
intersectAll(),
isLocal(),
isStreaming(),
join(),
limit(),
localCheckpoint(),
merge(),
mutate(),
ncol(),
nrow(),
persist(),
printSchema(),
randomSplit(),
rbind(),
rename(),
repartition(),
repartitionByRange(),
rollup(),
sample(),
saveAsTable(),
schema(),
select(),
selectExpr(),
show(),
showDF(),
storageLevel(),
str(),
subset(),
summary(),
take(),
toJSON(),
union(),
unionAll(),
unionByName(),
unpersist(),
unpivot(),
with(),
withColumn(),
withWatermark(),
write.df(),
write.jdbc(),
write.json(),
write.orc(),
write.parquet(),
write.stream(),
write.text()
Examples
if (FALSE) {
# Start (or connect to) a Spark session, then convert a local R data.frame
sparkR.session()
df <- createDataFrame(faithful)
head(df)
}
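The other constructors mentioned above can be sketched similarly. The snippet below assumes a running Spark session; the JSON path and the "people" table name are hypothetical placeholders, not files or tables shipped with this package.

```r
if (FALSE) {
sparkR.session()

# Create a SparkDataFrame from a JSON file (path is a placeholder)
people <- read.json("path/to/people.json")
printSchema(people)

# Register a temporary view, then query it back as a SparkDataFrame
createOrReplaceTempView(people, "people")
adults <- sql("SELECT * FROM people WHERE age >= 18")
}
```

Both results are SparkDataFrame objects, so the functions listed above (e.g. collect(), select(), filter()) apply to them directly.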