8 changes: 4 additions & 4 deletions CONTRIBUTING.md
@@ -1,6 +1,6 @@
# Contributing to Spark Lance Connector

The Spark Lance connector codebase is at [lancedb/lance-spark](https://github.com/lancedb/lance-spark).
The Spark Lance connector codebase is at [lancedb/lance-spark](https://github.com/lance-format/lance-spark).

## Build Commands

@@ -52,11 +52,11 @@ make serve-docs

The contents in `lance-spark/docs` are kept here so that contributors can easily edit and preview them.
After code merge, the contents are added to the
[main Lance documentation](https://github.com/lancedb/lance/tree/main/docs)
[main Lance documentation](https://github.com/lance-format/lance/tree/main/docs)
during the Lance doc CI build, and are presented on the Lance website under
[Apache Spark integration](https://lancedb.github.io/lance/integrations/spark).
[Apache Spark integration](https://lance.org/integrations/spark).

The CONTRIBUTING.md document is automatically published as the [Lance Contributing Guide](https://lancedb.github.io/lance/community/contributing/)
The CONTRIBUTING.md document is automatically published as the [Lance Contributing Guide](https://lance.org/community/contributing/)

## Release Process

2 changes: 1 addition & 1 deletion README.md
@@ -4,6 +4,6 @@ The Apache Spark Connector for Lance allows Apache Spark to efficiently read dat
By using the Apache Spark Connector for Lance, you can leverage Apache Spark's powerful data processing, SQL querying,
and machine learning training capabilities on the AI data lake powered by Lance.

For more details, please visit the [documentation website](https://lancedb.github.io/lance/integrations/spark).
For more details, please visit the [documentation website](https://lance.org/integrations/spark).

For development setup and contribution guidelines, please see [CONTRIBUTING.md](CONTRIBUTING.md).
6 changes: 3 additions & 3 deletions docs/mkdocs.yml
@@ -1,10 +1,10 @@
site_name: Spark Lance Connector
site_description: The Apache Spark Connector for Lance allows Apache Spark to efficiently read datasets stored in Lance format.
site_url: https://lancedb.github.io/lance-spark/
site_url: https://lance.org/integrations/spark/
docs_dir: src

repo_name: lancedb/lance-spark
repo_url: https://github.com/lancedb/lance-spark
repo_url: https://github.com/lance-format/lance-spark

theme:
name: material
@@ -57,7 +57,7 @@ plugins:
extra:
social:
- icon: fontawesome/brands/github
link: https://github.com/lancedb/lance-spark
link: https://github.com/lance-format/lance-spark
- icon: fontawesome/brands/discord
link: https://discord.gg/zMM32dvNtd
- icon: fontawesome/brands/twitter
2 changes: 1 addition & 1 deletion docs/src/config.md
@@ -1,6 +1,6 @@
# Configuration

Spark DSV2 catalog integrates with Lance through [Lance Namespace](https://github.com/lancedb/lance-namespace).
Spark DSV2 catalog integrates with Lance through [Lance Namespace](https://github.com/lance-format/lance-namespace).

## Basic Setup

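The catalog wiring this setup refers to can be sketched as a `spark-defaults.conf` fragment. This is an illustrative assumption only: the catalog name `lance`, the implementation class placeholder, and the option keys below are not confirmed settings, and the real keys are documented on the Configuration page itself.

```properties
# Hypothetical sketch: register a Spark DSV2 catalog backed by a Lance
# Namespace. Class name and option keys are assumptions for illustration.
spark.sql.catalog.lance           <fully.qualified.LanceCatalogClass>
spark.sql.catalog.lance.impl      dir
spark.sql.catalog.lance.root      /path/to/lance/warehouse
```

With a fragment like this, tables under the configured namespace would be addressable from Spark SQL as `lance.<schema>.<table>`.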
@@ -158,7 +158,7 @@ public void delete(InternalRow metadata, InternalRow id) throws IOException {
}
// Get the row index which is low 32 bits of row address.
// See
// https://github.com/lancedb/lance/blob/main/rust/lance-core/src/utils/address.rs#L36
// https://github.com/lance-format/lance/blob/main/rust/lance-core/src/utils/address.rs#L36
v.add(RowAddress.rowIndex(id.getLong(0)));
return v;
});
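The "low 32 bits" comment above can be illustrated with a small standalone sketch. This is not the connector's actual `RowAddress` class, just a hedged illustration of the packing the comment describes: a 64-bit row address carries the fragment id in the high 32 bits and the row index in the low 32 bits.

```java
// Illustrative sketch only (assumed layout, not the connector's RowAddress):
// a 64-bit row address = (fragment id << 32) | row index.
public class RowAddressSketch {
    // High 32 bits: which fragment the row lives in.
    public static long fragmentId(long rowAddress) {
        return rowAddress >>> 32;
    }

    // Low 32 bits: the row's index within that fragment.
    public static long rowIndex(long rowAddress) {
        return rowAddress & 0xFFFFFFFFL;
    }

    public static void main(String[] args) {
        long addr = (5L << 32) | 42L; // fragment 5, row 42
        System.out.println(fragmentId(addr)); // 5
        System.out.println(rowIndex(addr));   // 42
    }
}
```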
@@ -82,7 +82,7 @@ public Filter[] pushFilters(Filter[] filters) {
if (!config.isPushDownFilters()) {
return filters;
}
// Remove this code once https://github.com/lancedb/lance/issues/3578 is fixed
// Remove this code once https://github.com/lance-format/lance/issues/3578 is fixed
boolean hasNestedField = false;
for (StructField field : this.schema.fields()) {
if (field.dataType() instanceof ArrayType) {
4 changes: 2 additions & 2 deletions pom.xml
@@ -9,7 +9,7 @@

<name>${project.artifactId}</name>
<description>Lance Spark Connector</description>
<url>https://lancedb.github.io/lance-spark</url>
<url>https://lance.org/integrations/spark</url>

<developers>
<developer>
@@ -35,7 +35,7 @@

<issueManagement>
<system>GitHub</system>
<url>https://github.com/lancedb/lance-spark/issues</url>
<url>https://github.com/lance-format/lance-spark/issues</url>
</issueManagement>

<distributionManagement>