Questions tagged [google-cloud-bigtable]

Google Cloud Bigtable is a fast, fully managed, massively scalable NoSQL database service designed for applications requiring terabytes to petabytes of data. Cloud Bigtable doesn't require you to sacrifice speed, scale, or cost efficiency when your applications grow.

Cloud Bigtable is exposed to applications through a supported extension to the Apache HBase 1.0 Java library.

515 questions
6
votes
4 answers

Bigtable Backups and Redundancy

Google Cloud Bigtable looks fantastic; however, I have some questions about backups and redundancy. Are there any options for backing up data to protect against human error? Clusters currently run in a single zone - are there any ways to mitigate…
5
votes
2 answers

ClassNotFoundException: io.grpc.internal.BaseDnsNameResolverProvider when trying to create a BigtableDataClient from Windows

When spinning up a Spring Boot application in IntelliJ that tries to connect to GCP's Bigtable, we get the following error: java.lang.NoClassDefFoundError: io/grpc/internal/BaseDnsNameResolverProvider This is the stack trace: ...nested exception…
riorio
  • 5,015
  • 5
  • 29
  • 70
5
votes
2 answers

How to handle hourly Bigtable connection closes?

I have Golang services with a persistent Bigtable client. The services make hundreds of read/write operations on Bigtable per second. Every hour after service boot, I see hundreds of errors like this one: Retryable error: rpc error: code =…
bedervet
  • 85
  • 5
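The hourly errors above are typically a long-lived gRPC connection being recycled; the usual remedy is to treat them as retryable and retry with exponential backoff and jitter. A minimal, language-agnostic sketch in Python (pure stdlib; `is_retryable` and the error type are illustrative assumptions, not the Bigtable client's API):

```python
import random
import time

def retry_with_backoff(op, is_retryable, max_attempts=5, base_delay=0.01):
    """Run op(), retrying transient failures with exponential backoff and jitter."""
    for attempt in range(max_attempts):
        try:
            return op()
        except Exception as exc:
            if not is_retryable(exc) or attempt == max_attempts - 1:
                raise
            # Sleep base_delay * 2^attempt, with up to 50% random jitter.
            delay = base_delay * (2 ** attempt)
            time.sleep(delay * (1 + random.random() * 0.5))

# Example: an operation that fails twice with a "transient" error, then succeeds.
calls = {"n": 0}

def flaky_write():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("rpc error: connection reset")
    return "ok"

result = retry_with_backoff(flaky_write, lambda e: isinstance(e, ConnectionError))
```

The official client libraries already retry idempotent operations this way; the sketch is only for wrapping calls the client considers non-retryable.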
5
votes
1 answer

How does one support pagination using the Bigtable Go client?

I store time-series data in Bigtable with a rowKey of userId#timestamp. Given query parameters (userId, startTime, endTime), how can I support pagination, i.e. return 'limit' records starting from 'offset'? Note that the userId#startTime rowKey may…
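With `userId#timestamp` row keys, offset-based pagination does not map well onto Bigtable scans; the common pattern is keyset (cursor) pagination: read `limit` rows from a start key, then resume the next page at the last returned key plus a zero byte. A runnable sketch of just the key logic in Python (the in-memory `rows` dict stands in for a Bigtable row range scan; all names are illustrative):

```python
def page(rows, start_key, end_key, limit):
    """Return up to `limit` (key, value) pairs in [start_key, end_key),
    plus the start key for the next page (None when the scan is exhausted)."""
    hits = sorted(k for k in rows if start_key <= k < end_key)[:limit]
    # Appending a zero byte yields the smallest key strictly greater than
    # the last one returned, so the next scan resumes right after it.
    next_start = hits[-1] + "\x00" if len(hits) == limit else None
    return [(k, rows[k]) for k in hits], next_start

# Time-series rows for one user, keyed userId#timestamp.
rows = {f"user1#{t}": f"v{t}" for t in (100, 200, 300, 400, 500)}

page1, cursor = page(rows, "user1#", "user1#~", limit=2)
page2, _ = page(rows, cursor, "user1#~", limit=2)
```

In the Go client the same idea maps onto `Table.ReadRows` with a `bigtable.NewRange` and a row-limit option, carrying the cursor between calls.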
5
votes
1 answer

Bigtable vs BigQuery use case for time-series data

I am looking to decide between Bigtable and BigQuery for my time-series use case. I have gone through https://cloud.google.com/bigtable/docs/schema-design-time-series This is for storing Omniture data, which contains information like website…
5
votes
2 answers

Influence of column families on Bigtable performance

We are currently investigating the influence of using multiple column families on the performance of our Bigtable queries. We found that splitting the columns into multiple column families does not increase performance. Has anyone had…
krelst
  • 331
  • 2
  • 12
5
votes
1 answer

How to read and write data in Google Cloud Bigtable in PySpark application?

I am using Spark on a Google Cloud Dataproc cluster and I would like to access Bigtable in a PySpark job. Is there a Bigtable connector for Spark, like the Google BigQuery connector? How can we access Bigtable from a PySpark application?
5
votes
2 answers

Google Cloud Bigtable: query partial keys

So if I have the following data in Bigtable: DEL_6878 ..... DEL_6879 ..... BOM_5876 ..... SFO_8686 ..... SFO_8687 ..... How do I query for, say, SFO* records? I read the documentation; I know how to get a single row; something similar to this:…
Amit
  • 1,775
  • 12
  • 20
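A prefix query like `SFO*` is expressed as a scan over the row-key range from the prefix up to the prefix with its last byte incremented; most client libraries expose a prefix-range helper that computes exactly this. A minimal sketch of the range computation (pure Python; the `keys` list just illustrates which rows the range would match):

```python
def prefix_range(prefix: bytes):
    """Compute the half-open row-key range [start, end) covering every key
    that begins with `prefix`, by incrementing its last non-0xFF byte."""
    end = bytearray(prefix)
    while end and end[-1] == 0xFF:
        end.pop()          # 0xFF can't be incremented; drop and carry left
    if end:
        end[-1] += 1
    return prefix, bytes(end)  # an empty end key means "scan to end of table"

start, end = prefix_range(b"SFO_")
keys = [b"BOM_5876", b"DEL_6878", b"SFO_8686", b"SFO_8687"]
matched = [k for k in keys if start <= k < end]
```

This is why prefix scans are cheap in Bigtable: they are a single contiguous range read, not a filter over the whole table.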
5
votes
2 answers

Achieving declared Cloud Bigtable writes QPS

We've set up a Bigtable cluster with 5 nodes, and the GCP console states that it should support 50K QPS @ 6 ms for reads and writes. We are trying to load a large dataset (~800M records) with ~50 fields containing mostly numeric data and a few short…
Leon Stein
  • 63
  • 6
5
votes
2 answers

Does Cloud Bigtable have a local development server like Cloud Datastore?

If not, is there any document that lists best practices for integration-testing an application that uses Cloud Bigtable? Specifically, applications written in Java and Python. Ref: https://cloud.google.com/datastore/docs/tools/devserver
Vinay
  • 80
  • 3
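Cloud Bigtable does ship a local emulator through the gcloud CLI, and the client libraries pick it up via an environment variable. A sketch of the setup, as a config fragment (the default port is an assumption to verify against your gcloud version):

```shell
# Start the in-memory Bigtable emulator (listens on localhost:8086 by default)
gcloud beta emulators bigtable start &

# Point the client libraries (Java, Python, Go, Node.js, ...) at the emulator
export BIGTABLE_EMULATOR_HOST=localhost:8086
```

When `BIGTABLE_EMULATOR_HOST` is set, the official clients connect to the emulator instead of the production service, with no credentials required. Note the emulator holds data in memory only, so it suits integration tests rather than persistent development data.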
4
votes
1 answer

Migration from DynamoDB to Spanner/BigTable

I have a use case where I need to migrate 70 TB of data from DynamoDB to Bigtable and Spanner. Tables with a single index will go to Bigtable; otherwise they will go to Spanner. I can easily handle the historical loads by exporting the data to S3 --> GCS…
4
votes
1 answer

From Google Cloud Bigtable to AWS EMR (HBase)

How do I export an HBase table from Cloud Bigtable to AWS EMR? I know there's an official way to use the Google Cloud Dataflow runner to export HBase tables to Google Cloud Storage. But the exported file is encoded in such a way that I have no way to use it except…
Allen Huang
  • 61
  • 1
  • 5
4
votes
1 answer

Connect to Bigtable emulator from localhost with Node.js client

Trying to connect to the Cloud Bigtable emulator from localhost. I saw a couple of posts on how to connect to the localhost Bigtable emulator with Java. There is no documentation that specifies how to do so with Node.js. The @google-cloud/bigtable client needs…
4
votes
2 answers

Performance tuning tips for Google Cloud Bigtable

I'm using a collection of BT tables to store data that's used for both batch and realtime operations, and I want to optimize performance, especially the latency of random-access reads. And while I do know the underlying BT codebase fairly…
4
votes
2 answers

How to set TTL on Hbase Row and Bigtable Row

I am trying to evaluate whether it is possible to set a TTL on an individual row in HBase or Bigtable. I know that Cassandra allows a TTL at insert time. I want to find out if the same is possible in HBase and in Google Cloud Bigtable. INSERT INTO test (k,v)…
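Bigtable has no per-row TTL set at write time; expiry is configured per column family as a garbage-collection policy (for example a maximum cell age), and expired cells are filtered from reads and removed lazily. A runnable model of max-age garbage collection in Python (pure stdlib; `Cell` and the function names are illustrative, not the client API):

```python
import time
from dataclasses import dataclass

@dataclass
class Cell:
    value: str
    timestamp: float  # cell write time, seconds since epoch

def apply_max_age(cells, max_age_seconds, now=None):
    """Model a max-age GC rule: keep only cells younger than max_age_seconds."""
    now = time.time() if now is None else now
    return [c for c in cells if now - c.timestamp < max_age_seconds]

now = 1_000_000.0
cells = [Cell("old", now - 7200), Cell("fresh", now - 60)]
kept = apply_max_age(cells, max_age_seconds=3600, now=now)
```

Because the policy hangs off the column family rather than the row, a per-row TTL has to be emulated, e.g. by encoding an expiry in the data and filtering at read time.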