This article explains TensorFlow Core—the low-level, composable APIs for developers who need fine-grained control: researchers, framework authors, and power users extending Keras. It outlines use cases (custom layers/losses/optimizers, bespoke training loops, distribution with DTensor, advanced metrics), shows how Core underpins tools like Keras, TF Model Optimization, and TF Graphics, and highlights non-ML scientific workloads (physics sims, ray tracing, constrained optimization). You’ll meet the core building blocks—tf.Tensor/Variable/TensorArray, primitive ops, tf.math/linalg/random, tf.function, tf.GradientTape, DTensor, and tf.saved_model—plus guidance: don’t re-implement high-level APIs just to copy Keras; combine them when it helps. For hands-on learning, jump to the Quickstart for TensorFlow Core and the Build with Core tutorials.

A Detailed Overview of TensorFlow Core APIs

Content Overview

  • Who should use the Core APIs
  • Core API applications
  • Build models and workflows
  • Build frameworks and tools
  • Build for scientific computing
  • Core API components
  • Next steps

The TensorFlow Core APIs provide a set of comprehensive, composable, and extensible low-level APIs for high-performance (distributed and accelerated) computation, primarily aimed at building machine learning (ML) models as well as authoring ML workflow tools and frameworks within the TensorFlow platform. These APIs provide a foundation for creating highly configurable models with fine-grained control and new frameworks from the ground up.

The Core APIs can be used as an alternative to high-level machine learning APIs like Keras. The high-level APIs are best suited for general machine learning needs: they provide modules that abstract away the complexities of ML while still allowing customization through subclassing. If you are looking for an overview of TensorFlow using Keras, see the Quickstarts and Keras sections in the tutorials.

Who should use the Core APIs

The TensorFlow Core low-level APIs are designed with the following ML developers in mind:

  • Researchers building complex models with high levels of configurability
  • Developers interested in using TensorFlow as a high-performance scientific computing platform
  • Framework authors building tools on top of the TensorFlow platform
  • High-level API users interested in:
      • Adding functionality to their machine learning workflows, such as custom layers, losses, models, and optimizers
      • Learning more about the inner workings of their models

Core API applications

The TensorFlow Core APIs provide access to low-level functionality within the TensorFlow ecosystem. Compared to high-level APIs such as Keras, these APIs offer more flexibility and control for building ML models, applications, and tools.

Build models and workflows

The Core APIs are most commonly used to build highly customizable and optimized machine learning models and workflows. Here are some of the ways that the TensorFlow Core APIs can improve your machine learning models and workflow development:

  • Building non-traditional models or layers that do not fully fit the structures supported by high-level APIs
  • Building custom layers, losses, models, and optimizers within Keras
  • Implementing new optimization techniques to expedite convergence during training
  • Creating custom metrics for performance evaluation
  • Designing highly configurable training loops with support for features like batching, cross-validation, and distribution strategies (a minimal training-loop sketch follows this list)
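For instance, a custom training loop can be written directly with tf.GradientTape, a hand-written loss, and explicit gradient application. The sketch below is illustrative only: the synthetic data, model size, and hyperparameters are assumptions rather than anything prescribed by this article.

```python
import tensorflow as tf

# Synthetic regression data; shapes and noise level are illustrative assumptions.
x = tf.random.normal(shape=(256, 8))
true_w = tf.random.normal(shape=(8, 1))
y = x @ true_w + tf.random.normal(shape=(256, 1), stddev=0.1)

# Any object exposing `trainable_variables` works; a tiny Keras model keeps the sketch short.
model = tf.keras.Sequential([tf.keras.Input(shape=(8,)), tf.keras.layers.Dense(1)])
optimizer = tf.keras.optimizers.SGD(learning_rate=0.1)

@tf.function  # trace the step into a graph for better performance
def train_step(x_batch, y_batch):
    with tf.GradientTape() as tape:
        predictions = model(x_batch, training=True)
        # A custom mean-squared-error loss written with Core math ops.
        loss = tf.reduce_mean(tf.square(predictions - y_batch))
    gradients = tape.gradient(loss, model.trainable_variables)
    optimizer.apply_gradients(zip(gradients, model.trainable_variables))
    return loss

dataset = tf.data.Dataset.from_tensor_slices((x, y)).shuffle(256).batch(32)
for epoch in range(5):
    for x_batch, y_batch in dataset:
        loss = train_step(x_batch, y_batch)
    print(f"epoch {epoch}: loss = {loss.numpy():.4f}")
```

Because every step is explicit, swapping in a different loss, gradient transformation, or distribution strategy is a local change to the loop rather than a framework-level customization.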

Build frameworks and tools

The TensorFlow Core APIs can also serve as the building blocks for new high-level frameworks. Here are some examples of tools and frameworks that are built with the low-level APIs (a small illustration of the idea follows the list):

  • Keras: deep learning for humans
  • TensorFlow Model Optimization Toolkit: a suite of tools to optimize ML models for deployment and execution
  • TensorFlow Graphics: a library for making useful graphics functions widely accessible
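To make this concrete, the snippet below builds a tiny dense layer out of nothing but Core primitives (tf.Module, tf.Variable, and tensor ops), which is essentially the kind of composition that higher-level frameworks such as Keras perform internally. The class name and layer sizes are illustrative assumptions.

```python
import tensorflow as tf

# A hypothetical dense layer assembled from Core primitives only.
class DenseFromCore(tf.Module):
    def __init__(self, in_features, out_features, name=None):
        super().__init__(name=name)
        # Trainable state lives in tf.Variable objects.
        self.w = tf.Variable(
            tf.random.normal([in_features, out_features], stddev=0.05), name="w")
        self.b = tf.Variable(tf.zeros([out_features]), name="b")

    @tf.function
    def __call__(self, x):
        # Forward pass expressed with primitive tensor ops.
        return tf.nn.relu(tf.matmul(x, self.w) + self.b)

layer = DenseFromCore(in_features=4, out_features=2)
print(layer(tf.ones([1, 4])))
```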

Build for scientific computing

The TensorFlow Core APIs can also be applied outside the realm of machine learning. Here are a few general-purpose use cases of TensorFlow for scientific computing (a tiny optimization sketch follows the list):

  • Physics simulations for solid mechanics and fluid dynamics problems
  • Graphics rendering applications like ray tracing
  • Solving constrained optimization problems
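As a small taste of non-ML use, the sketch below minimizes the classic Rosenbrock test function with plain gradient descent driven by tf.GradientTape. The test function, learning rate, and step count are arbitrary choices for illustration, not anything taken from this article.

```python
import tensorflow as tf

# Toy unconstrained optimization: minimize the Rosenbrock function
#   f(x, y) = (1 - x)^2 + 100 * (y - x^2)^2, whose minimum is at (1, 1).
v = tf.Variable([-1.0, 1.0])

def rosenbrock(p):
    x, y = p[0], p[1]
    return (1.0 - x) ** 2 + 100.0 * (y - x ** 2) ** 2

@tf.function
def step():
    with tf.GradientTape() as tape:
        loss = rosenbrock(v)
    grad = tape.gradient(loss, v)
    v.assign_sub(0.001 * grad)  # plain gradient-descent update
    return loss

for _ in range(20000):
    loss = step()
print("approximate minimizer:", v.numpy(), "loss:", loss.numpy())
```

The same pattern (a differentiable objective, tf.GradientTape, explicit updates) carries over to larger scientific workloads, with hardware acceleration handled by TensorFlow.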

Core API components

Here are some of the fundamental components that comprise TensorFlow Core’s low-level APIs (a short example combining several of them follows the list). Note that this is not an all-encompassing list:

  • Data structures: tf.Tensor, tf.Variable, tf.TensorArray
  • Primitive APIs: tf.shape, slicing, tf.concat, tf.bitwise
  • Numerical: tf.math, tf.linalg, tf.random
  • Functional components: tf.function, tf.GradientTape
  • Distribution: DTensor
  • Export: tf.saved_model
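The short example below strings several of these components together: tf.Variable state inside a tf.Module, tf.linalg and tf.random ops, a tf.function-traced computation, and export plus reload through tf.saved_model. The module name and export path are hypothetical.

```python
import tensorflow as tf

# A small module exercising several Core components at once.
class AffineTransform(tf.Module):
    def __init__(self):
        super().__init__()
        self.matrix = tf.Variable(tf.eye(3), name="matrix")    # trainable state
        self.offset = tf.Variable(tf.zeros([3]), name="offset")

    @tf.function(input_signature=[tf.TensorSpec(shape=[None, 3], dtype=tf.float32)])
    def __call__(self, points):
        # tf.linalg op inside a traced computation.
        return tf.linalg.matmul(points, self.matrix) + self.offset

module = AffineTransform()
points = tf.random.uniform(shape=(5, 3))
print(module(points))

# Export the traced computation, then reload it without the original Python class.
tf.saved_model.save(module, "/tmp/affine_transform")  # hypothetical path
restored = tf.saved_model.load("/tmp/affine_transform")
print(restored(points))
```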

Next steps

The Build with Core documentation provides tutorials that cover basic machine learning concepts from scratch. The tutorials in this section help you get comfortable with writing low-level code with the Core APIs, which you can then apply to more complex use cases of your own.

:::tip Note: The Core APIs are not meant for simply re-implementing what the high-level APIs already provide, and high-level APIs such as Keras can be used together with the Core APIs.

:::

To get started with the Core APIs and learn more about them, check out the Quickstart for TensorFlow Core.

:::info Originally published on the TensorFlow website, this article appears here under a new headline and is licensed under CC BY 4.0. Code samples shared under the Apache 2.0 License.

:::
