{"id":740,"date":"2026-04-17T05:32:14","date_gmt":"2026-04-16T21:32:14","guid":{"rendered":"https:\/\/connectword.dpdns.org\/?p=740"},"modified":"2026-04-17T05:32:14","modified_gmt":"2026-04-16T21:32:14","slug":"building-transformer-based-nqs-for-frustrated-spin-systems-with-netket","status":"publish","type":"post","link":"https:\/\/connectword.dpdns.org\/?p=740","title":{"rendered":"Building Transformer-Based NQS for Frustrated Spin Systems with NetKet"},"content":{"rendered":"<p>The intersection of <strong>many-body physics<\/strong> and <strong>deep learning<\/strong> has opened a new frontier: <strong>Neural Quantum States (NQS)<\/strong>. While traditional methods struggle with high-dimensional frustrated systems, the global attention mechanism of <strong>Transformers<\/strong> provides a powerful tool for capturing complex quantum correlations.<\/p>\n<p>In this tutorial, we implement a research-grade <strong>Variational Monte Carlo (VMC)<\/strong> pipeline using <strong><a target=\"_blank\" rel=\"noreferrer noopener\" href=\"https:\/\/www.netket.org\/\">NetKet<\/a><\/strong> and <strong>JAX<\/strong> to solve the <strong>frustrated J1\u2013J2 Heisenberg spin chain<\/strong>. We will:<\/p>\n<ul class=\"wp-block-list\">\n<li>Build a custom <strong>Transformer-based NQS<\/strong> architecture.<\/li>\n<li>Optimize the wavefunction using <strong>Stochastic Reconfiguration<\/strong> (natural gradient descent).<\/li>\n<li>Benchmark our results against <strong>exact diagonalization<\/strong> and analyze emergent quantum phases.<\/li>\n<\/ul>\n<p>By the end of this guide, you will have a scalable, physically grounded simulation framework capable of exploring quantum magnetism beyond the reach of classical exact methods.<\/p>\n<div class=\"dm-code-snippet dark dm-normal-version default no-background-mobile\">\n<div class=\"control-language\">\n<div class=\"dm-buttons\">\n<div class=\"dm-buttons-left\">\n<div class=\"dm-button-snippet red-button\"><\/div>\n<div class=\"dm-button-snippet orange-button\"><\/div>\n<div class=\"dm-button-snippet green-button\"><\/div>\n<\/div>\n<div class=\"dm-buttons-right\"><a><span class=\"dm-copy-text\">Copy Code<\/span><span class=\"dm-copy-confirmed\">Copied<\/span><span class=\"dm-error-message\">Use a different Browser<\/span><\/a><\/div>\n<\/div>\n<pre class=\" no-line-numbers\"><code class=\" no-wrap language-php\">!pip -q install --upgrade pip\n!pip -q install \"netket\" \"flax\" \"optax\" \"einops\" \"tqdm\"\n\n\nimport os\nos.environ[\"XLA_PYTHON_CLIENT_PREALLOCATE\"] = \"false\"\n\n\nimport netket as nk\nimport jax\nimport jax.numpy as jnp\nimport numpy as np\nimport matplotlib.pyplot as plt\nfrom flax import linen as nn\nfrom tqdm import tqdm\n\n\njax.config.update(\"jax_enable_x64\", True)\nprint(\"JAX devices:\", jax.devices())\n\n\ndef make_j1j2_chain(L, J2, total_sz=0.0):\n   J1 = 1.0\n   edges = []\n   for i in range(L):\n       edges.append([i, (i+1)%L, 1])\n       edges.append([i, (i+2)%L, 2])\n   g = nk.graph.Graph(edges=edges)\n   hi = nk.hilbert.Spin(s=0.5, N=L, total_sz=total_sz)\n   sigmaz = np.array([[1,0],[0,-1]], dtype=np.float64)\n   mszsz = np.kron(sigmaz, sigmaz)\n   exchange = np.array(\n       [[0,0,0,0],\n        [0,0,2,0],\n        [0,2,0,0],\n        [0,0,0,0]], dtype=np.float64\n   )\n   bond_ops = [\n       (J1*mszsz).tolist(),\n       (J2*mszsz).tolist(),\n       (-J1*exchange).tolist(),\n       (J2*exchange).tolist(),\n   ]\n   bond_colors = [1,2,1,2]\n   H = nk.operator.GraphOperator(hi, g, bond_ops=bond_ops, 
bond_ops_colors=bond_colors)\n   return g, hi, H<\/code><\/pre>\n<\/div>\n<\/div>\n<p>We install all required libraries and configure JAX for stable high-precision computation. We define the J1\u2013J2 frustrated Heisenberg Hamiltonian using a custom colored graph representation. We construct the Hilbert space and the GraphOperator to efficiently simulate interacting spin systems in NetKet.<\/p>\n<div class=\"dm-code-snippet dark dm-normal-version default no-background-mobile\">\n<div class=\"control-language\">\n<div class=\"dm-buttons\">\n<div class=\"dm-buttons-left\">\n<div class=\"dm-button-snippet red-button\"><\/div>\n<div class=\"dm-button-snippet orange-button\"><\/div>\n<div class=\"dm-button-snippet green-button\"><\/div>\n<\/div>\n<div class=\"dm-buttons-right\"><a><span class=\"dm-copy-text\">Copy Code<\/span><span class=\"dm-copy-confirmed\">Copied<\/span><span class=\"dm-error-message\">Use a different Browser<\/span><\/a><\/div>\n<\/div>\n<pre class=\" no-line-numbers\"><code class=\" no-wrap language-php\">class TransformerLogPsi(nn.Module):\n   L: int\n   d_model: int = 96\n   n_heads: int = 4\n   n_layers: int = 6\n   mlp_mult: int = 4\n\n\n   @nn.compact\n   def __call__(self, sigma):\n       x = (sigma &gt; 0).astype(jnp.int32)\n       tok = nn.Embed(num_embeddings=2, features=self.d_model)(x)\n       pos = self.param(\"pos_embedding\",\n                        nn.initializers.normal(0.02),\n                        (1, self.L, self.d_model))\n       h = tok + pos\n       for _ in range(self.n_layers):\n           h_norm = nn.LayerNorm()(h)\n           attn = nn.SelfAttention(\n               num_heads=self.n_heads,\n               qkv_features=self.d_model,\n               out_features=self.d_model,\n           )(h_norm)\n           h = h + attn\n           h2 = nn.LayerNorm()(h)\n           ff = nn.Dense(self.mlp_mult*self.d_model)(h2)\n           ff = nn.gelu(ff)\n           ff = nn.Dense(self.d_model)(ff)\n           h = h + ff\n       h = nn.LayerNorm()(h)\n       pooled = jnp.mean(h, axis=1)\n       out = nn.Dense(2)(pooled)\n       return out[...,0] + 1j*out[...,1]<\/code><\/pre>\n<\/div>\n<\/div>\n<p>We implement a Transformer-based neural quantum state using Flax. We encode spin configurations into embeddings, apply multi-layer self-attention blocks, and aggregate global information through pooling. 
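<\/p>\n<p>Before wiring the ansatz into a variational state, it helps to sanity-check its input and output shapes on their own. The snippet below is a minimal sketch of such a check (the chain length, batch size, and PRNG seeds are illustrative choices of ours, not part of the pipeline above): we draw a small batch of random spin configurations, initialize TransformerLogPsi, and confirm that it returns one complex log-amplitude per configuration.<\/p>\n<div class=\"dm-code-snippet dark dm-normal-version default no-background-mobile\">\n<div class=\"control-language\">\n<div class=\"dm-buttons\">\n<div class=\"dm-buttons-left\">\n<div class=\"dm-button-snippet red-button\"><\/div>\n<div class=\"dm-button-snippet orange-button\"><\/div>\n<div class=\"dm-button-snippet green-button\"><\/div>\n<\/div>\n<div class=\"dm-buttons-right\"><a><span class=\"dm-copy-text\">Copy Code<\/span><span class=\"dm-copy-confirmed\">Copied<\/span><span class=\"dm-error-message\">Use a different Browser<\/span><\/a><\/div>\n<\/div>\n<pre class=\" no-line-numbers\"><code class=\" no-wrap language-python\"># Shape and dtype sanity check for the ansatz defined above.\n# L_test, the batch size, and the PRNG seeds are illustrative choices.\nL_test = 8\nmodel_test = TransformerLogPsi(L=L_test)\n\n\n# Random spin configurations in {-1, +1}, shape (batch, L_test)\nkey_cfg, key_init = jax.random.split(jax.random.PRNGKey(0))\nsigma_test = jax.random.choice(key_cfg, jnp.array([-1.0, 1.0]), shape=(16, L_test))\n\n\n# Initialize parameters and evaluate the log-amplitude\nparams_test = model_test.init(key_init, sigma_test)\nlog_psi = model_test.apply(params_test, sigma_test)\n\n\n# Expect one complex value per configuration, i.e. shape (16,)\nprint(log_psi.shape, log_psi.dtype)<\/code><\/pre>\n<\/div>\n<\/div>\n<p>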
We output a complex log-amplitude, allowing our model to represent highly expressive many-body wavefunctions.<\/p>\n<div class=\"dm-code-snippet dark dm-normal-version default no-background-mobile\">\n<div class=\"control-language\">\n<div class=\"dm-buttons\">\n<div class=\"dm-buttons-left\">\n<div class=\"dm-button-snippet red-button\"><\/div>\n<div class=\"dm-button-snippet orange-button\"><\/div>\n<div class=\"dm-button-snippet green-button\"><\/div>\n<\/div>\n<div class=\"dm-buttons-right\"><a><span class=\"dm-copy-text\">Copy Code<\/span><span class=\"dm-copy-confirmed\">Copied<\/span><span class=\"dm-error-message\">Use a different Browser<\/span><\/a><\/div>\n<\/div>\n<pre class=\" no-line-numbers\"><code class=\" no-wrap language-php\">def structure_factor(vs, L):\n   samples = vs.samples\n   spins = samples.reshape(-1, L)\n   corr = np.zeros(L)\n   for r in range(L):\n       corr[r] = np.mean(spins[:,0] * spins[:,r])\n   q = np.arange(L) * 2*np.pi\/L\n   Sq = np.abs(np.fft.fft(corr))\n   return q, Sq\n\n\ndef exact_energy(L, J2):\n   _, hi, H = make_j1j2_chain(L, J2, total_sz=0.0)\n   return nk.exact.lanczos_ed(H, k=1, compute_eigenvectors=False)[0]\n\n\ndef run_vmc(L, J2, n_iter=250):\n   g, hi, H = make_j1j2_chain(L, J2, total_sz=0.0)\n   model = TransformerLogPsi(L=L)\n   sampler = nk.sampler.MetropolisExchange(\n       hilbert=hi,\n       graph=g,\n       n_chains_per_rank=64\n   )\n   vs = nk.vqs.MCState(\n       sampler,\n       model,\n       n_samples=4096,\n       n_discard_per_chain=128\n   )\n   opt = nk.optimizer.Adam(learning_rate=2e-3)\n   sr = nk.optimizer.SR(diag_shift=1e-2)\n   vmc = nk.driver.VMC(H, opt, variational_state=vs, preconditioner=sr)\n   log = vmc.run(n_iter=n_iter, out=None)\n   energy = np.array(log[\"Energy\"][\"Mean\"])\n   var = np.array(log[\"Energy\"][\"Variance\"])\n   return vs, energy, var<\/code><\/pre>\n<\/div>\n<\/div>\n<p>We define the structure factor observable and the exact diagonalization benchmark for validation. We implement the full VMC training routine using MetropolisExchange sampling and Stochastic Reconfiguration. 
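<\/p>\n<p>Before launching the full optimization, a short smoke test of the wiring can catch sampler or shape mismatches early. The sketch below is our own addition (the chain length, reduced model width, and sample counts are illustrative choices): it builds the Hamiltonian, sampler, and Monte Carlo state for a small chain, then prints the parameter count and an untrained stochastic energy estimate.<\/p>\n<div class=\"dm-code-snippet dark dm-normal-version default no-background-mobile\">\n<div class=\"control-language\">\n<div class=\"dm-buttons\">\n<div class=\"dm-buttons-left\">\n<div class=\"dm-button-snippet red-button\"><\/div>\n<div class=\"dm-button-snippet orange-button\"><\/div>\n<div class=\"dm-button-snippet green-button\"><\/div>\n<\/div>\n<div class=\"dm-buttons-right\"><a><span class=\"dm-copy-text\">Copy Code<\/span><span class=\"dm-copy-confirmed\">Copied<\/span><span class=\"dm-error-message\">Use a different Browser<\/span><\/a><\/div>\n<\/div>\n<pre class=\" no-line-numbers\"><code class=\" no-wrap language-python\"># Smoke test of the pipeline wiring on a tiny chain (illustrative sizes only).\ng_s, hi_s, H_s = make_j1j2_chain(8, 0.4, total_sz=0.0)\nmodel_s = TransformerLogPsi(L=8, d_model=32, n_heads=2, n_layers=2)\n\n\nsampler_s = nk.sampler.MetropolisExchange(hilbert=hi_s, graph=g_s, n_chains_per_rank=16)\nvs_s = nk.vqs.MCState(sampler_s, model_s, n_samples=512, n_discard_per_chain=32)\n\n\nprint(\"Number of variational parameters:\", vs_s.n_parameters)\nprint(\"Untrained energy estimate:\", vs_s.expect(H_s))<\/code><\/pre>\n<\/div>\n<\/div>\n<p>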
We return energy and variance arrays so that we can analyze convergence and physical accuracy.<\/p>\n<div class=\"dm-code-snippet dark dm-normal-version default no-background-mobile\">\n<div class=\"control-language\">\n<div class=\"dm-buttons\">\n<div class=\"dm-buttons-left\">\n<div class=\"dm-button-snippet red-button\"><\/div>\n<div class=\"dm-button-snippet orange-button\"><\/div>\n<div class=\"dm-button-snippet green-button\"><\/div>\n<\/div>\n<div class=\"dm-buttons-right\"><a><span class=\"dm-copy-text\">Copy Code<\/span><span class=\"dm-copy-confirmed\">Copied<\/span><span class=\"dm-error-message\">Use a different Browser<\/span><\/a><\/div>\n<\/div>\n<pre class=\" no-line-numbers\"><code class=\" no-wrap language-php\">L = 24\nJ2_values = np.linspace(0.0, 0.7, 6)\n\n\nenergies = []\nstructure_peaks = []\n\n\nfor J2 in tqdm(J2_values):\n   vs, e, var = run_vmc(L, J2)\n   energies.append(e[-1])\n   q, Sq = structure_factor(vs, L)\n   structure_peaks.append(np.max(Sq))<\/code><\/pre>\n<\/div>\n<\/div>\n<p>We sweep across multiple J2 values to explore the frustrated phase diagram. We train a separate variational state for each coupling strength and record the final energy. 
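<\/p>\n<p>Because run_vmc also returns the variance history, it is worth inspecting it alongside the energies: the energy variance vanishes for an exact eigenstate, so a small final variance per site is a useful convergence indicator. The short summary below is a diagnostic of our own that reuses the arrays produced by the sweep (after the loop, vs, e, and var refer to the last J2 point processed).<\/p>\n<div class=\"dm-code-snippet dark dm-normal-version default no-background-mobile\">\n<div class=\"control-language\">\n<div class=\"dm-buttons\">\n<div class=\"dm-buttons-left\">\n<div class=\"dm-button-snippet red-button\"><\/div>\n<div class=\"dm-button-snippet orange-button\"><\/div>\n<div class=\"dm-button-snippet green-button\"><\/div>\n<\/div>\n<div class=\"dm-buttons-right\"><a><span class=\"dm-copy-text\">Copy Code<\/span><span class=\"dm-copy-confirmed\">Copied<\/span><span class=\"dm-error-message\">Use a different Browser<\/span><\/a><\/div>\n<\/div>\n<pre class=\" no-line-numbers\"><code class=\" no-wrap language-python\"># Plain-text summary of the sweep (our own diagnostic, not part of the sweep itself).\n# The energy of a Hermitian Hamiltonian is real, so we report the real parts.\nprint(\"Final energy per site (last J2):\", np.real(energies[-1]) \/ L)\nprint(\"Final variance per site (last J2):\", np.real(var[-1]) \/ L)\n\n\nfor J2, E, S in zip(J2_values, energies, structure_peaks):\n   print(f\"J2={J2:.2f}  E={np.real(E):.4f}  S_max={S:.3f}\")<\/code><\/pre>\n<\/div>\n<\/div>\n<p>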
We compute the structure factor peak for each point to detect possible ordering transitions.<\/p>\n<div class=\"dm-code-snippet dark dm-normal-version default no-background-mobile\">\n<div class=\"control-language\">\n<div class=\"dm-buttons\">\n<div class=\"dm-buttons-left\">\n<div class=\"dm-button-snippet red-button\"><\/div>\n<div class=\"dm-button-snippet orange-button\"><\/div>\n<div class=\"dm-button-snippet green-button\"><\/div>\n<\/div>\n<div class=\"dm-buttons-right\"><a><span class=\"dm-copy-text\">Copy Code<\/span><span class=\"dm-copy-confirmed\">Copied<\/span><span class=\"dm-error-message\">Use a different Browser<\/span><\/a><\/div>\n<\/div>\n<pre class=\" no-line-numbers\"><code class=\" no-wrap language-php\">L_ed = 14\nJ2_test = 0.5\nE_ed = exact_energy(L_ed, J2_test)\n\n\nvs_small, e_small, _ = run_vmc(L_ed, J2_test, n_iter=200)\nE_vmc = e_small[-1]\n\n\nprint(\"ED Energy (L=14):\", E_ed)\nprint(\"VMC Energy:\", E_vmc)\nprint(\"Abs gap:\", abs(E_vmc - E_ed))\n\n\nplt.figure(figsize=(12,4))\n\n\nplt.subplot(1,3,1)\nplt.plot(e_small)\nplt.title(\"Energy Convergence\")\n\n\nplt.subplot(1,3,2)\nplt.plot(J2_values, energies, 'o-')\nplt.title(\"Energy vs J2\")\n\n\nplt.subplot(1,3,3)\nplt.plot(J2_values, structure_peaks, 'o-')\nplt.title(\"Structure Factor Peak\")\n\n\nplt.tight_layout()\nplt.show()<\/code><\/pre>\n<\/div>\n<\/div>\n<p>We benchmark our model against exact diagonalization on a smaller lattice size. We compute the absolute energy gap between VMC and ED to evaluate accuracy. We visualize convergence behavior, phase-energy trends, and structure-factor responses to summarize the physical insights we obtain.<\/p>\n<p>In conclusion, we integrated advanced neural architectures with quantum Monte Carlo techniques to explore frustrated magnetism beyond the reach of small-system exact methods. We validated our Transformer ansatz against Lanczos diagonalization, analyzed convergence behavior, and extracted physically meaningful observables such as structure factor peaks to detect phase transitions. Also, we established a flexible foundation that we can extend toward higher-dimensional lattices, symmetry-projected states, entanglement diagnostics, and time-dependent quantum simulations.<\/p>\n<hr class=\"wp-block-separator has-alpha-channel-opacity\" \/>\n<p>Check out\u00a0the<strong><a href=\"https:\/\/arxiv.org\/pdf\/2604.06425\" target=\"_blank\" rel=\"noreferrer noopener\">\u00a0<\/a><a href=\"https:\/\/github.com\/Marktechpost\/AI-Agents-Projects-Tutorials\/blob\/main\/Deep%20Learning\/transformer_nqs_netket_j1j2_vmc_marktechpost.py\" target=\"_blank\" rel=\"noreferrer noopener\">Full Implementation Codes here<\/a>.\u00a0<\/strong>Also,\u00a0feel free to follow us on\u00a0<strong><a href=\"https:\/\/x.com\/intent\/follow?screen_name=marktechpost\" target=\"_blank\" rel=\"noreferrer noopener\"><mark>Twitter<\/mark><\/a><\/strong>\u00a0and don\u2019t forget to join our\u00a0<strong><a href=\"https:\/\/www.reddit.com\/r\/machinelearningnews\/\" target=\"_blank\" rel=\"noreferrer noopener\">130k+ ML SubReddit<\/a><\/strong>\u00a0and Subscribe to\u00a0<strong><a href=\"https:\/\/www.aidevsignals.com\/\" target=\"_blank\" rel=\"noreferrer noopener\">our Newsletter<\/a><\/strong>. Wait! 
are you on telegram?\u00a0<strong><a href=\"https:\/\/t.me\/machinelearningresearchnews\" target=\"_blank\" rel=\"noreferrer noopener\">now you can join us on telegram as well.<\/a><\/strong><\/p>\n<p>Need to partner with us for promoting your GitHub Repo OR Hugging Face Page OR Product Release OR Webinar etc.?\u00a0<strong><a href=\"https:\/\/forms.gle\/MTNLpmJtsFA3VRVd9\" target=\"_blank\" rel=\"noreferrer noopener\"><mark>Connect with us<\/mark><\/a><\/strong><\/p>\n<p>The post <a href=\"https:\/\/www.marktechpost.com\/2026\/04\/16\/transformer-nqs-netket-j1j2-guide\/\">Building Transformer-Based NQS for Frustrated Spin Systems with NetKet<\/a> appeared first on <a href=\"https:\/\/www.marktechpost.com\/\">MarkTechPost<\/a>.<\/p>","protected":false},"excerpt":{"rendered":"<p>The intersection of many-body &hellip;<\/p>\n","protected":false},"author":1,"featured_media":29,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[1],"tags":[],"class_list":["post-740","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-uncategorized"],"_links":{"self":[{"href":"https:\/\/connectword.dpdns.org\/index.php?rest_route=\/wp\/v2\/posts\/740","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/connectword.dpdns.org\/index.php?rest_route=\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/connectword.dpdns.org\/index.php?rest_route=\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/connectword.dpdns.org\/index.php?rest_route=\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/connectword.dpdns.org\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=740"}],"version-history":[{"count":0,"href":"https:\/\/connectword.dpdns.org\/index.php?rest_route=\/wp\/v2\/posts\/740\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/connectword.dpdns.org\/index.php?rest_route=\/wp\/v2\/media\/29"}],"wp:attachment":[{"href":"https:\/\/connectword.dpdns.org\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=740"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/connectword.dpdns.org\/index.php?rest_route=%2Fwp%2Fv2%2Fcategories&post=740"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/connectword.dpdns.org\/index.php?rest_route=%2Fwp%2Fv2%2Ftags&post=740"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}