Commit f4cbbea (parent fa82b68)

refactor the docs file! large refactor using pytorch backend

176 files changed: +18780 −16229 lines changed


docs/source/advance.rst

Lines changed: 39 additions & 86 deletions
@@ -12,7 +12,7 @@ the only new line is to set the bond dimension for the new simulator.
 
 .. code-block:: python
 
-    c = tc.MPSCircuit(n)
+    c = tq.MPSCircuit(n)
     c.set_split_rules({"max_singular_values": 50})
 
 The larger bond dimension we set, the better approximation ratio (of course the more computational cost we pay)
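The tradeoff between bond dimension and approximation quality in this hunk is ordinary SVD truncation. As a minimal NumPy sketch of the principle (illustrative only, not the TyxonQ API): keeping ``chi`` singular values, the Frobenius error equals the norm of the discarded singular values, so a larger ``max_singular_values`` monotonically tightens the approximation.

```python
import numpy as np

# A random 16x16 matrix standing in for a bond to be truncated
rng = np.random.default_rng(0)
m = rng.normal(size=(16, 16))

u, s, vh = np.linalg.svd(m, full_matrices=False)

chi = 8  # kept bond dimension, cf. max_singular_values
approx = (u[:, :chi] * s[:chi]) @ vh[:chi, :]

# Eckart-Young: the truncation error is exactly the norm of the discarded spectrum
err = np.linalg.norm(m - approx)
discarded = np.sqrt(np.sum(s[chi:] ** 2))
assert np.isclose(err, discarded)
```

Since singular values are sorted in decreasing order, each increment of ``chi`` removes the largest remaining error term, which is why raising the bond dimension buys accuracy at the cost of computation.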
@@ -31,15 +31,15 @@ The two-qubit gates applied on the circuit can be decomposed via SVD, which may
         "fixed_choice": 1,  # 1 for normal one, 2 for swapped one
     }
 
-    c = tc.Circuit(nwires, split=split_conf)
+    c = tq.Circuit(nwires, split=split_conf)
 
     # or
 
     c.exp1(
         i,
        (i + 1) % nwires,
        theta=paramc[2 * j, i],
-       unitary=tc.gates._zz_matrix,
+       unitary=tq.gates._zz_matrix,
        split=split_conf
    )
 
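Why a small ``max_singular_values`` can be lossless for a gate like the ``exp1`` ZZ rotation above: the operator-Schmidt rank of :math:`\exp(-i\theta Z\otimes Z)` is 2. A NumPy check of this claim (illustration only, independent of TyxonQ):

```python
import numpy as np

theta = 0.3
zz = np.diag([1.0, -1.0, -1.0, 1.0])            # Z tensor Z
u = np.diag(np.exp(-1j * theta * np.diag(zz)))  # exp(-i*theta*ZZ); diagonal, so expm is elementwise

# Realign the 4x4 gate into the operator-Schmidt matrix,
# grouping (row_A, col_A) against (row_B, col_B)
m = u.reshape(2, 2, 2, 2).transpose(0, 2, 1, 3).reshape(4, 4)
s = np.linalg.svd(m, compute_uv=False)
rank = int(np.sum(s > 1e-10))
print(rank)  # 2: two singular values carry the whole gate
```

Since :math:`\exp(-i\theta Z\otimes Z) = \cos\theta\, I\otimes I - i\sin\theta\, Z\otimes Z` and the realignment maps a product :math:`A\otimes B` to a rank-one matrix, the split needs exactly two terms, so setting two kept singular values in the SVD split loses nothing for this gate.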
@@ -49,23 +49,21 @@ Note ``max_singular_values`` must be specified to make the whole procedure static
 Jitted Function Save/Load
 -----------------------------
 
-To reuse the jitted function, we can save it on the disk via support from the TensorFlow `SavedModel <https://www.tensorflow.org/guide/saved_model>`_. That is to say, only jitted quantum function on the TensorFlow backend can be saved on the disk.
+To reuse a jitted function, we can save it to disk via PyTorch's `TorchScript <https://pytorch.org/docs/stable/jit.html>`_.
 
-For the JAX-backend quantum function, one can first transform them into the tf-backend function via JAX experimental support: `jax2tf <https://github.com/google/jax/tree/main/jax/experimental/jax2tf>`_.
-
-We wrap the tf-backend `SavedModel` as very easy-to-use function :py:meth:`tensorcircuit.keras.save_func` and :py:meth:`tensorcircuit.keras.load_func`.
+We provide the easy-to-use functions :py:meth:`tyxonq.torchnn.save_func` and :py:meth:`tyxonq.torchnn.load_func`.
 
 Parameterized Measurements
 -----------------------------
 
-For plain measurements API on a ``tc.Circuit``, eg. `c = tc.Circuit(n=3)`, if we want to evaluate the expectation :math:`<Z_1Z_2>`, we need to call the API as ``c.expectation((tc.gates.z(), [1]), (tc.gates.z(), [2]))``.
+For the plain measurement API on a ``tq.Circuit``, e.g. ``c = tq.Circuit(n=3)``, if we want to evaluate the expectation :math:`<Z_1Z_2>`, we call the API as ``c.expectation((tq.gates.z(), [1]), (tq.gates.z(), [2]))``.
 
-In some cases, we may want to tell the software what to measure but in a tensor fashion. For example, if we want to get the above expectation, we can use the following API: :py:meth:`tensorcircuit.templates.measurements.parameterized_measurements`.
+In some cases, we may want to tell the software what to measure in a tensor fashion. For example, to obtain the above expectation, we can use the following API: :py:meth:`tyxonq.templates.measurements.parameterized_measurements`.
 
 .. code-block:: python
 
-    c = tc.Circuit(3)
-    z1z2 = tc.templates.measurements.parameterized_measurements(c, tc.array_to_tensor([0, 3, 3, 0]), onehot=True)  # 1
+    c = tq.Circuit(3)
+    z1z2 = tq.templates.measurements.parameterized_measurements(c, tq.array_to_tensor([0, 3, 3, 0]), onehot=True)  # 1
 
 This API corresponds to measuring :math:`I_0Z_1Z_2I_3` where 0, 1, 2, 3 stand for the local I, X, Y, and Z operators respectively.
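What such a Pauli-string expectation evaluates can be reproduced by hand with Kronecker products. A small NumPy sketch (the state and operators here are illustrative, not the TyxonQ API):

```python
import numpy as np

I2 = np.eye(2)
Z = np.diag([1.0, -1.0])

# <Z_1 Z_2> on a 3-qubit register: identity on qubit 0, Z on qubits 1 and 2
op = np.kron(I2, np.kron(Z, Z))

psi = np.zeros(8)
psi[0] = 1.0  # |000>: both measured qubits have Z eigenvalue +1
val = psi @ op @ psi
print(val)  # 1.0

psi2 = np.zeros(8)
psi2[2] = 1.0  # |010>: qubit 1 flipped, Z eigenvalues -1 and +1
val2 = psi2 @ op @ psi2
print(val2)  # -1.0
```

The integer encoding in the snippet above maps each site to the local operator chosen from (I, X, Y, Z), which is exactly the Kronecker product constructed here.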

@@ -77,110 +75,65 @@ We support COO format sparse matrix as most backends only support this format, a
 .. code-block:: python
 
     def sparse_test():
-        m = tc.backend.coo_sparse_matrix(indices=np.array([[0, 1],[1, 0]]), values=np.array([1.0, 1.0]), shape=[2, 2])
-        n = tc.backend.convert_to_tensor(np.array([[1.0], [0.0]]))
-        print("is sparse: ", tc.backend.is_sparse(m), tc.backend.is_sparse(n))
-        print("sparse matmul: ", tc.backend.sparse_dense_matmul(m, n))
+        m = tq.backend.coo_sparse_matrix(indices=np.array([[0, 1], [1, 0]]), values=np.array([1.0, 1.0]), shape=[2, 2])
+        n = tq.backend.convert_to_tensor(np.array([[1.0], [0.0]]))
+        print("is sparse: ", tq.backend.is_sparse(m), tq.backend.is_sparse(n))
+        print("sparse matmul: ", tq.backend.sparse_dense_matmul(m, n))
 
-    for K in ["tensorflow", "jax", "numpy"]:
-        with tc.runtime_backend(K):
+    for K in ["pytorch", "numpy"]:
+        with tq.runtime_backend(K):
             print("using backend: ", K)
             sparse_test()
 
 The sparse matrix is specifically useful to evaluate Hamiltonian expectation on the circuit, where sparse matrix representation has a good tradeoff between space and time.
-Please refer to :py:meth:`tensorcircuit.templates.measurements.sparse_expectation` for more detail.
+Please refer to :py:meth:`tyxonq.templates.measurements.sparse_expectation` for more detail.
 
-For different representations to evaluate Hamiltonian expectation in tensorcircuit, please refer to :doc:`tutorials/tfim_vqe_diffreph`.
+For different representations to evaluate Hamiltonian expectation in tyxonq, please refer to :doc:`tutorials/tfim_vqe_diffreph`.
 
-Randoms, Jit, Backend Agnostic, and Their Interplay
+Randomness, Jit, and Their Interplay
 --------------------------------------------------------
 
-.. code-block:: python
-
-    import tensorcircuit as tc
-    K = tc.set_backend("tensorflow")
-    K.set_random_state(42)
-
-    @K.jit
-    def r():
-        return K.implicit_randn()
-
-    print(r(), r())  # different, correct
-
-.. code-block:: python
-
-    import tensorcircuit as tc
-    K = tc.set_backend("jax")
-    K.set_random_state(42)
-
-    @K.jit
-    def r():
-        return K.implicit_randn()
-
-    print(r(), r())  # the same, wrong
-
+The interplay between randomness and JIT compilation requires careful handling, especially when aiming for reproducibility. PyTorch uses a stateful pseudo-random number generator (PRNG). To ensure reproducibility in a JIT-compiled function, the random state must be managed explicitly.
 
 .. code-block:: python
 
-    import tensorcircuit as tc
-    import jax
-    K = tc.set_backend("jax")
-    key = K.set_random_state(42)
+    import tyxonq as tq
+    import torch
+    K = tq.set_backend("pytorch")
 
     @K.jit
-    def r(key):
-        K.set_random_state(key)
-        return K.implicit_randn()
-
-    key1, key2 = K.random_split(key)
+    def r(generator):
+        return torch.randn(1, generator=generator)
 
-    print(r(key1), r(key2))  # different, correct
+    g1 = torch.Generator().manual_seed(42)
+    g2 = torch.Generator().manual_seed(42)
+    print(r(g1), r(g1))  # same, correct
+    print(r(g2))  # same as first call, correct
 
-Therefore, a unified jittable random infrastructure with backend agnostic can be formulated as
+To get different random numbers, you must use different generator states.
 
 .. code-block:: python
 
-    import tensorcircuit as tc
-    import jax
-    K = tc.set_backend("tensorflow")
+    g = torch.Generator().manual_seed(42)
+    print(r(g), r(g))  # two calls with the same generator will produce the same result if the function is jitted
+
+    g1 = torch.Generator().manual_seed(42)
+    g2 = torch.Generator().manual_seed(43)
+    print(r(g1), r(g2))  # different, correct
 
-    def ba_key(key):
-        if tc.backend.name == "tensorflow":
-            return None
-        if tc.backend.name == "jax":
-            return jax.random.PRNGKey(key)
-        raise ValueError("unsupported backend %s" % tc.backend.name)
-
-    @K.jit
-    def r(key=None):
-        if key is not None:
-            K.set_random_state(key)
-        return K.implicit_randn()
-
-    key = ba_key(42)
-
-    key1, key2 = K.random_split(key)
-
-    print(r(key1), r(key2))
-
-And a more neat approach to achieve this is as follows:
+TyxonQ's backend provides helper functions to manage this. ``K.get_random_state`` will return a ``torch.Generator`` instance, and ``K.random_split`` can be used to create new independent generator objects.
 
 .. code-block:: python
 
     key = K.get_random_state(42)
 
     @K.jit
     def r(key):
-        K.set_random_state(key)
-        return K.implicit_randn()
+        # We don't need K.set_random_state inside, as we pass the generator
+        return K.implicit_randn(generator=key)
 
     key1, key2 = K.random_split(key)
 
     print(r(key1), r(key2))
 
-It is worth noting that since ``Circuit.unitary_kraus`` and ``Circuit.general_kraus`` call ``implicit_rand*`` API, the correct usage of these APIs is the same as above.
-
-One may wonder why random numbers are dealt in such a complicated way, please refer to the `Jax design note <https://github.com/google/jax/blob/main/docs/design_notes/prng.md>`_ for some hints.
-
-If vmap is also involved apart from jit, I currently find no way to maintain the backend agnosticity as TensorFlow seems to have no support of vmap over random keys (ping me on GitHub if you think you have a way to do this). I strongly recommend the users using Jax backend in the vmap+random setup.
+This paradigm is crucial when using stochastic elements in your circuits, such as ``Circuit.unitary_kraus`` and ``Circuit.general_kraus``, inside a JIT-compiled function.
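The explicit-generator discipline described in this hunk can be exercised with plain PyTorch, outside any TyxonQ machinery (assuming only ``torch`` is installed): two generators seeded identically yield identical streams, while reusing one generator advances its state.

```python
import torch

# Same seed, independent generator objects: reproducible streams
g1 = torch.Generator().manual_seed(42)
g2 = torch.Generator().manual_seed(42)

a = torch.randn(3, generator=g1)
b = torch.randn(3, generator=g2)
assert torch.equal(a, b)  # identical: same seed, same position in the stream

# Reusing g1 continues its stream, so the draw differs from the first one
c = torch.randn(3, generator=g1)
assert not torch.equal(a, c)
```

This is the behavior the ``K.random_split`` helper relies on: splitting hands each call site an independent, explicitly-seeded generator instead of sharing the global stateful PRNG.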

docs/source/api/about.rst

Lines changed: 2 additions & 2 deletions
@@ -1,6 +1,6 @@
-tensorcircuit.about
+tyxonq.about
 ================================================================================
-.. automodule:: tensorcircuit.about
+.. automodule:: tyxonq.about
    :members:
    :undoc-members:
    :show-inheritance:

docs/source/api/abstractcircuit.rst

Lines changed: 2 additions & 2 deletions
@@ -1,6 +1,6 @@
-tensorcircuit.abstractcircuit
+tyxonq.abstractcircuit
 ================================================================================
-.. automodule:: tensorcircuit.abstractcircuit
+.. automodule:: tyxonq.abstractcircuit
    :members:
    :undoc-members:
    :show-inheritance:

docs/source/api/applications.rst

Lines changed: 1 addition & 1 deletion
@@ -1,4 +1,4 @@
-tensorcircuit.applications
+tyxonq.applications
 ================================================================================
 .. toctree::
     applications/ai.rst
Lines changed: 1 addition & 1 deletion
@@ -1,4 +1,4 @@
-tensorcircuit.applications.ai
+tyxonq.applications.ai
 ================================================================================
 .. toctree::
     ai/ensemble.rst

docs/source/api/applications/ai/ensemble.rst

Lines changed: 2 additions & 2 deletions
@@ -1,6 +1,6 @@
-tensorcircuit.applications.ai.ensemble
+tyxonq.applications.ai.ensemble
 ================================================================================
-.. automodule:: tensorcircuit.applications.ai.ensemble
+.. automodule:: tyxonq.applications.ai.ensemble
    :members:
    :undoc-members:
    :show-inheritance:

docs/source/api/applications/dqas.rst

Lines changed: 2 additions & 2 deletions
@@ -1,6 +1,6 @@
-tensorcircuit.applications.dqas
+tyxonq.applications.dqas
 ================================================================================
-.. automodule:: tensorcircuit.applications.dqas
+.. automodule:: tyxonq.applications.dqas
    :members:
    :undoc-members:
    :show-inheritance:
Lines changed: 1 addition & 1 deletion
@@ -1,4 +1,4 @@
-tensorcircuit.applications.finance
+tyxonq.applications.finance
 ================================================================================
 .. toctree::
     finance/portfolio.rst

docs/source/api/applications/finance/portfolio.rst

Lines changed: 2 additions & 2 deletions
@@ -1,6 +1,6 @@
-tensorcircuit.applications.finance.portfolio
+tyxonq.applications.finance.portfolio
 ================================================================================
-.. automodule:: tensorcircuit.applications.finance.portfolio
+.. automodule:: tyxonq.applications.finance.portfolio
    :members:
    :undoc-members:
    :show-inheritance:

docs/source/api/applications/graphdata.rst

Lines changed: 2 additions & 2 deletions
@@ -1,6 +1,6 @@
-tensorcircuit.applications.graphdata
+tyxonq.applications.graphdata
 ================================================================================
-.. automodule:: tensorcircuit.applications.graphdata
+.. automodule:: tyxonq.applications.graphdata
    :members:
    :undoc-members:
    :show-inheritance:
