Heart Disease Prediction with an RNN

This article is an internal article of the 🔗365-Day Deep Learning Training Camp.

Original author: K同学啊

一 Preliminary Setup

1. Importing the data

import pandas as pd
from keras.optimizers import Adam
from matplotlib import pyplot as plt
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from keras.models import Sequential
from keras.layers import Dense,SimpleRNN
import warnings
warnings.filterwarnings('ignore')

df = pd.read_csv('heart.csv')

2. Checking the data

Check whether any values are missing:

print(df.shape)
print(df.isnull().sum())
(303, 14)
age         0
sex         0
cp          0
trestbps    0
chol        0
fbs         0
restecg     0
thalach     0
exang       0
oldpeak     0
slope       0
ca          0
thal        0
target      0
dtype: int64

二 Data Preprocessing

1. Splitting into training and test sets

X = df.iloc[:,:-1]
y = df.iloc[:,-1]

X_train,X_test,y_train,y_test = train_test_split(X,y,test_size=0.1,random_state=14)

2. Standardizing the data

sc = StandardScaler()
X_train = sc.fit_transform(X_train)
X_test = sc.transform(X_test)   # apply the training-set statistics; do not refit on the test set
# Reshape to (samples, timesteps, 1) so the RNN can read each feature as one step
X_train = X_train.reshape(X_train.shape[0], X_train.shape[1], 1)
X_test = X_test.reshape(X_test.shape[0], X_test.shape[1], 1)
After reshaping, X_train has shape (272, 13, 1): 272 samples, each a 13-step sequence carrying a single value per step. (The printed array output is truncated here.)
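The fit-on-train / transform-on-test distinction above matters: the test set must be scaled with statistics learned from the training set only, otherwise information leaks from test to train. A minimal NumPy sketch of what StandardScaler does under the hood, using made-up numbers rather than the heart dataset:

```python
import numpy as np

# Toy stand-in data: 3 training samples, 1 test sample, 2 features each
train = np.array([[1.0, 10.0], [2.0, 20.0], [3.0, 30.0]])
test = np.array([[2.0, 25.0]])

# "fit": learn mean and std from the training set only
mu, sigma = train.mean(axis=0), train.std(axis=0)

# "transform": apply the SAME statistics to both sets
train_scaled = (train - mu) / sigma
test_scaled = (test - mu) / sigma

print(test_scaled)  # scaled relative to the training distribution
```

If `sc.fit_transform` were called on the test set instead, the test data would be centered on its own mean, so identical raw values in train and test would map to different scaled values.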

三 Building the RNN Model

model = Sequential()
model.add(SimpleRNN(200,input_shape=(X_train.shape[1],1),activation='relu'))
model.add(Dense(100,activation='relu'))
model.add(Dense(1,activation='sigmoid'))
model.summary()
Model: "sequential"
_________________________________________________________________
 Layer (type)                Output Shape              Param #   
=================================================================
 simple_rnn (SimpleRNN)      (None, 200)               40400     
                                                                 
 dense (Dense)               (None, 100)               20100     
                                                                 
 dense_1 (Dense)             (None, 1)                 101       
                                                                 
=================================================================
Total params: 60,601
Trainable params: 60,601
Non-trainable params: 0
_________________________________________________________________
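The parameter counts in the summary can be verified by hand: a SimpleRNN layer has units × (units + input_dim + 1) parameters (recurrent kernel, input kernel, bias), and a Dense layer has in × out + out:

```python
units, input_dim = 200, 1

# SimpleRNN: recurrent kernel (units*units) + input kernel (input_dim*units) + bias (units)
rnn_params = units * (units + input_dim + 1)

# Dense layers: weight matrix plus bias vector
dense1_params = 200 * 100 + 100
dense2_params = 100 * 1 + 1

total = rnn_params + dense1_params + dense2_params
print(rnn_params, dense1_params, dense2_params, total)  # 40400 20100 101 60601
```

These match the model.summary() output above, which is a quick sanity check that the input shape reached the layers as intended.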

四 Compiling the Model

optimizer = Adam(learning_rate=1e-4)
# Binary cross-entropy is the appropriate loss for this two-class task;
# use the optimizer defined above and track accuracy during training
model.compile(loss='binary_crossentropy', optimizer=optimizer, metrics=['accuracy'])
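For intuition, binary cross-entropy averages −[y·log(p) + (1−y)·log(1−p)] over the samples. A small NumPy sketch with hypothetical labels and sigmoid outputs (not values from this model):

```python
import numpy as np

def binary_crossentropy(y_true, y_pred, eps=1e-7):
    # Clip predictions away from 0 and 1 to avoid log(0), as Keras does internally
    p = np.clip(y_pred, eps, 1 - eps)
    return -np.mean(y_true * np.log(p) + (1 - y_true) * np.log(1 - p))

# Hypothetical labels and sigmoid outputs
y_true = np.array([1.0, 0.0, 1.0, 0.0])
y_pred = np.array([0.9, 0.1, 0.8, 0.3])
print(binary_crossentropy(y_true, y_pred))
```

Confident predictions on the correct side drive the loss toward 0; confident predictions on the wrong side are penalized heavily, which is why this loss pairs naturally with a sigmoid output.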

五 Training the Model

epochs = 100
history = model.fit(x=X_train, y=y_train, validation_data=(X_test, y_test),
                    verbose=1, epochs=epochs, batch_size=128)

acc = history.history['accuracy']
val_acc = history.history['val_accuracy']
loss = history.history['loss']
val_loss = history.history['val_loss']
Epoch 1/100
3/3 [==============================] - 1s 130ms/step - loss: 0.6872 - accuracy: 0.5551 - val_loss: 0.6884 - val_accuracy: 0.5806
Epoch 2/100
3/3 [==============================] - 0s 19ms/step - loss: 0.6763 - accuracy: 0.6250 - val_loss: 0.6848 - val_accuracy: 0.6129
... (epochs 3–98 omitted; training accuracy climbs steadily to ~0.91 while validation accuracy plateaus around 0.74) ...
Epoch 99/100
3/3 [==============================] - 0s 17ms/step - loss: 0.2446 - accuracy: 0.9118 - val_loss: 0.5425 - val_accuracy: 0.7419
Epoch 100/100
3/3 [==============================] - 0s 19ms/step - loss: 0.2434 - accuracy: 0.9118 - val_loss: 0.5213 - val_accuracy: 0.7742
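The val_accuracy figures in the log come from thresholding the sigmoid output at 0.5 and comparing against the labels. A sketch of that step with hypothetical probabilities (in the real script these would come from model.predict(X_test)):

```python
import numpy as np

# Hypothetical sigmoid outputs for 6 test samples (stand-ins for model.predict(X_test))
probs = np.array([0.91, 0.12, 0.55, 0.48, 0.77, 0.30])
y_true = np.array([1, 0, 1, 1, 1, 0])

y_pred = (probs > 0.5).astype(int)      # threshold at 0.5
accuracy = (y_pred == y_true).mean()    # 5 of 6 correct here
print(y_pred, accuracy)
```

With 31 test samples, each misclassification moves val_accuracy by about 3.2 percentage points, which explains the step-like jumps between values such as 0.7419 and 0.7742.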

六 Visualizing the Results

epochs_range = range(100)
plt.figure(figsize=(14,4))
plt.subplot(1,2,1)
plt.plot(epochs_range,acc,label='training accuracy')
plt.plot(epochs_range,val_acc,label='validation accuracy')
plt.legend(loc='lower right')
plt.title('training and validation accuracy')

plt.subplot(1,2,2)
plt.plot(epochs_range,loss,label='training loss')
plt.plot(epochs_range,val_loss,label='validation loss')
plt.legend(loc='upper right')
plt.title('training and validation loss')
plt.show()

Summary:

1. Model input requirements
RNNs (and LSTMs) expect three-dimensional input of shape (samples, timesteps, features). This lets the model treat each sample as a sequence and learn dependencies across steps.
2. Original data shape
After standardization, X_train and X_test are two-dimensional: (samples, features). For example, an X_train with 100 samples and 10 features has shape (100, 10).
3. Purpose of the reshape
X_train.reshape(X_train.shape[0], X_train.shape[1], 1) changes the shape to (samples, features, 1), where the trailing 1 means each step carries a single feature. A (100, 10) array becomes (100, 10, 1): 100 samples, each treated as a 10-step sequence.
4. Fitting the model's structure
With this shape, the SimpleRNN layer can consume the data and scan across the features as if they were time steps.
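The reshape described above can be sketched directly, checking only shapes with random stand-in data:

```python
import numpy as np

X = np.random.rand(100, 10)                   # (samples, features) after standardization
X_3d = X.reshape(X.shape[0], X.shape[1], 1)   # (samples, timesteps, features_per_step)

print(X.shape, X_3d.shape)  # (100, 10) (100, 10, 1)
```

No values change during the reshape; only the axis layout does, so X_3d[:, :, 0] is identical to the original X.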
