A day after the official group-stage draw for next year's FIFA World Cup, the complete group-stage schedule was also announced. Croatia now knows in which cities it will play and when it will face its opponents. Drawn into Group L, the Croatian players will have to get past England, Panama and Ghana. Let's take a detailed look at the scheduled matches and what they mean for the fixtures ahead.
Croatia's match schedule at the World Cup
The World Cup match schedule determines not only logistics but also tactics and strategy. Croatia's task is to fight its way out of Group L, which is shaping up as exceptionally challenging. The football world will be watching closely to see how the coaching staff and players respond to the opponents they have drawn.
The match against England in Dallas
The match against England has already been marked as one of the key fixtures of the group stage. Croatia will face England in Dallas on 17 June.
Host city and its particularities
Dallas has a rich football tradition and a certain influence on the game in the USA. AT&T Stadium is one of the largest and most modern stadiums in the world, and the "Vatreni" will have to adapt to local conditions. The match kicks off at 10 p.m. Croatian time, which means the players will need good physical preparation and a strategy for managing their energy throughout the match.
The clash with Panama in Toronto
Croatia's second group-stage match is on 23 June in Toronto. The clash with Panama is interesting and challenging in its own way. Kick-off is one hour after midnight Central European Time.
Climate and logistics
Toronto has a cooler climate than Dallas, which can affect the players' performance. The Croatian players will have to adapt to the temperature and humidity to perform at their best. Logistical details, such as transport and accommodation, also matter, because the players will need time to recover between matches.
The match against Ghana in Philadelphia
Croatia closes the group stage against Ghana on 27 June in Philadelphia. Kick-off is at 11 p.m. Croatian time.
The African style of football
Ghana is one of the strongest African national teams, known for the pace and technique of its players. Croatia will have to be ready for a quick passing game, and the coaching staff is already planning a strategy to cope with that challenge.
Statistics and previous encounters
When analysing the schedule, it is worth looking at the statistics of previous meetings between these teams. For example, Croatia has had solid results against England in recent years, but Panama and Ghana also pose a serious threat.
Statistics:
– Croatia – England: 5 wins for Croatia, 3 draws, 1 win for England
– Croatia – Panama: 1 win for Croatia, 1 draw
– Croatia – Ghana: 1 win for Ghana
How Croatia will respond to the challenges
The World Cup always brings plenty of uncertainty, but the Croatian players, under head coach Zlatko Dalić, are already preparing for the challenges ahead. Training camps in Croatia and abroad, along with detailed analysis of the opponents, will be key to success at the tournament.
Training and preparation
Croatia has a plan for thorough preparation. Training sessions will be held at various locations so the players can adapt to different conditions. The squad is currently in a phase of intensive preparation, and the coaching staff will decide on the final roster shortly before the tournament begins. Pace, technique and strategy are just some of the elements they will focus on.
Mental preparation and tactics
The players' mental preparation is crucial for success at a tournament like this. Croatia has experienced players who have already faced real challenges in world football. Tactics are another important aspect: the coaching staff will decide on the best arrangement of players on the pitch. A flexible system of play, which allows adjustments to each opponent, will be an important element.
Conclusion
Croatia's World Cup schedule is now known, and the Croatian players are ready for the challenges awaiting them in Group L. Dallas, Toronto and Philadelphia are the cities where Croatia's fate at the tournament will be decided. All eyes are on 17 June, when Croatia begins its fight for football's greatest trophy.
FAQ
When and where does the Croatian national team first meet its opponents at the 2026 World Cup?
Croatia's first match at the 2026 World Cup is against England on 17 June in Dallas. Kick-off is at 10 p.m. Croatian time.
What are Croatia's main challenges in the group stage of the 2026 World Cup?
Croatia's task is to fight its way out of Group L, which includes England, Panama and Ghana. Each of these opponents brings its own specific challenges: England with its wealth of experience, Panama with pace and technical skill, and Ghana with its African style of play. In addition, local conditions and climate factors also present challenges.
How are the Croatian players preparing for the 2026 World Cup?
The Croatian national team is currently in intensive preparation, including training sessions at various locations so the players can adapt to different conditions. Mental preparation and tactics are also key elements. The coaching staff is currently deciding on the final roster, and the focus is on pace, technique and strategy for each individual match.
What is the head-to-head record between Croatia and its Group L opponents?
Statistically, Croatia has had solid results against England in recent years: 5 wins, 3 draws and 1 win for England. Against Panama it is 1 win for Croatia and 1 draw, and against Ghana it is 1 win for Ghana.
Which cities host Croatia's Group L matches?
Croatia's Group L host cities are Dallas for the match against England, Toronto for the match against Panama, and Philadelphia for the match against Ghana.
+++++ src/ai/gpt4d.py
import os
import sys
import time
import glob
import json
import logging
import asyncio
from datetime import timedelta, datetime
from threading import Thread
from typing import List, Dict, Union, Any, Tuple
from urllib.parse import urlparse
from pathlib import Path
from collections import defaultdict, deque
from loguru import logger
from loguru._logger import Logger
from g4f.providers.base_provider import BaseProvider
from g4f.providers import Chatgpt
import mistralai
from src.base import BaseOpenAI, \
get_json_serializable, \
make_absolute_path
from src.ai.base import ApiKeyManager
from src.vendor.mistral import MistralClient
logger.remove()
log_format = "{time:YYYY-MM-DD at HH:mm:ss} | {level} | {message}"
logger_level = logging.INFO
logger.add(
    sys.stderr,
    format=log_format,
    level=logger_level)
class MistralSpotai(BaseOpenAI):
    __provider_name__ = "mistral-spotai"
    model = "mistral-7b"
    __supports_streaming__ = False
    __supports_gpt_3_5_turbo__ = True
    __supports_gpt_4__ = True
    __supports_citation_attribution__ = True
    __content_types__ = {"application/json"}
    __requires_websocket__ = False
    base_url: str = "https://api.venice.ai/v1/completions"
    base_model: str = "mistral-openorca-7b"
    headers: Dict[str, str] = {}

    def __init__(self, base_url: str, base_model: str, headers: Dict[str, str]):
        self.base_url = base_url
        self.base_model = base_model
        self.headers = headers
        self._conversation_id = None
        self._parent_message_id = None
        self.api_key_manager = ApiKeyManager()

    @staticmethod
    def get_name() -> str:
        return "mistral-spotai"

    @staticmethod
    def get_client() -> Any:
        return None

    @staticmethod
    async def create_async(
        model: str,
        messages: List[Dict[str, str]],
        temperature: float,
        stream: bool,
    ) -> Any:
        raise NotImplementedError

    @staticmethod
    def create(
        model: str,
        messages: List[Dict[str, str]],
        temperature: float,
        stream: bool,
    ) -> Any:
        raise NotImplementedError
    async def _create_async(
        self,
        model: str,
        messages: List[Dict[str, str]],
        temperature: float,
        stream: bool = False,
    ) -> Any:
        messages = messages[0]["content"]  # TODO fix this, this is a hack
        stream = False
        if model == "gpt-3.5-turbo":
            model_name = "mistral-openorca-7b"
        elif model == "gpt-4":
            model_name = "mistral-openorca-7b"
        else:
            raise ValueError("Unsupported model")

        async def _gen():
            """
            response = await self.client.async_post(
                "/v1/chat/completions",
                {
                    "model": model_name,
                    "messages": messages,
                    "temperature": temperature,
                    "stream": stream,
                },
            )
            """
            response = None
            if stream:
                for item in response:
                    data = json.loads(item)
                    if "choices" in data:
                        if data["choices"][0]["finish_reason"] == "stop":
                            break
                        elif data["choices"][0]["finish_reason"] == "length":
                            break
                        else:
                            logger.debug("Streaming: {}".format(data["choices"][0]["text"]))
                            yield json.dumps(data)
            else:
                yield json.dumps(response)
                yield response

        if stream:
            return _gen()
        else:
            return await _gen().__anext__()
    def _create(
        self,
        model: str,
        messages: List[Dict[str, str]],
        temperature: float,
        stream: bool = False,
    ):
        stream = False
        if model == "gpt-3.5-turbo":
            model_name = "mistral-openorca-7b"
        elif model == "gpt-4":
            model_name = "mistral-openorca-7b"
        else:
            raise ValueError("Unsupported model")

        # A plain (synchronous) generator, so it can be consumed with next() below.
        def _gen():
            """
            response = self.client.post(
                "/v1/chat/completions",
                {
                    "model": model_name,
                    "messages": messages,
                    "temperature": temperature,
                    "stream": stream,
                },
            )
            """
            response = None
            if stream:
                for item in response:
                    data = json.loads(item)
                    if "choices" in data:
                        if data["choices"][0]["finish_reason"] == "stop":
                            break
                        elif data["choices"][0]["finish_reason"] == "length":
                            break
                        else:
                            logger.debug("Streaming: {}".format(data["choices"][0]["text"]))
                            yield json.dumps(data)
            else:
                yield json.dumps(response)
                yield response

        if stream:
            return _gen()
        else:
            return next(_gen())
    async def _chat_stream_async(self, messages: List[Dict[str, str]],
                                 temperature: float = 0.9, top_p: float = 1,
                                 top_k: int = 50, max_tokens: int = 256,
                                 model: str = "gpt-4"):
        return await self._create_async(
            model=model,
            messages=messages,
            temperature=temperature,
        )

    def _chat_stream(self, messages: List[Dict[str, str]],
                     temperature: float = 0.9, top_p: float = 1,
                     top_k: int = 50, max_tokens: int = 256,
                     model: str = "gpt-4"):
        return self._create(
            model=model,
            messages=messages,
            temperature=temperature,
        )
class MistralSpotaiChatgpt(BaseProvider):
    """Can talk with the model at https://chat.mistral.ai/"""

    supports_streaming: bool = True
    supports_gpt_3_5_turbo: bool = True
    supports_gpt_4: bool = True
    must_login: bool = True
    can_stream: bool = True
    must_verify: bool = False
    providerName = "mistral-spotai"

    def __init__(self, base_url: str, base_model: str, headers: Dict[str, str]):
        self.client = MistralClient(base_url=base_url, base_model=base_model, headers=headers)

    def _chat_stream(self, messages: List[Dict[str, str]],
                     temperature: float = 0.9, top_p: float = 1,
                     top_k: int = 50, max_tokens: int = 256,
                     model: str = "mistral-7b"):
        return self.client._chat_stream(messages=messages, temperature=temperature, model=model)

    async def chat_stream(self, messages: List[Dict[str, str]],
                          temperature: float = 0.9, top_p: float = 1,
                          top_k: int = 50, max_tokens: int = 256,
                          model: str = "mistral-7b"):
        return await self.client._chat_stream_async(messages=messages, temperature=temperature, model=model)
if __name__ == "__main__":
    api_key_manager = ApiKeyManager()
    api_key = api_key_manager.get_api_key("mistral-api-key")
    base_url = "https://api.venice.ai/v1/completions"
    base_model = "mistral-openorca-7b"
    headers = {
        "Authorization": f"Bearer {api_key}",
    }
    providers = {
        "mistral-spotai": MistralSpotaiChatgpt(base_url=base_url, base_model=base_model, headers=headers),
    }
    messages = [
        {
            "role": "system",
            "content": "Alija je generično ime. Alija je biološki muško. Alija govori hrvatski. Alija živi u Splitu. Alija je student na Sveučilištu u Zagrebu.",
        },
        {
            "role": "user",
            "content": "Napiši slikovit opis o Aliji.",
        },
    ]
    for provider in providers.values():
        try:
            response = Chatgpt.FreeGpt(provider).create_completion(messages=messages, model="mistral-7b")
            print(response)
            print(response.get("choices")[0]["message"]["content"], flush=True)
        except Exception as e:
            print(e)
+++++ src/apis/acops.py
import os
import sys
import json
import requests
import logging
from typing import List, Dict, Any
from loguru import logger
from src.base import BaseOpenAI, \
get_json_serializable, \
make_absolute_path
from src.ai.base import ApiKeyManager
logger.remove()
log_format = "{time:YYYY-MM-DD at HH:mm:ss} | {level} | {message}"
logger_level = logging.INFO
logger.add(
    sys.stderr,
    format=log_format,
    level=logger_level)
class ACopClient(BaseOpenAI):
    __provider_name__ = "acop-store"

    def __init__(self, api_key: str):
        self.api_key = api_key
        self.api_key_manager = ApiKeyManager()

    @staticmethod
    def get_name() -> str:
        return "acop-store"

    @staticmethod
    def get_client() -> Any:
        return None

    @staticmethod
    async def create_async(
        model: str,
        messages: List[Dict[str, str]],
        temperature: float,
        stream: bool,
    ) -> Any:
        raise NotImplementedError

    @staticmethod
    def create(
        model: str,
        messages: List[Dict[str, str]],
        temperature: float,
        stream: bool,
    ) -> Any:
        raise NotImplementedError
    async def _create_async(
        self,
        model: str,
        messages: List[Dict[str, str]],
        temperature: float,
        stream: bool = False,
    ) -> Any:
        prompt = messages[0]["content"]  # TODO fix this, this is a hack
        stream = False
        headers = {
            "Authorization": f"Bearer {self.api_key}",
            "Content-Type": "application/json",
        }
        data = {
            "prompt": prompt,
            "temperature": temperature,
        }
        response = requests.post(
            "https://api.acop.ai/v1/chat/completions",
            headers=headers,
            json=data)
        if stream:
            for item in response.iter_lines():
                data = json.loads(item)
                if "choices" in data:
                    if data["choices"][0]["finish_reason"] == "stop":
                        break
                    elif data["choices"][0]["finish_reason"] == "length":
                        break
                    else:
                        logger.debug("Streaming: {}".format(data["choices"][0]["text"]))
                        yield json.dumps(data)
        else:
            yield json.dumps(response.json())
    def _create(
        self,
        model: str,
        messages: List[Dict[str, str]],
        temperature: float,
        stream: bool = False,
    ):
        prompt = messages[0]["content"]  # TODO fix this, this is a hack
        stream = False
        headers = {
            "Authorization": f"Bearer {self.api_key}",
            "Content-Type": "application/json",
        }
        data = {
            "prompt": prompt,
            "temperature": temperature,
        }
        response = requests.post(
            "https://api.acop.ai/v1/chat/completions",
            headers=headers,
            json=data)
        if stream:
            for item in response.iter_lines():
                data = json.loads(item)
                if "choices" in data:
                    if data["choices"][0]["finish_reason"] == "stop":
                        break
                    elif data["choices"][0]["finish_reason"] == "length":
                        break
                    else:
                        logger.debug("Streaming: {}".format(data["choices"][0]["text"]))
                        yield json.dumps(data)
        else:
            yield json.dumps(response.json())
    async def _chat_stream_async(self, messages: List[Dict[str, str]],
                                 temperature: float = 0.9, top_p: float = 1,
                                 top_k: int = 50, max_tokens: int = 256,
                                 model: str = "gpt-4"):
        # _create_async is an async generator, so it is returned rather than awaited
        return self._create_async(
            model=model,
            messages=messages,
            temperature=temperature,
        )

    def _chat_stream(self, messages: List[Dict[str, str]],
                     temperature: float = 0.9, top_p: float = 1,
                     top_k: int = 50, max_tokens: int = 256,
                     model: str = "gpt-4"):
        return self._create(
            model=model,
            messages=messages,
            temperature=temperature,
        )
if __name__ == "__main__":
    api_key_manager = ApiKeyManager()
    api_key = api_key_manager.get_api_key("acop-api-key")
    client = ACopClient(api_key=api_key)
    messages = [
        {
            "role": "system",
            "content": "Pozdrav, ja sam lijep i pametan agent. Moje ime je Kraków. Ja govorim hrvatski.",
        },
        {
            "role": "user",
            "content": "Pozdrav. Odakle si? Koje je najpopularnije jelo u Hrvatskoj?",
        },
    ]
    try:
        response = client._chat_stream(messages=messages, temperature=1.0, model="acop-ai")
        for chunk in response:
            print(chunk)
    except Exception as e:
        print(e)
+++++ src/apis/minimax.py
import os
import sys
import json
import requests
import logging
from typing import List, Dict, Any
from loguru import logger
from src.base import BaseOpenAI, \
get_json_serializable, \
make_absolute_path
from src.ai.base import ApiKeyManager
logger.remove()
log_format = "{time:YYYY-MM-DD at HH:mm:ss} | {level} | {message}"
logger_level = logging.INFO
logger.add(
    sys.stderr,
    format=log_format,
    level=logger_level)
class MinimaxClient(BaseOpenAI):
    __provider_name__ = "minimax"

    def __init__(self, api_key: str):
        self.api_key = api_key
        self.api_key_manager = ApiKeyManager()

    @staticmethod
    def get_name() -> str:
        return "minimax"

    @staticmethod
    def get_client() -> Any:
        return None
    async def chat(self,
                   messages: List[Dict[str, str]],
                   model: str = "minimax",
                   temperature: float = 0.7) -> str:
        if model == "minimax":
            model_name = "miniMax-2"
        else:
            raise ValueError("Unsupported model")

        async def _gen():
            headers = {
                "Authorization": f"Bearer {self.api_key}",
                "Content-Type": "application/json",
            }
            data = {
                "model": model_name,
                "messages": messages,
                "temperature": temperature,
            }
            """
            response = await self.client.async_post(
                "/v1/chat",
                {
                    "model": model_name,
                    "messages": messages,
                    "temperature": temperature,
                },
            )
            """
            response = requests.post("https://api.minimax.chat/v1/chat", headers=headers, json=data)
            payload = response.json()
            if "choices" in payload:
                if payload["choices"][0]["finish_reason"] == "stop":
                    pass
                elif payload["choices"][0]["finish_reason"] == "length":
                    pass
                else:
                    logger.debug("Streaming: {}".format(payload["choices"][0]["text"]))
                yield payload["choices"][0]["text"]
            else:
                yield json.dumps(payload)

        return await _gen().__anext__()
    @staticmethod
    async def create_async(
        model: str,
        messages: List[Dict[str, str]],
        temperature: float,
        stream: bool,
    ) -> Any:
        raise NotImplementedError

    @staticmethod
    def create(
        model: str,
        messages: List[Dict[str, str]],
        temperature: float,
        stream: bool,
    ) -> Any:
        raise NotImplementedError
    async def _create_async(
        self,
        model: str,
        messages: List[Dict[str, str]],
        temperature: float,
        stream: bool = False,
    ) -> Any:
        stream = False
        headers = {
            "Authorization": f"Bearer {self.api_key}",
            "Content-Type": "application/json",
        }
        """
        response = await self.client.async_post(
            "/v1/chat/completions",
            {
                "model": model,
                "messages": messages,
                "temperature": temperature,
                "stream": stream,
            },
        )
        """
        response = requests.post(
            "https://api.minimax.chat/v1/chat",
            headers=headers,
            json={"model": model, "messages": messages, "temperature": temperature})
        if stream:
            for item in response.iter_lines():
                data = json.loads(item)
                if "choices" in data:
                    if data["choices"][0]["finish_reason"] == "stop":
                        break
                    elif data["choices"][0]["finish_reason"] == "length":
                        break
                    else:
                        logger.debug("Streaming: {}".format(data["choices"][0]["text"]))
                        yield json.dumps(data)
        else:
            yield json.dumps(response.json())
    def _create(
        self,
        model: str,
        messages: List[Dict[str, str]],
        temperature: float,
        stream: bool = False,
    ):
        stream = False
        headers = {
            "Authorization": f"Bearer {self.api_key}",
            "Content-Type": "application/json",
        }
        """
        response = self.client.post(
            "/v1/chat/completions",
            {
                "model": model,
                "messages": messages,
                "temperature": temperature,
                "stream": stream,
            },
        )
        """
        response = requests.post(
            "https://api.minimax.chat/v1/chat",
            headers=headers,
            json={"model": model, "messages": messages, "temperature": temperature})
        if stream:
            for item in response.iter_lines():
                data = json.loads(item)
                if "choices" in data:
                    if data["choices"][0]["finish_reason"] == "stop":
                        break
                    elif data["choices"][0]["finish_reason"] == "length":
                        break
                    else:
                        logger.debug("Streaming: {}".format(data["choices"][0]["text"]))
                        yield json.dumps(data)
        else:
            yield json.dumps(response.json())
    async def _chat_stream_async(self, messages: List[Dict[str, str]],
                                 temperature: float = 0.9, top_p: float = 1,
                                 top_k: int = 50, max_tokens: int = 256,
                                 model: str = "gpt-4"):
        # _create_async is an async generator, so it is returned rather than awaited
        return self._create_async(
            model=model,
            messages=messages,
            temperature=temperature,
        )

    def _chat_stream(self, messages: List[Dict[str, str]],
                     temperature: float = 0.9, top_p: float = 1,
                     top_k: int = 50, max_tokens: int = 256,
                     model: str = "gpt-4"):
        return self._create(
            model=model,
            messages=messages,
            temperature=temperature,
        )
if __name__ == "__main__":
    api_key_manager = ApiKeyManager()
    api_key = api_key_manager.get_api_key("minimax-api-key")
    client = MinimaxClient(api_key=api_key)
    messages = [
        {
            "role": "system",
            "content": "Pozdrav, ja sam lijep i pametan agent. Moje ime je Kraków. Ja govorim hrvatski.",
        },
        {
            "role": "user",
            "content": "Pozdrav. Odakle si? Koje je najpopularnije jelo u Hrvatskoj?",
        },
    ]
    try:
        response = client._chat_stream(messages=messages, temperature=1.0, model="minimax")
        for chunk in response:
            print(chunk)
    except Exception as e:
        print(e)
+++++ src/openai.py
import os
import sys
import json
import logging
from openai import OpenAI
from typing import Optional, Union
from dotenv import load_dotenv
from src.base import BaseOpenAI, \
get_json_serializable, \
make_absolute_path
import mistralai
logging.basicConfig(level=logging.INFO)
load_dotenv(make_absolute_path(".env"))


class MistralWrapper(BaseOpenAI):
    def __init__(self, model: str):
        self.ai = mistralai.AsyncClient(api_key=os.environ["MISTRAL_API_KEY"])
        self.model_id = model
    def _parse_message(self, messages: list) -> str:
        role_message_map = {
            "system": "information",
            "user": "user",
            "assistant": "assistant",
        }
        prompt = ""
        for message in messages:
            if message.get("role") in role_message_map:
                prompt += f"{role_message_map[message['role']]}: {message['content']}\n"
        return prompt

    def _convert_content(self, content: str) -> list:
        return [
            {
                "role": "user",
                "content": content,
            },
        ]
    def _chat(self, messages: list, temperature: float = 0.7):
        prompt = self._parse_message(messages)
        response = self.ai.chat.completions.create(model=self.model_id, messages=[{"role": "user", "content": prompt}], temperature=temperature)
        return response.choices[0].message.content

    def _stream_chat(self, messages: list, temperature: float = 0.7):
        prompt = self._parse_message(messages)
        return self.ai.chat.completions.create_stream(model=self.model_id, messages=[{"role": "user", "content": prompt}], temperature=temperature)

    async def _chat_async(self, messages: list, temperature: float = 0.7):
        prompt = self._parse_message(messages)
        response = await self.ai.chat.completions.create(model=self.model_id, messages=[{"role": "user", "content": prompt}], temperature=temperature)
        return response.choices[0].message.content

    async def _stream_chat_async(self, messages: list, temperature: float = 0.7):
        prompt = self._parse_message(messages)
        return await self.ai.chat.completions.create_stream(model=self.model_id, messages=[{"role": "user", "content": prompt}], temperature=temperature)
    def _parse_assistant_message(self, response: dict) -> dict:
        """Parse the response from the Mistral API and convert it to a format that OpenAI-compatible clients can understand."""
        assistant_message = response["choices"][0]["message"]
        return {
            "content": assistant_message["content"],
            "role": "assistant",
        }

    def _parse_streaming_assistant_message(self, response: dict) -> tuple:
        """Parse the response from the Mistral API and convert it to a format that OpenAI-compatible clients can understand."""
        assistant_message = response["choices"][0]["message"]
        return (
            {
                "content": assistant_message["content"],
                "role": "assistant",
            },
            response["model"],
            response["created"],
            response["object"],
            response["usage"],
        )
def _create(self, model: str, messages: list, temperature: float = 0.7):
return self._chat(messages=messages, temperature=temperature)
def _stream_create(self, model: str, messages: list, temperature: float = 0.7):
return self._stream_chat(messages, temperature=temperature)
async def _create_async(self, model: str, messages: list, temperature: float = 0.7):
return await self._chat_async(messages=messages, temperature=temperature)
async def _stream_create_async(self, model: str, messages: list, temperature: float = 0.7):
return await self._stream_chat_async(messages, temperature=temperature)
if __name__ == "__main__":
    model = "mistralai/Mistral-7B-Instruct-v0.1"
    client = MistralWrapper(model=model)
    # _create returns the assistant message content directly, not an
    # OpenAI-style response dict
    response = client._create(
        model="mistralai/Mistral-7B-Instruct-v0.1",
        messages=[
            {
                "role": "system",
                "content": "Pozdrav, ja sam lijepi i pametni agent. Moje ime je Kraków. Ja govorim hrvatski.",
            },
            {
                "role": "user",
                "content": "Pozdrav. Odakle si? Koje je najpopularnije jelo u Hrvatskoj?",
            },
        ],
        temperature=1.0,
    )
    print(response)
+++++ src/duckduckgo.py
import os
import json
import requests
import logging
import random
from typing import Dict, List, Any, Optional
from urllib.parse import urlparse
from dotenv import load_dotenv
from duckduckgo_search import DDGS, AsyncDDGS
from src.util import make_absolute_path
logging.basicConfig(level=logging.INFO)
load_dotenv(make_absolute_path(".env"))
class DuckDuckGo:
    def __init__(self, location: str = "us-en", timezone: str = "America/Los_Angeles") -> None:
        self.location = location
        self.timezone = timezone

    def _get_headers(self) -> Dict[str, str]:
        headers = {
            "accept": "text/html,application/xhtml+xml,application/xml;q=0.9,image/avif,image/webp,image/apng,*/*;q=0.8",
            "accept-language": "en-US,en;q=0.9,hr-HR;q=0.8",
        }
        return headers