tangzhy committed
Commit 931ea15
1 Parent(s): 19c1927

Update app.py

Files changed (1): app.py +5 -3
app.py CHANGED

```diff
@@ -12,12 +12,14 @@ from transformers import (
     TextIteratorStreamer,
 )
 
-DESCRIPTION = """\
+GENERATION_TIME=90
+
+DESCRIPTION = f"""\
 # ORLM LLaMA-3-8B
 
 Hello! I'm ORLM-LLaMA-3-8B, here to automate your optimization modeling tasks! Check our [repo](https://github.com/Cardinal-Operations/ORLM) and [paper](https://arxiv.org/abs/2405.17743)!
 
-Please note that solution generation may be terminated if it exceeds 100 seconds. We strongly recommend running the demo locally using our [sample script](https://github.com/Cardinal-Operations/ORLM/blob/master/scripts/inference.py) for a smoother experience.
+Please note that solution generation may be terminated if it exceeds {GENERATION_TIME} seconds. We strongly recommend running the demo locally using our [sample script](https://github.com/Cardinal-Operations/ORLM/blob/master/scripts/inference.py) for a smoother experience.
 
 If the demo successfully generates a code solution, execute it in your Python environment with `coptpy` installed to obtain the final optimal value for your task.
 """
@@ -47,7 +49,7 @@ Below is an operations research question. Build a mathematical model and corresp
 # Response:
 """
 
-@spaces.GPU(duration=90)
+@spaces.GPU(duration=GENERATION_TIME)
 def generate(
     message: str,
     chat_history: list[tuple[str, str]],
```
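The net effect of the commit: the GPU time budget in the `@spaces.GPU(duration=...)` decorator and the timeout mentioned in the user-facing description now come from one `GENERATION_TIME` constant, where previously the decorator allowed 90 seconds but the text claimed 100. A minimal sketch of the pattern, with the `spaces` dependency left out so it runs standalone:

```python
# Single source of truth for the generation time budget. Interpolating
# the constant into the description keeps the displayed timeout and the
# enforced timeout from drifting apart (the pre-commit code said 100 in
# the text but 90 in the decorator).
GENERATION_TIME = 90

DESCRIPTION = f"""\
# ORLM LLaMA-3-8B

Please note that solution generation may be terminated if it exceeds \
{GENERATION_TIME} seconds.
"""

print(DESCRIPTION)
```

In the actual app the same constant is also passed to `@spaces.GPU(duration=GENERATION_TIME)`, so changing the budget in one place updates both.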