Just checking in

This commit is contained in:
parent 25ac5aefcd
commit 19e6fb0fdb
@@ -1,7 +1,31 @@
# About Backstory

This application was developed to achieve a few goals:

The backstory about Backstory...

1. See if it is realistic to self-host AI LLMs. Turns out, it is -- with constraints.

2. Provide a recent example of my capabilities; many of my projects while working for Intel were internally facing. The source code to this project is available on [GitHub](https://github.com/jketreno/backstory).

3. My career at Intel was diverse. Over the years, I have worked on many projects almost everywhere in the computer ecosystem. That results in a resume that is either too long or too short. This application is intended to provide a quick way for employers to ask the LLM about me.

## Backstory is two things

1. Backstory serves as an interactive Q&A that lets potential employers ask questions about someone's work history (aka "Backstory"). Based on the content the job seeker has provided to the RAG system, it can provide insights into that individual's resume and curriculum vitae that are often left out when people try to fit everything onto one page.

2. A resume builder -- if you have a job position and you think this person might be a candidate, paste your job description and have a resume produced based on their data. If it looks interesting, reach out to them. If not, hopefully you've gained some insight into what drives them.

-or-

2. As a potential job seeker, you can self-host this environment and generate resumes for yourself.
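The RAG lookup behind item 1 can be sketched roughly as follows. All names here are hypothetical illustrations, not the project's actual API: stored chunks of the job seeker's documents are ranked by cosine similarity to a query embedding, and the best matches are prepended to the LLM prompt.

```typescript
// Minimal RAG sketch (hypothetical names): rank stored document chunks
// by cosine similarity to the query embedding, then build a prompt.
interface Chunk {
  source: string;      // which document the text came from
  text: string;        // the chunk's content
  embedding: number[]; // precomputed embedding vector
}

function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

// Return the k chunks most similar to the query embedding.
function topChunks(query: number[], chunks: Chunk[], k: number): Chunk[] {
  return [...chunks]
    .sort((x, y) => cosine(query, y.embedding) - cosine(query, x.embedding))
    .slice(0, k);
}

// Prepend the retrieved context to the question for the LLM.
function buildPrompt(question: string, context: Chunk[]): string {
  const sources = context.map(c => `[${c.source}]\n${c.text}`).join('\n\n');
  return `Answer using only this work history:\n\n${sources}\n\nQuestion: ${question}`;
}
```

In a real deployment the embeddings would come from an embedding model and live in a vector store; the sketch only shows the ranking step.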
While this project was generally built for self-hosting with open source models, you can use any of the frontier models. The API adapters in this project can be configured to use infrastructure hosted by Anthropic, Google, Grok, and OpenAI (alphabetical). For more information, see [https://github.com/jketreno/backstory/README.md](https://github.com/jketreno/backstory/README.md#Frontier_Models).
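A provider switch like the one described might look roughly like this. The endpoint map and function name are illustrative assumptions, not Backstory's actual configuration; see the linked README for the real settings.

```typescript
// Hypothetical sketch of routing requests to one of several hosted LLM
// providers. The base URLs are the providers' public API roots; how
// Backstory actually configures adapters may differ.
type Provider = 'anthropic' | 'google' | 'grok' | 'openai';

const API_BASES: Record<Provider, string> = {
  anthropic: 'https://api.anthropic.com/v1',
  google: 'https://generativelanguage.googleapis.com/v1beta',
  grok: 'https://api.x.ai/v1',
  openai: 'https://api.openai.com/v1',
};

// Resolve the API root for a configured provider.
function apiBase(provider: Provider): string {
  return API_BASES[provider];
}
```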
## This application was developed to achieve a few goals:

1. See if it is realistic to self-host AI LLMs. Turns out, it is -- with constraints. I've been meaning to write a blog post about what to buy to build an AI PC that can run the latest "small" (7B) parameter models.

2. Provide a recent example of my capabilities; many of my projects while working for Intel were internally facing. The source code to this project is available on [GitHub](https://github.com/jketreno/backstory). It doesn't touch on much of my history of work; however, it does represent the pace at which I can adapt and develop useful solutions to fill a gap.

3. My career at Intel was diverse. Over the years, I have worked on many projects almost everywhere in the computer ecosystem. That results in a resume that is either too long or too short. This application is intended to provide a quick way for employers to ask the LLM about me. You can view my resume in its totality, or use the Resume Builder to post your job position and see how I fit. Or go to Backstory and ask questions about the projects mentioned in my resume.

## Some questions

Q. <ChatQuery text="Why aren't you providing this as a Platform As a Service (PaaS) application?"/>

A. I could, but I don't want to store your data. I also don't want to be on the hook for support of this service. I like it, it's fun, but it's not what I want as my day gig, you know? If it was, I wouldn't have built this app...

Q. <ChatQuery text="Why can't I just ask Backstory these questions?"/>

A. Try it. See what you find out :)
@@ -28,13 +28,9 @@ import CssBaseline from '@mui/material/CssBaseline';
import ResetIcon from '@mui/icons-material/History';
import SendIcon from '@mui/icons-material/Send';
import ExpandMoreIcon from '@mui/icons-material/ExpandMore';
import Card from '@mui/material/Card';
import CardContent from '@mui/material/CardContent';

import PropagateLoader from "react-spinners/PropagateLoader";

import { MuiMarkdown } from "mui-markdown";

import { ResumeBuilder } from './ResumeBuilder';
import { Message, MessageList } from './Message';
import { SeverityType } from './Snack';
@@ -51,15 +47,14 @@ import '@fontsource/roboto/700.css';
const welcomeMarkdown = `
# Welcome to Backstory

Backstory was written by James Ketrenos in order to provide answers to questions potential employers may have about his work history. In addition to being a RAG enabled expert system, the LLM is configured with real-time access to weather, stocks, the current time, and can answer questions about the contents of a website.
Backstory was written by James Ketrenos in order to provide answers to questions potential employers may have about his work history. In addition to being a RAG enabled expert system, the LLM has access to real-time data.

You can ask things like:

<ChatQuery text="What is James Ketrenos' work history?"/>
<ChatQuery text="What programming languages has James used?"/>
<ChatQuery text="What is the weather where James is from?"/>
<ChatQuery text="What are the headlines from CNBC?"/>
<ChatQuery text="What are the stock values of the most traded companies?"/>
<ChatQuery text="What are James' professional strengths?"/>
<ChatQuery text="What are today's headlines on CNBC.com?"/>

You can click the text above to submit that query, or type it in yourself (or whatever questions you may have).
@@ -194,7 +189,6 @@ const Controls = ({ tools, rags, systemPrompt, toggleTool, toggleRag, messageHis
<AccordionActions style={{ flexDirection: "column" }}>
  <TextField
    variant="outlined"
    autoFocus
    fullWidth
    multiline
    type="text"
@@ -808,6 +802,8 @@ const App = () => {
const sendQuery = async (query: string) => {
  if (!query.trim()) return;

  setTab(0);

  const userMessage = [{ role: 'user', content: query }];

  let scrolledToBottom;
@@ -1153,11 +1149,7 @@ const App = () => {
<CustomTabPanel tab={tab} index={2}>
  <Box className="ChatBox">
    <Box className="Conversation">
      <Card sx={{ flexGrow: 1, }} className={'About ChatBox'}>
        <CardContent>
          <MuiMarkdown>{about}</MuiMarkdown>
        </CardContent>
      </Card>
      <Message {...{ message: { role: 'assistant', content: about }, submitQuery }} />
    </Box>
  </Box>
</CustomTabPanel>
@@ -40,6 +40,18 @@ const backstoryTheme = createTheme({
    },
  },
  components: {
    MuiLink: {
      styleOverrides: {
        root: {
          color: '#4A7A7D', // Dusty Teal (secondary color)
          textDecoration: 'none',
          '&:hover': {
            color: '#D4A017', // Golden Ochre on hover
            textDecoration: 'underline',
          },
        },
      },
    },
    MuiButton: {
      styleOverrides: {
        root: {
@@ -1,6 +1,7 @@
import React, { useState } from 'react';
import React, { useEffect, useState, useCallback } from 'react';
import {
  Typography,
  Card,
  Button,
  Tabs,
  Tab,
@@ -22,9 +23,10 @@ import {
  SwapHoriz,
} from '@mui/icons-material';
import { SxProps, Theme } from '@mui/material';
import { MuiMarkdown } from "mui-markdown";
import PropagateLoader from "react-spinners/PropagateLoader";

import { MessageData } from './MessageMeta';
import { Message } from './Message';

interface DocumentComponentProps {
  title: string;
@@ -37,8 +39,31 @@ interface DocumentViewerProps {
  sx?: SxProps<Theme>,
};

// Document component
const Document: React.FC<DocumentComponentProps> = ({ title, children }) => (
  <Box
    sx={{
      display: 'flex',
      flexDirection: 'column',
      flexGrow: 1,
      overflow: 'hidden',
    }}
  >
    {
      title !== "" &&
      <Box sx={{ display: 'flex', p: 1, mt: -1, bgcolor: 'primary.light', color: 'primary.contrastText' }}>
        <Typography variant="h2">{title}</Typography>
      </Box>
    }
    <Box sx={{ display: 'flex', p: 1, flexGrow: 1, overflow: 'auto' }}>
      {children}
    </Box>
  </Box>
);

const DocumentViewer: React.FC<DocumentViewerProps> = ({generateResume, resume, sx} : DocumentViewerProps) => {
  const [jobDescription, setJobDescription] = useState<string>("");
  const [processing, setProcessing] = useState<boolean>(false);
  const theme = useTheme();
  const isMobile = useMediaQuery(theme.breakpoints.down('md'));
@@ -47,6 +72,17 @@ const DocumentViewer: React.FC<DocumentViewerProps> = ({generateResume, resume,
  // State for controlling split ratio on desktop
  const [splitRatio, setSplitRatio] = useState<number>(50);

  useEffect(() => {
    if (processing && resume !== undefined) {
      setProcessing(false);
    }
  }, [processing, resume, setProcessing]);

  const triggerGeneration = useCallback((jobDescription: string) => {
    setProcessing(true);
    generateResume(jobDescription);
  }, [setProcessing, generateResume]);

  // Handle tab change for mobile
  const handleTabChange = (_event: React.SyntheticEvent, newValue: number): void => {
    setActiveDocMobile(newValue);
@@ -64,30 +100,10 @@ const DocumentViewer: React.FC<DocumentViewerProps> = ({generateResume, resume,

  const handleKeyPress = (event: any) => {
    if (event.key === 'Enter' && event.ctrlKey) {
      generateResume(jobDescription);
      triggerGeneration(jobDescription);
    }
  };

  // Document component
  const Document: React.FC<DocumentComponentProps> = ({ title, children }) => (
    <Box
      sx={{
        display: 'flex',
        flexDirection: 'column',
        flexGrow: 1,
        overflow: 'hidden',
      }}
    >
      { title !== "" &&
        <Box sx={{ display: 'flex', p: 2, bgcolor: 'primary.light', color: 'primary.contrastText' }}>
          <Typography variant="h6">{title}</Typography>
        </Box> }
      <Box sx={{ display: 'flex', p: 2, flexGrow: 1, overflow: 'auto' }}>
        {children}
      </Box>
    </Box>
  );

  // Mobile view
  if (isMobile) {
    return (
@@ -125,9 +141,24 @@ const DocumentViewer: React.FC<DocumentViewerProps> = ({generateResume, resume,
        />
      </Document>
      <Button onClick={(e: any) => { generateResume(jobDescription); } }>Generate</Button>
    </>) : (
      <Document title="">{ resume !== undefined && <MuiMarkdown children={resume.content.trim()}/> }</Document>
    )}
  </>) : (<>
    <Document title="">{resume !== undefined && <Message message={resume} />}</Document>
    <Box sx={{
      display: "flex",
      flexDirection: "column",
      alignItems: "center",
      justifyContent: "center",
      mb: 1
    }}>
      <PropagateLoader
        size="10px"
        loading={processing}
        aria-label="Loading Spinner"
        data-testid="loader"
      />
    </Box>
    {resume !== undefined && <Card sx={{ display: "flex", flexGrow: 1, overflow: "auto", minHeight: "fit-content", p: 1 }}><Typography><b>NOTE:</b> As with all LLMs, hallucination is always a possibility. If this resume seems too good to be true, expand the <b>LLM information for this query</b> section and click the links to the relevant RAG source documents to read the details. Or go back to 'Backstory' and ask a question.</Typography></Card>}
  </>)}
  </Box>
</Box>
);
@@ -162,11 +193,28 @@ const DocumentViewer: React.FC<DocumentViewerProps> = ({generateResume, resume,
    </Tooltip>
  </Box>
  <Divider orientation="vertical" flexItem />
  <Box sx={{ display: 'flex', width: `${100 - splitRatio}%`, pl: 1, flexGrow: 1 }}>
    <Document title="Resume">{ resume !== undefined && <MuiMarkdown children={resume.content.trim()}/> }</Document>
  <Box sx={{ display: 'flex', width: `${100 - splitRatio}%`, pl: 1, flexGrow: 1, flexDirection: 'column' }}>
    <Document title="">{resume !== undefined && <Message message={resume} />}</Document>
    <Box sx={{
      display: "flex",
      flexDirection: "column",
      alignItems: "center",
      justifyContent: "center",
      mb: 1
    }}>
      <PropagateLoader
        size="10px"
        loading={processing}
        aria-label="Loading Spinner"
        data-testid="loader"
      />
    </Box>
    {resume !== undefined && <Card sx={{ display: "flex", flexGrow: 1, overflow: "auto", minHeight: "fit-content", p: 1 }}><Typography><b>NOTE:</b> As with all LLMs, hallucination is always a possibility. If this resume seems too good to be true, expand the <b>LLM information for this query</b> section and click the links to the relevant RAG source documents to read the details. Or go back to 'Backstory' and ask a question.</Typography></Card>}
  </Box>
</Box>

{/* Split control panel */}

<Paper sx={{ p: 2, display: 'flex', alignItems: 'center', justifyContent: 'center' }}>
  <Stack direction="row" spacing={2} alignItems="center" sx={{ width: '60%' }}>
    <IconButton onClick={() => setSplitRatio(Math.max(20, splitRatio - 10))}>
@@ -4,13 +4,13 @@ import Button from '@mui/material/Button';
import CardContent from '@mui/material/CardContent';
import CardActions from '@mui/material/CardActions';
import Collapse from '@mui/material/Collapse';
import { MuiMarkdown } from "mui-markdown";
import Typography from '@mui/material/Typography';
import ExpandMoreIcon from '@mui/icons-material/ExpandMore';
import { ExpandMore } from './ExpandMore';

import { MessageData, MessageMeta } from './MessageMeta';
import { ChatBubble } from './ChatBubble';
import { StyledMarkdown } from './StyledMarkdown';

type MessageList = MessageData[];
@@ -53,14 +53,7 @@ const Message = ({ message, submitQuery, isFullWidth }: MessageInterface) => {
<ChatBubble isFullWidth={isFullWidth} isUser={message.role === 'user'} sx={{ flexGrow: 1, pb: message.metadata ? 0 : "8px", mb: 1, mt: 1 }}>
  <CardContent>
    {message.role === 'assistant' ?
      <MuiMarkdown children={formattedContent} overrides={{
        ChatQuery: {
          component: ChatQuery,
          props: {
            submitQuery
          }, // Optional: pass default props if needed
        },
      }} />
      <StyledMarkdown {...{ content: formattedContent, submitQuery }} />
      :
      <Typography variant="body2" sx={{ color: 'text.secondary' }}>
        {message.content}
@@ -91,9 +84,10 @@ const Message = ({ message, submitQuery, isFullWidth }: MessageInterface) => {

export type {
  MessageInterface,
  MessageList
  MessageList,
};
export {
  Message
  Message,
  ChatQuery,
};
@@ -91,21 +91,13 @@ const ResumeBuilder = ({scrollToBottom, isScrolledToBottom, setProcessing, proce

const generateResume = async (jobDescription: string) => {
  if (!jobDescription.trim()) return;
  // setResume(undefined);

  let scrolledToBottom;

  scrollToBottom();
  setResume(undefined);

  try {
    scrolledToBottom = isScrolledToBottom();
    setProcessing(true);

    // Add initial processing message
    setGenerateStatus({ role: 'assistant', content: 'Processing request...' });
    if (scrolledToBottom) {
      setTimeout(() => { scrollToBottom() }, 50);
    }

    // Make the fetch request with proper headers
    const response = await fetch(connectionBase + `/api/generate-resume/${sessionId}`, {
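The recurring capture-update-rescroll pattern in this hunk (read `isScrolledToBottom()` before a state update, then `setTimeout(scrollToBottom, 50)` only if the user was already at the bottom) could be factored into a helper. This is a sketch under assumed names, not code from this diff:

```typescript
// Sketch: preserve "stick to bottom" behavior across a state update.
// Returns true when a deferred re-scroll was scheduled.
function withScrollPreserved(
  isScrolledToBottom: () => boolean,
  scrollToBottom: () => void,
  update: () => void,
  delayMs = 50,
): boolean {
  const wasAtBottom = isScrolledToBottom(); // capture before the DOM changes
  update();
  if (wasAtBottom) {
    // give React a chance to render the update, then snap back down
    setTimeout(scrollToBottom, delayMs);
    return true;
  }
  return false;
}
```

Each `scrolledToBottom = ...; ...; if (scrolledToBottom) { setTimeout(...) }` block in the diff would then collapse to a single `withScrollPreserved(...)` call.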
@@ -121,12 +113,8 @@ const ResumeBuilder = ({scrollToBottom, isScrolledToBottom, setProcessing, proce
    const token_guess = 500;
    const estimate = Math.round(token_guess / lastEvalTPS + contextStatus.context_used / lastPromptTPS);

    scrolledToBottom = isScrolledToBottom();
    setSnack(`Job description sent. Response estimated in ${estimate}s.`, "info");
    startCountdown(Math.round(estimate));
    if (scrolledToBottom) {
      setTimeout(() => { scrollToBottom() }, 50);
    }

    if (!response.ok) {
      throw new Error(`Server responded with ${response.status}: ${response.statusText}`);
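The `estimate` above is simple throughput arithmetic: guessed output tokens divided by generation speed, plus the tokens already in the context divided by prompt-processing speed. Pulled out as a standalone sketch (the 500-token guess and the 35 tok/s fallback come from this diff; the context size below is just an example value):

```typescript
// Estimate seconds until a response arrives: generation time for the
// guessed output plus time to process the existing prompt context.
function estimateSeconds(
  tokenGuess: number,   // expected output tokens (the diff guesses 500)
  evalTPS: number,      // generation throughput, tokens/sec
  contextUsed: number,  // tokens already in the prompt context
  promptTPS: number,    // prompt-processing throughput, tokens/sec
): number {
  return Math.round(tokenGuess / evalTPS + contextUsed / promptTPS);
}
```

For example, 500 output tokens at 35 tok/s plus a 3500-token context at 35 tok/s estimates roughly 114 seconds, which is why the diff also starts a countdown timer.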
@@ -161,20 +149,15 @@ const ResumeBuilder = ({scrollToBottom, isScrolledToBottom, setProcessing, proce

    // Force an immediate state update based on the message type
    if (update.status === 'processing') {
      scrolledToBottom = isScrolledToBottom();
      // Update processing message with immediate re-render
      setGenerateStatus({ role: 'info', content: update.message });
      console.log(update.num_ctx);
      if (scrolledToBottom) {
        setTimeout(() => { scrollToBottom() }, 50);
      }

      // Add a small delay to ensure React has time to update the UI
      await new Promise(resolve => setTimeout(resolve, 0));

    } else if (update.status === 'done') {
      // Replace processing message with final result
      scrolledToBottom = isScrolledToBottom();
      setGenerateStatus(undefined);
      setResume(update.message);
      const metadata = update.message.metadata;
@@ -183,16 +166,9 @@ const ResumeBuilder = ({scrollToBottom, isScrolledToBottom, setProcessing, proce
      setLastEvalTPS(evalTPS ? evalTPS : 35);
      setLastPromptTPS(promptTPS ? promptTPS : 35);
      updateContextStatus();
      if (scrolledToBottom) {
        setTimeout(() => { scrollToBottom() }, 50);
      }
    } else if (update.status === 'error') {
      // Show error
      scrolledToBottom = isScrolledToBottom();
      setGenerateStatus({ role: 'error', content: update.message });
      if (scrolledToBottom) {
        setTimeout(() => { scrollToBottom() }, 50);
      }
    }
  } catch (e) {
    setSnack("Error generating resume", "error")
@@ -207,34 +183,22 @@ const ResumeBuilder = ({scrollToBottom, isScrolledToBottom, setProcessing, proce
      const update = JSON.parse(buffer);

      if (update.status === 'done') {
        scrolledToBottom = isScrolledToBottom();
        setGenerateStatus(undefined);
        setResume(update.message);
        if (scrolledToBottom) {
          setTimeout(() => { scrollToBottom() }, 500);
        }
      }
    } catch (e) {
      setSnack("Error processing job description", "error")
    }
  }

  scrolledToBottom = isScrolledToBottom();
  stopCountdown();
  setProcessing(false);
  if (scrolledToBottom) {
    setTimeout(() => { scrollToBottom() }, 50);
  }
} catch (error) {
  console.error('Fetch error:', error);
  setSnack("Unable to process job description", "error");
  scrolledToBottom = isScrolledToBottom();
  setGenerateStatus({ role: 'error', content: `Error: ${error}` });
  setProcessing(false);
  stopCountdown();
  if (scrolledToBottom) {
    setTimeout(() => { scrollToBottom() }, 50);
  }
}
};
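The `JSON.parse(buffer)` at the end of this hunk handles whatever is left in the stream buffer after the response finishes. The framing the reader loop implies is newline-delimited JSON status updates, with a possibly incomplete trailing line held back until more data arrives. A standalone sketch of that buffering step (names are illustrative, not this file's API):

```typescript
// Sketch: split a streamed buffer into complete newline-delimited JSON
// updates, returning any incomplete trailing fragment for the next read.
interface Update {
  status: 'processing' | 'done' | 'error';
  message: unknown;
}

function drainBuffer(buffer: string): { updates: Update[]; rest: string } {
  const lines = buffer.split('\n');
  const rest = lines.pop() ?? ''; // last piece may be an incomplete line
  const updates = lines
    .filter(l => l.trim() !== '')
    .map(l => JSON.parse(l) as Update);
  return { updates, rest };
}
```

Each read from the response body would append to `buffer`, call `drainBuffer`, dispatch the complete updates, and carry `rest` forward; the final leftover is parsed once the stream ends, as the diff does.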
@@ -255,7 +219,6 @@ const ResumeBuilder = ({scrollToBottom, isScrolledToBottom, setProcessing, proce
<Box className="Conversation">
  <TextField
    variant="outlined"
    autoFocus
    fullWidth
    multiline
    rows="10"